Metrics Handlers
This example shows how to manage Modelscape™ test metrics and their associated threshold objects by using MetricsHandler objects. A MetricsHandler object produces reports that summarize the metrics it contains and their status relative to their thresholds.
For more information about test metrics and thresholds, see Credit Scorecard Validation Metrics and Fairness Metrics in Modelscape. To learn how to write your own metrics, see Test Metrics in Modelscape.
In this example, you set up metrics and thresholds for mock data from a credit scoring model. You create a MetricsHandler object to visualize the metrics and summarize the results. You then set an overall status for the handler based on these metrics.
Set Up Test Metrics and Thresholds
Use the following random data as examples of training response data (defaultIndicators) and model predictions (scores).
rng("default");
scores = rand(1000,1);
defaultIndicators = double(scores + rand(1000,1) < 1);
Create these metrics:
Area under the receiver operating characteristic curve (AUROC)
Cumulative accuracy profile (CAP) accuracy ratio
Kolmogorov-Smirnov statistic
For the AUROC, designate values greater than 0.8 as a pass and values less than 0.7 as a failure; for the CAP accuracy ratio, use 0.7 and 0.6 as the corresponding limits. Values between these limits are undecided and require further inspection. Set no thresholds for the Kolmogorov-Smirnov statistic.
import mrm.data.validation.TestThresholds
import mrm.data.validation.pd.*

auroc = AUROC(defaultIndicators,scores);
aurocThresholds = TestThresholds([0.7 0.8],["Fail","Undecided","Pass"]);
cap = CAPAccuracyRatio(defaultIndicators,scores);
capThresholds = TestThresholds([0.6 0.7],["Fail","Undecided","Pass"]);
ks = KSStatistic(defaultIndicators,scores);
Add Metrics to Metrics Handler Object
Add the metrics to a MetricsHandler object and display the result.
import mrm.data.validation.MetricsHandler
mh = MetricsHandler;
append(mh,auroc,aurocThresholds);
append(mh,cap,capThresholds);
append(mh,ks);
disp(mh)
MetricsHandler with properties:

       KS: [1x1 mrm.data.validation.pd.KSStatistic]
    AUROC: [1x1 mrm.data.validation.pd.AUROC]
      CAP: [1x1 mrm.data.validation.pd.CAPAccuracyRatio]
The handler contains the three metrics, which you can access as properties of the handler object. Use these properties to access the diagnostics and visualizations of the constituent metrics.
visualize(mh.AUROC);
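Similarly, assuming the other stored metrics implement visualize in the same way as AUROC, you can plot the CAP metric through its handler property.

% Assumes CAPAccuracyRatio also provides a visualize method.
visualize(mh.CAP);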
Interrogate Metrics Handlers
View the performance of the model relative to the given metrics by using the report method.
The model performs well on AUROC, but the undecided status of the Accuracy Ratio suggests the model requires a closer look.
summaryTable = report(mh);
disp(summaryTable)
               Metric                Value       Status       Diagnostic 
    ____________________________    _______    ___________    ___________

    Area under ROC curve            0.82905    Pass           (0.8, Inf) 
    Accuracy ratio                  0.65809    Undecided      (0.6, 0.7] 
    Kolmogorov-Smirnov statistic    0.51462    <undefined>    <undefined>
When the handler contains complex, non-scalar metrics, use the Keys and Metrics arguments of the report method. For more information, see Fairness Metrics in Modelscape.
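For example, a call restricting the report to selected metrics might look like the following sketch. The metric names passed here are assumptions based on the handler properties above; the exact values that Keys and Metrics accept depend on the metrics stored in the handler.

% Hypothetical call: restrict the report to selected metrics.
% The accepted names depend on the stored metric objects.
summaryTable = report(mh,Metrics=["AUROC","CAP"]);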
Set Overall Status for the Handler
For a handler with many metrics, set an overall status by associating a status interpreter with the handler. In this example, you use a built-in Modelscape interpreter that is compatible with your threshold objects. The status descriptions of the individual metrics determine the overall status. In this case, the overall status is undecided, corresponding to the worst individual status.
mh.StatusInterpreter = @mrm.data.validation.overallStatus;
summaryTable = report(mh);
disp(summaryTable)
               Metric                Value       Status       Diagnostic 
    ____________________________    _______    ___________    ___________

    Area under ROC curve            0.82905    Pass           (0.8, Inf) 
    Accuracy ratio                  0.65809    Undecided      (0.6, 0.7] 
    Kolmogorov-Smirnov statistic    0.51462    <undefined>    <undefined>
    Overall                             NaN    Undecided      <undefined>
To implement thresholding systems that use other status strings, write a custom status interpreter. For instructions, see the comments before the StatusInterpreter declaration in the MetricsHandler implementation, which you can open with this command.
edit mrm.data.validation.MetricsHandler
Alternatively, modify the built-in interpreter to fit your needs.
edit mrm.data.validation.overallStatus
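As a minimal sketch, a custom interpreter might rank the status strings from best to worst and return the worst status present. The function name and exact signature below are assumptions for illustration; the interface that MetricsHandler expects is described in the comments mentioned above.

function overall = myOverallStatus(statuses)
% Hypothetical custom interpreter: return the worst status present.
% The signature MetricsHandler expects may differ; see the comments
% in the MetricsHandler implementation.
    ranks = ["Pass" "Undecided" "Fail"];            % ordered best to worst
    overall = ranks(find(ismember(ranks,statuses),1,"last"));
end

You can then assign this interpreter in the same way as the built-in one.

mh.StatusInterpreter = @myOverallStatus;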
You can also set the StatusInterpreter property of the handler when you create the object, using this command.
mh2 = MetricsHandler(StatusInterpreter=@mrm.data.validation.overallStatus)