specificity and sensitivity measures in PAD - where to find them
Where can I find the exact specificity and sensitivity values for each predictive model developed in PAD? I am interested in seeing the true positives/negatives and false positives/negatives per model. Are they available in PAD 6.8?
I'm assuming you're referring to binary classification models ("scoring" in PAD terms). Our binary classification models do not return a binary outcome (true/false, accept/decline, churn/loyal, etc.) but a probability: the probability to accept, to churn, and so on.
The mapping from probabilities to labels is done in two steps. The first step is the Score Distribution in PAD, which maps probability (score) ranges to a discrete set of classes. The second step is an optional mapping of these classes to outcome labels that you can then use in your decision strategies.
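To make the two-step mapping concrete, here is a minimal sketch in Python. The band boundaries and class labels below are made-up illustrations, not PAD defaults or PAD APIs:

```python
# Step 1 (illustrative): map a model probability to a score-distribution class.
# The boundaries here are hypothetical examples, not actual PAD settings.
def probability_to_class(p, boundaries=(0.2, 0.5, 0.8)):
    """Return the index of the score band that probability p falls into."""
    for i, b in enumerate(boundaries):
        if p < b:
            return i
    return len(boundaries)

# Step 2 (optional in PAD): map classes to outcome labels for use in strategies.
# These label names are made up for the example.
CLASS_LABELS = {0: "VeryLow", 1: "Low", 2: "High", 3: "VeryHigh"}

def probability_to_label(p):
    return CLASS_LABELS[probability_to_class(p)]

print(probability_to_label(0.07))  # VeryLow
print(probability_to_label(0.65))  # High
```

The point is that the model itself only produces the probability; everything after that is configuration you control.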
The performance of the model is therefore not measured by a confusion matrix (with its associated metrics such as sensitivity, specificity, F1, etc.) but by a metric like AUC, which measures how well the predicted probabilities rank the binary outcomes. AUC (called COC in older product versions) is the default metric for scoring models, but you can change that in the project settings.