Exporting ROC Curve and Confusion Matrix computation code from the Classification Learner App

Hi everyone,
I am using the Classification Learner App to train a Linear SVM classifier with k-fold cross-validation. When I export the code, I get instructions to train the classifier and to obtain some validation results (accuracy, predictions, and scores):
partitionedModel = crossval(trainedClassifier.ClassificationSVM, 'KFold', 5);
% Compute validation accuracy
validationAccuracy = 1 - kfoldLoss(partitionedModel, 'LossFun', 'ClassifError');
% Compute validation predictions and scores
[validationPredictions, validationScores] = kfoldPredict(partitionedModel);
Once I have these variables, I do not know how to obtain the ROC Curve and the Confusion Matrix computed by the Classification Learner App.
  • ROC CURVE:
I have tried with the perfcurve function this way:
[X,Y,T,AUC,OPTROCPT,SUBY,SUBYNAMES] = perfcurve(response,validationScores(:,1),'PositiveClass');
figure, plot(X,Y,OPTROCPT(1),OPTROCPT(2),'r*');
I obtain a ROC curve, but I do not know if the result is correct because the curve and the AUC are slightly different from those produced by the Classification Learner App. Moreover, I do not know how to compute the "Current classifier point" shown in the App (my best guess is in the ROC sketch after this list).
  • CONFUSION MATRIX
I have tried with:
C = confusionmat(response,validationPredictions);
CP = classperf(response,validationPredictions);
but I do not know how to interpret these results in order to build the confusion matrix of true positive rates / false negative rates or positive predictive values / false discovery rates that the App displays (see the confusion-matrix sketch after this list).
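To make the ROC part concrete, this is roughly what I would guess the App computes, but I have not been able to verify it. It is a sketch only: 'ClassA' is a placeholder for my actual positive class label, and I am assuming response is a cell array of character vectors.
% ROC sketch -- 'ClassA' is a placeholder for the actual positive class label,
% and response is assumed to be a cell array of character vectors.
posClass = 'ClassA';
scoreCol = strcmp(partitionedModel.ClassNames, posClass);   % score column of the positive class
[X, Y, T, AUC] = perfcurve(response, validationScores(:, scoreCol), posClass);
% My guess at the "Current classifier point": TPR/FPR of the hard cross-validated predictions
isPos  = strcmp(response, posClass);
isPred = strcmp(validationPredictions, posClass);
TPR = sum(isPred &  isPos) / sum(isPos);
FPR = sum(isPred & ~isPos) / sum(~isPos);
figure
plot(X, Y)
hold on
plot(FPR, TPR, 'r*')   % marker where I expect the App's current classifier point to be
xlabel('False positive rate'), ylabel('True positive rate')
title(sprintf('ROC curve, AUC = %.3f', AUC))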
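And for the confusion matrix, this is the kind of normalisation I think the App applies, though I am not sure it matches exactly (a sketch; it uses implicit expansion, so R2016b or later):
% Confusion-matrix sketch -- the normalisations I think the App shows (not verified)
[C, order] = confusionmat(response, validationPredictions);   % raw counts, rows = true class
ratesTPR_FNR = C ./ sum(C, 2);   % row-normalised: diagonal = TPR, off-diagonal = FNR
ratesPPV_FDR = C ./ sum(C, 1);   % column-normalised: diagonal = PPV, off-diagonal = FDR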
Any help or advice?
Thanks in advance, Rafa
  6 comments
Alessandro Fascetti on 5 Oct 2018
Same problem here. It is absolutely unbelievable that we cannot export figures from the Classification Learner App. Saving the figure with the handles found in the app does not work either, and the plotconfusion function appears to have a bug that hangs MATLAB, so I cannot use that approach at all. This is ridiculous.
Ismat Mohd Sulaiman on 17 Mar 2021
Even now that it is possible to export figures from the Classification Learner App, I still can't find the script to generate the ROC and AUC in the Generate Function tab.
This is so frustrating!


Answers (1)

Alex van der Meer on 21 Apr 2018
Edited: Alex van der Meer on 21 Apr 2018

Your methodology is correct: you should use perfcurve to obtain the ROC curve outside of the app. It also returns the thresholds used to compute each point on the ROC curve. You can apply these thresholds to the validationScores values to classify (one threshold at a time), and the resulting binary predicted class labels can then be passed to the confusionmat function, which gives you the TP, TN, FP, and FN counts. The reason you get somewhat different results than the app is that the crossval call here:

partitionedModel = crossval(trainedClassifier.ClassificationSVM, 'KFold', 5);

uses a random partitioning of the folds. It then takes 5 "blank" models and trains them on those folds, and the randomness produces somewhat different results each time. You can keep the random seed constant if you want reproducible results; search for something like "matlab set the random seed" (the rng function does this). A rough sketch of the whole workflow is below.
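Roughly like this (just a sketch, not tested on your data: 'ClassA' and 'ClassB' are placeholders for your actual class labels, and I am assuming the true labels are in a cell array of character vectors called response):

rng(1);   % fix the random seed so the fold partitioning is reproducible
partitionedModel = crossval(trainedClassifier.ClassificationSVM, 'KFold', 5);
[validationPredictions, validationScores] = kfoldPredict(partitionedModel);
posClass = 'ClassA';   % placeholder for your positive class label
negClass = 'ClassB';   % placeholder for your negative class label
scoreCol = strcmp(partitionedModel.ClassNames, posClass);
[X, Y, T, AUC] = perfcurve(response, validationScores(:, scoreCol), posClass);
% Classify with one of the thresholds returned by perfcurve
k = 10;   % index of the threshold you want to inspect
thresholdedPredictions = repmat({negClass}, numel(response), 1);
thresholdedPredictions(validationScores(:, scoreCol) >= T(k)) = {posClass};
% Confusion matrix at that threshold: contains the TP, FN, FP, TN counts
Ck = confusionmat(response, thresholdedPredictions);

With the seed fixed, the folds are the same on every run, so the curve and the matrix no longer change between runs.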

For more info see my answer to this question https://nl.mathworks.com/matlabcentral/answers/346479-how-does-the-classification-learner-app-generate-roc-curves-for-decision-trees-and-how-do-i-tune-the#answer_316364
