Evaluation metrics for a deep learning model

What is the command for computing evaluation metrics such as precision, recall, specificity, and F1 score for a deep learning model?
Should they be computed explicitly from the confusion matrix using the standard formulas, or can they be computed directly in the code and displayed?
Also, are these metrics computed on the validation dataset?
Kindly provide inputs regarding the above.

Accepted Answer

Pranjal Kaura on 23 Nov 2021
Edited: Pranjal Kaura on 23 Nov 2021

0 votes

Hey Sushma,
Thank you for bringing this up. The concerned parties are looking into this issue and will try to roll it out in a future release.
For now, you can compute these metrics from the confusion matrix. You can refer to this link.
Hope this helps!
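As a minimal sketch of the approach described above, the metrics can be derived from the confusion matrix with the standard formulas. The labels below are synthetic stand-ins for illustration; in practice they would come from the ground truth and the network's predictions (note that `confusionmat` orders classes alphabetically unless 'Order' is specified, so the positive class is pinned to row/column 1 here):

```matlab
% Hypothetical ground-truth and predicted labels for illustration
trueLabels = categorical(["pos" "pos" "neg" "neg" "pos" "neg"]);
predLabels = categorical(["pos" "neg" "neg" "neg" "pos" "pos"]);

% Pin the positive class to the first row/column of the matrix
C = confusionmat(trueLabels, predLabels, 'Order', ["pos" "neg"]);
TP = C(1,1);  FN = C(1,2);   % rows = true class, columns = predicted class
FP = C(2,1);  TN = C(2,2);

precision   = TP / (TP + FP);
recall      = TP / (TP + FN);        % also called sensitivity
specificity = TN / (TN + FP);
f1          = 2 * (precision * recall) / (precision + recall);
```

The same formulas apply per class in the multiclass case, treating each class in turn as the positive one.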

2 comments

Sushma TV on 25 Nov 2021
Thanks Pranjal. I went through the link you sent, but I have a doubt about plotting precision and recall. Computing the values from the confusion matrix was possible, but I could not figure out the plots. What are the arguments of the function 'perfcurve' for plotting a precision-recall curve?
'perfcurve' is used for plotting performance curves on classifier outputs. To plot a precision-recall curve, set 'XCrit' (the criterion to compute for X) and 'YCrit' to 'reca' and 'prec' respectively, to compute recall and precision. You can refer to the following code snippet:
[X, Y] = perfcurve(labels, scores, posclass, 'XCrit', 'reca', 'YCrit', 'prec');
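For completeness, here is the snippet above fleshed out into a runnable sketch. The labels and scores are synthetic placeholders; in practice, scores would be the network's predicted probabilities for the positive class:

```matlab
% Synthetic ground truth (1 = positive class) and classifier scores
labels   = [1 1 0 1 0 0 1 0];
scores   = [0.9 0.8 0.6 0.55 0.5 0.4 0.3 0.2];
posclass = 1;

% X = recall, Y = precision at each score threshold
[recallVals, precVals] = perfcurve(labels, scores, posclass, ...
    'XCrit', 'reca', 'YCrit', 'prec');

plot(recallVals, precVals)
xlabel('Recall'); ylabel('Precision');
title('Precision-Recall curve');
```

Note that precision may be NaN at the first point (recall 0, no positive predictions), which 'plot' simply skips.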


More Answers (0)

Categories

Find more on Deep Learning Toolbox in Help Center and File Exchange.

Products

Release

R2020b

Asked: 18 Nov 2021
Commented: 26 Nov 2021

