Is it possible to add a softmax layer and use cross-entropy with layrecnet?

I want to do classification using an RNN, but I am having difficulty adapting layrecnet to a classification problem.
For classification, I would like to:
  • add a softmax layer before the output
  • use cross-entropy for the loss calculation
My network is created with:
net = layrecnet(1:3, 10);
However, trainlm does not support crossentropy, and trainscg with crossentropy results in an error due to an attempted memory allocation of 30+ GB. My question is therefore:
How can I modify layrecnet to do classification instead of regression?
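For reference, this is roughly the configuration I am attempting (Xs and Ts are placeholders for my input and target cell arrays):

```matlab
% Attempted setup: layrecnet with softmax output and cross-entropy loss.
% Xs: 1-by-N cell array of inputs, Ts: c-by-N cell array of {0,1} targets.
net = layrecnet(1:3, 10);                 % delays 1:3, 10 hidden units
net.layers{end}.transferFcn = 'softmax';  % softmax layer before the output
net.performFcn = 'crossentropy';          % cross-entropy performance function
net.trainFcn = 'trainscg';                % trainlm does not support crossentropy
% [net, tr] = train(net, Xs, Ts);         % fails with a huge memory allocation
```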

Accepted Answer

Greg Heath on 3 Jun 2016
For c classes, just use {0,1} c-dimensional unit vectors as targets.
The assigned class is obtained from the index of the maximum output.
If you need a posterior probability estimate, just use LOGSIG and divide each output by the sum of the outputs.
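A minimal sketch of this scheme (variable names assumed; y is the c-by-N matrix of trained network outputs):

```matlab
% {0,1} unit-vector (one-hot) targets from integer class labels 1..c
T = full(ind2vec(classLabels));
% ... train the network with T as the target matrix ...
% Assigned class = index of the maximum output
[~, assignedClass] = max(y, [], 1);
% Posterior probability estimates from LOGSIG outputs: divide by the sum
p = bsxfun(@rdivide, y, sum(y, 1));
```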
Hope this helps.
Greg

2 comments

Hi Greg. Sure, that would be possible, but it is not a feasible solution, since training would then be based on a regression error/performance measure.
I found the solution:
  1. Initialize the network as a patternnet
  2. Manually modify the network to have recurrent connections
This seems to work.
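In case it helps others, a sketch of the modification (using the network object's layerConnect/layerWeights properties; the delay values are chosen to mirror layrecnet(1:3, 10)):

```matlab
net = patternnet(10);                 % softmax output + crossentropy by default
net.layerConnect(1,1) = 1;            % add a recurrent (self) connection on layer 1
net.layerWeights{1,1}.delays = 1:3;   % tapped delays 1:3, as in layrecnet(1:3,10)
% net = configure(net, Xs, Ts);       % then configure and train as usual
```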
Don't you dare belittle using regression measures of performance for classification!!!
Before MATLAB was a twinkle in Cleve's eye, we (MIT Lincoln Lab) used Fortran regression programs for pattern recognition and classification of radar targets.
You don't have to think too hard about it. Neural networks are UNIVERSAL APPROXIMATORS. Using {0,1} targets doesn't change that.
You might be more convinced by using the TYPE command to compare PATTERNNET and FITNET. (Remember, both are special cases of FEEDFORWARDNET).
You may also want to compare the performance of FITNET and PATTERNNET on some of the classification datasets obtained from
help nndatasets
and/or
doc nndatasets.
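For example (iris_dataset is one of the toolbox's classification datasets):

```matlab
type patternnet          % compare the two special cases of feedforwardnet
type fitnet
[x, t] = iris_dataset;   % 4-by-150 inputs, 3-by-150 {0,1} unit-vector targets
```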
Hope this helps.
Greg


Asked on 2 Jun 2016. Last comment on 4 Jun 2016.
