How is it possible to have overfitting before the network learns properly?
Fereshteh....
on 21 Dec 2014
Edited: Greg Heath on 20 Feb 2015
My question is: my network has a performance of about 90-98%, I mean my learning error is about 98% (I suppose such performance means my net hasn't learned anything yet), so how is it possible that my net stops training due to the early stopping point?
0 comments
Accepted Answer
Greg Heath
on 21 Dec 2014
Edited: Greg Heath on 20 Feb 2015
Poorly worded question.
Are we supposed to guess:
1. That you are referring to a classifier ?
2. Which MATLAB function you are using ... patternnet ?
3. The number of classes c ?
4. The dimensionality of the inputs I ?
5. The number of hidden nodes H ?
6. The trn/val/tst ratio 0.7/0.15/0.15 ?
Overfitting only means that you have more unknown weights, Nw, than training equations, Ntrneq:
Nw > Ntrneq
where, for Ntrn training cases,
Ntrneq = Ntrn*c
Nw = (I+1)*H + (H+1)*c
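To make the bookkeeping concrete, here is a minimal MATLAB sketch of that check; the values of I, H, c and Ntrn below are illustrative placeholders, not numbers taken from the question.
% Illustrative placeholders -- substitute your own I, H, c and Ntrn
I    = 10;                    % input dimensionality
H    = 10;                    % number of hidden nodes
c    = 3;                     % number of classes (output dimension)
Ntrn = 70;                    % number of training cases
Ntrneq = Ntrn*c;              % number of training equations
Nw = (I+1)*H + (H+1)*c;       % number of unknown weights
overfit = Nw > Ntrneq         % true => more weights than training equations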
Validation stopping has nothing to do with training data performance. It has to do with
OVERTRAINING AN OVERFIT NET.
It means that training has reached the point where validation set performance (mse or cross-entropy) has passed through a local minimum, indicating that if you don't stop, you will overtrain the overfit net to the point where further training will probably make it perform worse on validation, test, and unseen nontraining data.
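For context, here is a minimal patternnet sketch showing where the trn/val/tst split and the validation-stop patience are set; the dataset, hidden layer size, and parameter values are illustrative assumptions, not details from this thread.
[x, t] = iris_dataset;               % small example dataset shipped with the toolbox
H = 10;                              % hidden layer size (assumed)
net = patternnet(H);
net.divideParam.trainRatio = 0.70;   % trn/val/tst split
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net.trainParam.max_fail    = 6;      % consecutive validation failures before stopping
[net, tr] = train(net, x, t);
tr.stop                              % e.g. 'Validation stop.' if early stopping fired
tr.best_epoch                        % epoch with the minimum validation error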
Remember:
The goal of design is to use training data to obtain a net that works well on all nontraining data:
validation + test + unseen
Hope this helps.
Thank you for formerly accepting my answer
Greg
0 comments
More Answers (0)