Pretrained neural network AlexNet training process

Deepika B on 13 Feb 2020
Commented: Srivardhan Gadila on 25 Feb 2020
Is the model shown below overfitting or not? Sometimes the mini-batch accuracy is lower than the validation accuracy. Why is that?

Accepted Answer

Srivardhan Gadila on 19 Feb 2020
I think the model is not overfitting. The validation loss normally decreases during the initial phase of training, as does the training loss. However, when a network begins to overfit the data, the loss on the validation set typically begins to rise; here there is not much difference between the training loss and the validation loss. You can refer to Improve Shallow Neural Network Generalization and Avoid Overfitting for a better understanding of overfitting and of steps to avoid it.
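As a minimal sketch of how this comparison can be monitored in MATLAB (assuming Deep Learning Toolbox with the AlexNet support package, and hypothetical imageDatastore objects imdsTrain and imdsVal already split into training and validation sets), passing validation data to trainingOptions plots the training and validation curves together, so a rising validation loss is easy to spot:

% Sketch: fine-tune AlexNet while monitoring training vs. validation loss.
net = alexnet;                                  % pretrained network
inputSize = net.Layers(1).InputSize;            % 227x227x3 for AlexNet

numClasses = numel(categories(imdsTrain.Labels));
layers = [
    net.Layers(1:end-3)                         % keep pretrained feature layers
    fullyConnectedLayer(numClasses)             % new task-specific classifier
    softmaxLayer
    classificationLayer];

% Resize images to the network input size on the fly.
augTrain = augmentedImageDatastore(inputSize(1:2), imdsTrain);
augVal   = augmentedImageDatastore(inputSize(1:2), imdsVal);

options = trainingOptions('sgdm', ...
    'MiniBatchSize', 32, ...
    'MaxEpochs', 10, ...
    'InitialLearnRate', 1e-4, ...
    'ValidationData', augVal, ...               % validation curve on the same plot
    'ValidationFrequency', 30, ...
    'ValidationPatience', 5, ...                % stop early if validation loss stops improving
    'Plots', 'training-progress', ...
    'Verbose', false);

trainedNet = trainNetwork(augTrain, layers, options);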
The validation accuracy can be higher than the training (mini-batch) accuracy. One possible situation is when the network has layers that behave differently during prediction than during training, for example dropout layers. It also depends on how the training and validation data are split.
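As a small illustrative example (an assumption, not from this thread): a dropoutLayer randomly zeroes activations only while training, whereas predict and classify use the full network, which is one reason the mini-batch (training-mode) accuracy can trail the validation (prediction-mode) accuracy:

% Dropout is active during training but disabled at prediction time,
% so validation metrics are computed on the "cleaner" network.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    reluLayer
    dropoutLayer(0.5)                % active only during training
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];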
3 comments
Deepika B on 19 Feb 2020
Sir, I have another question: can a pretrained neural network be used for any new task, or only for certain specific tasks? Is VGG-19 suitable for an eye disease classification problem using fundus images on a CPU with 16 GB of RAM? Thank you in advance, sir.


More Answers (0)
