BACK PROPAGATION WITH 2 HIDDEN LAYERS
NAYANA KAMAL on 17 Jun 2014
Commented: Greg Heath on 20 Jun 2014
I need to train a neural network with 2 hidden layers. Please post the MATLAB code for 2 hidden layers. In nntool I can only change the number of hidden nodes; how can I change the number of hidden layers?
Accepted Answer
Greg Heath on 18 Jun 2014
My recommendation is to FIRST use one hidden layer and try to minimize the number of hidden nodes while achieving an adjusted R-square >= 0.99. For more advice, I need more information regarding the number and dimensions of the input/target examples, as well as an explanation of what you are trying to model.
I have posted tens of examples. Search on
greg fitnet % for regression/curve-fitting
greg patternnet % for classification/pattern-recognition
Once a single hidden layer solution is found, you can find a double hidden layer solution by changing fitnet(H) to fitnet([H1,H2]). I explain how to choose H. You are on your own with [H1,H2] except I have heard that, typically, H1+H2 < H. However, I don't believe a proof exists.
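For reference, a minimal sketch of both calls (the layer sizes and dataset below are example values of mine, not from the thread); it assumes x (inputs) and t (targets) already exist:

% Minimal sketch: single vs. double hidden layer with fitnet
[x, t] = simplefit_dataset;     % example data; replace with your own x, t
H = 10;                         % single-hidden-layer size to be tuned
net1 = fitnet(H);
net1 = train(net1, x, t);
H1 = 7;  H2 = 3;                % example sizes for the two-hidden-layer net
net2 = fitnet([H1 H2]);         % row vector => one entry per hidden layer
net2 = train(net2, x, t);
y = net2(x);                    % network outputs for the training inputs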
Hope this helps.
Thank you for formally accepting my answer
Greg
2 comments
Greg Heath on 20 Jun 2014
Again, 1 hidden layer is sufficient for minimizing MSE. Using more hidden layers does not guarantee a better result. You just need to increase the number of hidden nodes, H, AND try multiple (e.g., 10) cases of initial weights for each value of H.
See my examples. Search using
greg fitnet Ntrials
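A rough sketch of that H/Ntrials search (the loop limits, the plain R^2 computed on all of the data, and the variable names are my assumptions, not Greg's posted code):

% Sketch: try Ntrials random weight initializations for each candidate H
Hmax    = 20;                               % assumed upper limit on hidden nodes
Ntrials = 10;                               % random-initialization trials per H
MSE00   = mean(var(t', 1));                 % MSE of the naive constant (mean) model
bestR2  = -Inf;
for H = 1:Hmax
    for trial = 1:Ntrials
        net = fitnet(H);                    % fresh net => new random initial weights
        net.trainParam.showWindow = false;  % suppress the training GUI
        net = train(net, x, t);
        y   = net(x);
        R2  = 1 - mean((t(:) - y(:)).^2) / MSE00;   % R^2 relative to the naive model
        if R2 > bestR2
            bestR2 = R2;  bestH = H;  bestnet = net;
        end
    end
end
fprintf('Best R^2 = %.4f at H = %d\n', bestR2, bestH)

Note this computes the plain R^2 over all of the data; Greg recommends the adjusted R-square, which additionally penalizes the number of estimated weights.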