Back Propogation Algorithm
The training input vectors and target vectors are read from the files data1in and data1out, respectively. The number of nodes in the input and output layers is determined by the number of rows in these datasets.
The number of hidden layers, the number of nodes in each hidden layer, and the target error (e.g. 0.1) are entered by the user.
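The MATLAB source is not shown on this page, so as a rough illustration in Python, here is how the layer sizes could be derived from the two files. The inline strings standing in for data1in and data1out, and the one-row-per-node layout, are assumptions based on the description above:

```python
# Hypothetical contents of data1in / data1out (assumption: one row per
# node, one whitespace-separated column per training sample).
data1in = "0.0 0.5 1.0\n1.0 0.5 0.0"
data1out = "0.0 0.25 1.0"

# Parse each file into a list of rows of floats.
inputs = [[float(v) for v in line.split()] for line in data1in.splitlines()]
targets = [[float(v) for v in line.split()] for line in data1out.splitlines()]

# As described above, the layer sizes follow from the number of rows.
n_input_nodes = len(inputs)    # -> 2 input nodes
n_output_nodes = len(targets)  # -> 1 output node
```

With real files you would read the text with `open(...).read()` instead of the inline strings; the sizing logic is the same.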
The learning curve is plotted after every 100 epochs.
The learning rate can be varied using the slider at the bottom. This idea was adapted from an algorithm created by AliReza KashaniPour & Phil Brierley.
The activation function is logsig for the hidden layers and linear for the output layer.
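A minimal sketch of the training loop this description implies: logsig hidden units, a linear output layer, and plain gradient descent, with the per-epoch error collected as the learning curve. This is an illustrative Python reimplementation under those assumptions, not the submission's actual MATLAB code; the toy dataset and network sizes are made up for the example:

```python
import math
import random

def logsig(x):
    """Logistic sigmoid -- the hidden-layer activation named above."""
    return 1.0 / (1.0 + math.exp(-x))

def train_step(x, t, W1, b1, W2, b2, lr):
    """One gradient-descent update on a single sample; returns squared error."""
    # Forward pass: logsig hidden layer, linear output layer.
    h = [logsig(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    y = [sum(w * hj for w, hj in zip(row, h)) + b
         for row, b in zip(W2, b2)]
    # Output deltas: the output activation is linear, so delta = (y - t).
    d_out = [yi - ti for yi, ti in zip(y, t)]
    # Hidden deltas: the derivative of logsig is h * (1 - h).
    d_hid = [hj * (1.0 - hj) * sum(W2[k][j] * d_out[k] for k in range(len(d_out)))
             for j, hj in enumerate(h)]
    # Weight updates; lr plays the role of the learning-rate slider.
    for k in range(len(W2)):
        for j in range(len(h)):
            W2[k][j] -= lr * d_out[k] * h[j]
        b2[k] -= lr * d_out[k]
    for j in range(len(W1)):
        for i in range(len(x)):
            W1[j][i] -= lr * d_hid[j] * x[i]
        b1[j] -= lr * d_hid[j]
    return sum(e * e for e in d_out)

# Toy function-approximation task (assumption): learn y = x^2 on a few points.
random.seed(0)
samples = [([x], [x * x]) for x in (0.0, 0.25, 0.5, 0.75, 1.0)]
n_in, n_hid, n_out = 1, 4, 1
W1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hid)]
b1 = [0.0] * n_hid
W2 = [[random.uniform(-0.5, 0.5) for _ in range(n_hid)] for _ in range(n_out)]
b2 = [0.0] * n_out

# errors is the learning curve; the MATLAB version plots it every 100 epochs.
errors = []
for epoch in range(2000):
    total = sum(train_step(x, t, W1, b1, W2, b2, lr=0.1) for x, t in samples)
    errors.append(total)
```

Training would stop once `errors[-1]` falls below the user-supplied target error (0.1 in the description above).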
Just press F5 and have fun!
anshuman0387[at]yahoo[dot]com :)
Cite As
Anshuman Gupta (2024). Back Propogation Algorithm (https://www.mathworks.com/matlabcentral/fileexchange/23528-back-propogation-algorithm), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Platform Compatibility
Windows macOS Linux
Categories
- AI, Data Science, and Statistics > Deep Learning Toolbox > Function Approximation, Clustering, and Control > Function Approximation and Clustering > Define Shallow Neural Network Architectures
Tags
Acknowledgements
Inspired by: Function Approximation Using Neural Network Without using Toolbox
Version | Published | Release Notes
---|---|---
1.0.0.0 | |