
I need help with a backpropagation algorithm implementation!

alex on 1 Sept 2012
Hi All!
I have to implement a simple version of the backpropagation algorithm that recognizes the handwritten digits '2', '3', '4', and '8'.
I have a set of images of these characters, used for training and for testing the neural network after the training process.
BUT: my implementation goes wrong somewhere. If I train the network to recognize '2' and then test recognition of '2', all '2's are recognized.
BUT if I train the network to recognize '2', afterwards train it to recognize '3', and then test recognition, all '2's fail to be recognized.
I think I have an error in the error-update step where the weights of the graph representing the neural network are changed.
Here are the sites I used to learn about the algorithm: http://fann.sourceforge.net/report/node4.html http://www.speech.sri.com/people/anand/771/html/node37.html
HERE IS THE SUSPICIOUS PART OF THE CODE: CAN SOMEONE TELL ME WHAT I DID WRONG?
function [weightIH,weightHO]=backward_propagate(weightIH,weightHO,x,y,z,t,eta)
%Performs one backward-propagation pass through the neural network.
%Flow of updates:
%input<-hidden<-output
%Input:
%x - input values (the vectorized image)
%y - hidden-layer output signal
%z - actual output
%t - desired (target) output
%eta - step-size (learning-rate) parameter
%Output: updated weights of the neural network
%weightIH - input-to-hidden weights
%weightHO - hidden-to-output weights
%Row w(k,:) of a weight matrix holds the outgoing connection weights
%from node k to the nodes in the next layer.
t=t(:)';
z=z(:)';
y=y(:)';
x=x(:)';
[M,N]=size(weightIH); %M - input layer size, N - hidden layer size
% Output-layer error term
delta_k=z.*(1-z).*(t-z); %o(1-o)(t-o)
% Hidden-layer error term: compute it BEFORE weightHO is updated, so the
% old weights are used (the original code updated weightHO first)
sigma=weightHO*delta_k';
delta_j=y.*(1-y).*sigma';
% Hidden-to-output layer update
for k=1:N
    weightHO(k,:)=weightHO(k,:)+eta*delta_k.*y(k);
end
% Input-to-hidden layer update
for k=1:M
    weightIH(k,:)=weightIH(k,:)+eta*x(k)*delta_j;
end
end %function
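For context, a matching forward pass that would produce the y and z arguments might look like the sketch below. The function name forward_propagate and the logistic activation are assumptions, not part of the original post (the o(1-o) factors in the deltas imply a sigmoid):

```matlab
function [y,z]=forward_propagate(weightIH,weightHO,x)
%Hypothetical forward pass matching backward_propagate's conventions:
%row k of each weight matrix holds node k's outgoing connection weights.
sigmoid=@(a) 1./(1+exp(-a));  % logistic activation, implied by o(1-o) terms
x=x(:)';                      % 1-by-M input row vector
y=sigmoid(x*weightIH);        % 1-by-N hidden activations
z=sigmoid(y*weightHO);        % 1-by-L output activations
end
```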

Answers (1)

Greg Heath on 1 Sept 2012
You have to learn all of the cases simultaneously.
If you learn the twos and then learn the threes, the twos will be forgotten.
The cure is to learn a mixture of twos and threes.
In particular, if you learn on one set of data and another set becomes available later, you should train with the new set AND a sufficiently large subset of the previous data, so that the earlier cases are not forgotten.
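This mixed-training advice can be sketched as follows; it is a minimal example with assumed names (images2, images3, nEpochs, and a forward_propagate helper) that are not in the original post:

```matlab
% Sketch of interleaved training, assuming images2 and images3 are cell
% arrays of vectorized '2' and '3' images, and forward_propagate is a
% hypothetical helper returning hidden and output activations.
X = [images2(:); images3(:)];                 % pool both classes
T = [repmat([1 0],numel(images2),1); ...      % one-hot targets for '2'
     repmat([0 1],numel(images3),1)];         % one-hot targets for '3'
for epoch = 1:nEpochs
    order = randperm(size(T,1));              % fresh random mix each epoch
    for i = order
        [y,z] = forward_propagate(weightIH,weightHO,X{i});
        [weightIH,weightHO] = backward_propagate(weightIH,weightHO, ...
                                                 X{i},y,z,T(i,:),eta);
    end
end
```

Shuffling the pooled data each epoch ensures the network never sees a long run of a single class, which is what caused the '2's to be forgotten.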
Hope this helps.
Greg

