Weights don't initialize.
I created the following network:
P = dataH;                                  % inputs (user data)
T = dataXsm;                                % targets (user data)
net = network;                              % custom network object
net.numInputs = 1;
net.numLayers = 3;
net.biasConnect(1) = 1;                     % biases on all three layers
net.biasConnect(2) = 1;
net.biasConnect(3) = 1;
net.inputConnect = [1; 0; 0];               % input feeds layer 1 only
net.layerConnect = [0 0 0; 1 0 0; 0 1 0];   % feedforward chain 1 -> 2 -> 3
net.outputConnect = [0 0 1];                % network output from layer 3
net.inputs{1}.size = 2;
net.layers{1}.size = 2;
net.layers{1}.transferFcn = 'hardlim';
net.layers{1}.initFcn = 'initnw';
net.layers{2}.size = 10;
net.layers{2}.transferFcn = 'hardlim';
net.layers{2}.initFcn = 'initnw';
net.layers{3}.size = 10;
net.layers{3}.initFcn = 'initnw';
net.layers{3}.transferFcn = 'hardlim';
net.initFcn = 'initlay';                    % initialize layer by layer
net.IW{1,1}, net.IW{2,1},                   % inspect input weights
net.LW{3,2}                                 % inspect layer weights
net.b{1}, net.b{3}                          % inspect biases
net.trainFcn = 'trainc';
net.performFcn = 'sse';
net.adaptFcn = 'trains';
net.trainParam.goal = 0.01;
net.trainParam.epochs = 100;
net.trainParam.passes = 1;
net = init(net);                            % initialize weights and biases
a = sim(net,P), e = T - a                   % simulate and compute error
net = train(net,P,T);
net.adaptParam.passes = 100;
[net,a,e] = adapt(net,P,T); e
twts = net.IW, tbiase = net.b               % show trained weights and biases
but it doesn't work: the weights don't initialize and the output is all 1s:
twts =
    [2x2 double]
    []
    []
a =
    1 1 1 ... 1
    ...
    1 1 1 ... 1
Is something wrong with the layer connections, or am I initializing something incorrectly?
Accepted Answer
Vito on 30 Oct 2011
No. A multilayer perceptron shouldn't use 'hardlim': hardlim can classify only linearly separable sets, and a network of two or more layers is meant for problems that are not linearly separable. Use 'logsig' instead.
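Worth noting: hardlim is a hard 0/1 step, so its derivative is zero everywhere it is defined, which is why gradient-based training cannot move the weights. A minimal sketch comparing the two transfer functions (assuming the standard toolbox hardlim and logsig functions):
x = -2:0.5:2;
hardlim(x)   % hard 0/1 step: no useful gradient for training
logsig(x)    % smooth sigmoid: differentiable everywhere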
An equivalent network, built as a multilayer perceptron:
P = [0 1 0 1; 0 0 1 1];     % AND-gate inputs
T = [0 0 0 1];              % AND-gate targets
net = newff(minmax(P),[2,10,1],{'logsig','logsig','logsig'},'trainbfg');
net.trainParam.epochs = 100;
net = init(net);
net.IW{1,1}, net.IW{2,1},   % IW{2,1} is empty: the input connects to layer 1 only
net.LW{3,2}
net.b{1}, net.b{3}
net = train(net,P,T);
a = sim(net,P)              % simulate the trained network
'trainbfg' is BFGS quasi-Newton backpropagation training.
The problem was an error in the network design.
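For reference, a minimal sketch of the same network using the newer feedforwardnet interface (assuming a toolbox version that provides it; newff is no longer recommended in recent releases):
P = [0 1 0 1; 0 0 1 1];
T = [0 0 0 1];
net = feedforwardnet([2 10],'trainbfg');   % two hidden layers, BFGS training
net.layers{1}.transferFcn = 'logsig';      % smooth, differentiable activations
net.layers{2}.transferFcn = 'logsig';
net.layers{3}.transferFcn = 'logsig';      % match the logsig output layer above
net.trainParam.epochs = 100;
net = train(net,P,T);
a = net(P)                                 % simulate the trained network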
1 comment
Greg Heath on 31 Oct 2011
Typically, only one hidden layer is needed.
Use as many defaults as possible (help newff).
newff automatically initializes the weights with initnw.
If [I N] = size(p) and [O N] = size(t), then there are Neq = N*O training equations and Nw = (I+1)*H + (H+1)*O unknown weights. For accurate weight estimation it is desirable that Neq >> Nw. Typically Neq >= 10*Nw is adequate; however, sometimes a larger ratio (e.g., > 30) is needed, and sometimes a smaller ratio (e.g., 2) will suffice.
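A minimal sketch of that bookkeeping (the sizes I, O, N, H below are made-up values for illustration, not from this thread):
I = 2;      % number of input variables
O = 1;      % number of output variables
N = 100;    % number of training cases
H = 10;     % hidden-layer size
Neq = N*O                    % number of training equations
Nw  = (I+1)*H + (H+1)*O      % number of unknown weights
Neq/Nw                       % aim for a ratio of roughly 10 or more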
Hope this helps.
Greg