Why do I get a different result from NN sim and a manual calculation using the weights?
I used newff to generate a very simple neural network with one input layer, one hidden layer (just one neuron, for easy manual calculation), and one output layer, like this:
>>x1=[-15:1:15];
>>y1=0.05*x1.^3-0.2*x1.^2-3*x1+20;
% the hidden layer and the output layer both use 'purelin' as the transfer function
>>net=newff(x1,y1, 1, {'purelin'});
% train the network
>>net=train(net,x1,y1);
% using 15 as a test input, the output from sim is 49.8170
>> output=sim(net, 15)
output = 49.8170
% I also manually calculated the output using 15 as the input.
% First I obtained the input weights, the layer weights, and the biases:
>>IW=net.IW
IW =
[0.6972]
[]
>>LW=net.LW
LW =
[] []
[0.5009] []
>> b=net.b
b =
[-0.5228]
[ 0.5173]
% then my manual calculation is:
>> (15*0.6972-0.5228)*0.5009+0.5173
ans = 5.4938
Why are the sim output and my manual calculation different? Where am I going wrong?
Answers (4)
Thijs
on 15 May 2012
After some MATLAB hackery I think I found the answer: both the inputs and the outputs of the neural network are mapped onto a min/max range. The inputs are mapped from [-15, 15] onto [-1, 1] (so 15 becomes 1), and the mapped input is then passed through the network: purelin(purelin(1*0.6972 - 0.5228)*0.5009 + 0.5173) = 0.6047 (approximately). This output is then mapped back from [-1, 1] onto [ymin, ymax] of the targets.
You can find ymin and ymax by entering: >> net.outputs{2}.range
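As a quick numeric check (a sketch, assuming the default mapminmax mapping onto [-1, 1] and taking the target range from net.outputs{2}.range):
a = (1*0.6972 - 0.5228)*0.5009 + 0.5173   % normalized network output, about 0.6047
r = net.outputs{2}.range;                 % [min max] of the training targets
y = (a + 1)/2*(r(2) - r(1)) + r(1)        % about 49.82, i.e. sim(net, 15) up to the rounding of the displayed weights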
Conclusion: to manually duplicate the network output:
input = 15;
% find the input and output ranges (the min/max of the training data)
input_range  = net.inputs{1}.range;
output_range = net.outputs{2}.range;
% map the input onto [-1, 1] (the mapminmax default)
in = 2*(input - input_range(1))/(input_range(2) - input_range(1)) - 1;
% pass the mapped input through the net (both layers are purelin)
temp = LW{2,1}*(IW{1}*in + b{1}) + b{2}
% map the output back from [-1, 1] onto the target range
y1 = (temp + 1)/2*(output_range(2) - output_range(1)) + output_range(1)
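Equivalently, you can let mapminmax do the mapping itself. A minimal sketch, assuming mapminmax is one of the network's processing functions (its settings are looked up by name instead of hard-coding an index):
in_idx  = find(strcmp(net.inputs{1}.processFcns,  'mapminmax'));
out_idx = find(strcmp(net.outputs{2}.processFcns, 'mapminmax'));
in_set  = net.inputs{1}.processSettings{in_idx};
out_set = net.outputs{2}.processSettings{out_idx};
xn = mapminmax('apply', 15, in_set);       % normalized input, here 1
an = LW{2,1}*(IW{1}*xn + b{1}) + b{2};     % both layers are purelin
y  = mapminmax('reverse', an, out_set)     % should match sim(net, 15)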
hassan khatir
on 19 Jul 2023
Edited: hassan khatir on 19 Jul 2023
Use this function (it replicates sim for a network with a tansig hidden layer, a purelin output layer, and mapminmax input/output processing):
function y2 = sim2(net, x)
% replicate sim() by hand: mapminmax in, tansig hidden layer, purelin output, mapminmax out
% input mapminmax settings (assumed to be processSettings{1})
xoffset = net.inputs{1}.processSettings{1}.xoffset;
gain    = net.inputs{1}.processSettings{1}.gain;
ymin    = net.inputs{1}.processSettings{1}.ymin;
% weights and biases (sizes shown for my 6-input, 10-neuron, 2-output net)
w1 = net.IW{1};   % hidden-layer weights (10x6)
w2 = net.LW{2,1}; % output-layer weights (2x10)
b1 = net.b{1};    % hidden-layer bias (10x1)
b2 = net.b{2};    % output-layer bias (2x1)
% Input 1: map the raw inputs onto the mapminmax range
y1 = (x - xoffset).*gain + ymin;
% Layer 1: tansig written out explicitly
a1 = 2 ./ (1 + exp(-2*(repmat(b1,1,size(x,2)) + w1*y1))) - 1;
% Output layer: purelin
outputs = repmat(b2,1,size(x,2)) + w2*a1;
% reverse the output mapminmax mapping
gain    = net.outputs{2}.processSettings{1}.gain;
ymin    = net.outputs{2}.processSettings{1}.ymin;
xoffset = net.outputs{2}.processSettings{1}.xoffset;
y2 = (outputs - ymin)./gain + xoffset;
end
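For example, a quick check (a sketch; fitnet(10) is just an illustration, and the processing functions are reduced to mapminmax so that processSettings{1} is what sim2 reads):
x1 = -15:1:15;
y1 = 0.05*x1.^3 - 0.2*x1.^2 - 3*x1 + 20;
net = fitnet(10);                           % tansig hidden layer, purelin output
net.inputs{1}.processFcns  = {'mapminmax'};
net.outputs{2}.processFcns = {'mapminmax'};
net = train(net, x1, y1);
max(abs(sim(net, x1) - sim2(net, x1)))      % should be close to zero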