How to manually calculate a Neural Network output?
Jeff Chang
on 2 May 2018
Commented: DarZim on 4 Jan 2024
Dear everyone,
I am exploring the Neural Network Toolbox and would like to manually calculate the output by hand. I used one of the examples provided by MATLAB, with the code below. Unfortunately, my output is incorrect. Does anyone know why? Thanks.
%% Below is the sample code from MATLAB
[x, y] = crab_dataset;
size(x) % 6 x 200
size(y) % 2 x 200
setdemorandstream(491218382);
net = patternnet(10);
[net, tr_info] = train(net, x, y);
testX = x(:, tr_info.testInd);
testT = y(:, tr_info.testInd);
testY = net(testX);
testIndices = vec2ind(testY);
[error_rate, conf_mat] = confusion(testT, testY);
fprintf('Percentage Correct Classification : %f%%\n', 100*(1 - error_rate));
fprintf('Percentage Incorrect Classification : %f%%\n', 100*error_rate);
%% Manually calculate the output by hand
% nFeatures = 6
% nSamples = 200
% nHiddenNode = 10
% nClass = 2
% input layer => x (6x200)
% hidden layer => h = sigmoid(w1.x + b1)
% = (10x6)(6x200) + (10x1)
% = (10x200)
%
% output layer => yhat = w2.h + b2
% = (2x200)
w1 = net.IW{1}; % (10x6)
w2 = net.LW{2}; % (2x10)
b1 = net.b{1}; % (10x1)
b2 = net.b{2}; % (2x1)
h = sigmoid(w1*x + b1);
yhat = w2*h + b2;
[testY' yhat']
[vec2ind(testY)' vec2ind(yhat)']
0 comments
Accepted Answer
JESUS DAVID ARIZA ROYETH
on 2 May 2018
You missed several normalization parameters. Here is the solution:
[x, y] = crab_dataset;
size(x) % 6 x 200
size(y) % 2 x 200
setdemorandstream(491218382);
net = patternnet(10);
[net, tr_info] = train(net, x, y);
% mapminmax settings that were applied to the inputs during training
xoffset = net.inputs{1}.processSettings{1}.xoffset;
gain = net.inputs{1}.processSettings{1}.gain;
ymin = net.inputs{1}.processSettings{1}.ymin;
w1 = net.IW{1}; % (10x6)
w2 = net.LW{2}; % (2x10)
b1 = net.b{1}; % (10x1)
b2 = net.b{2}; % (2x1)
% Input 1: apply the mapminmax normalization used during training
y1 = bsxfun(@times,bsxfun(@minus,x,xoffset),gain);
y1 = bsxfun(@plus,y1,ymin);
% Layer 1: tansig (hyperbolic tangent sigmoid) hidden layer
a1 = 2 ./ (1 + exp(-2*(repmat(b1,1,size(x,2)) + w1*y1))) - 1;
% Output layer: softmax
n = repmat(b2,1,size(x,2)) + w2*a1;
nmax = max(n,[],1);
n = bsxfun(@minus,n,nmax);
num = exp(n);
den = sum(num,1);
den(den == 0) = 1;
y2 = bsxfun(@rdivide,num,den); % y2 == net(x)
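As a quick sanity check (a small sketch reusing the variables computed above), the manual result can be compared directly with the toolbox forward pass:
% Sanity check: the manual forward pass should agree with net(x)
% up to floating-point round-off.
outputnet = net(x);
max(max(abs(y2 - outputnet))) % expected to be on the order of 1e-15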
2 comments
Sadaf Jabeen
on 14 Mar 2022
How can I compute the same with a linear activation function and no bias values? As sketched below.
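A minimal sketch of that case, assuming both layers use purelin transfer functions and the bias terms are simply dropped (and that the mapminmax input normalization from the accepted answer still applies):
% Hypothetical purelin / no-bias forward pass, reusing w1, w2, xoffset,
% gain and ymin from the accepted answer above.
xn = bsxfun(@plus, bsxfun(@times, bsxfun(@minus, x, xoffset), gain), ymin);
a1 = w1*xn;   % linear hidden layer, no bias
yhat = w2*a1; % linear output layer, no bias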
More Answers (3)
Amir Qolami
on 12 Apr 2020
Edited: Amir Qolami on 12 Apr 2020
In{i} and Out{i} are the inputs and outputs of the i-th hidden (and output) layer. There are two rescalings: one before the input layer and one after the output layer.
function output = NET(net,inputs)
    w = cellfun(@transpose,[net.IW{1},net.LW(2:size(net.LW,1)+1:end)],'UniformOutput',false);
    b = cellfun(@transpose,net.b','UniformOutput',false);
    tf = cellfun(@(x)x.transferFcn,net.layers','UniformOutput',false);
    %% mapminmax on inputs
    if strcmp(net.inputs{1}.processFcns{:},'mapminmax')
        xoffset = net.inputs{1}.processSettings{1}.xoffset;
        gain = net.inputs{1}.processSettings{1}.gain;
        ymin = net.inputs{1}.processSettings{1}.ymin;
        In0 = bsxfun(@plus,bsxfun(@times,bsxfun(@minus,inputs,xoffset),gain),ymin);
    else
        In0 = inputs;
    end
    %% propagate through the layers
    In = cell(1,length(w));
    Out = In;
    In{1} = In0'*w{1}+b{1};
    Out{1} = eval([tf{1},'(In{1})']);
    for i=2:length(w)
        In{i} = Out{i-1}*w{i}+b{i};
        Out{i} = eval([tf{i},'(In{',num2str(i),'})']);
    end
    %% reverse mapminmax on outputs
    if strcmp(net.outputs{end}.processFcns{:},'mapminmax')
        gain = net.outputs{end}.processSettings{:}.gain;
        ymin = net.outputs{end}.processSettings{:}.ymin;
        xoffset = net.outputs{end}.processSettings{:}.xoffset;
        output = bsxfun(@plus,bsxfun(@rdivide,bsxfun(@minus,Out{end},ymin),gain),xoffset);
    else
        output = Out{end};
    end
end
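For example, a small usage sketch (assuming the network's input and output processing is the single mapminmax step this function expects):
% Compare the manual forward pass with the toolbox output.
% Note: NET returns samples in rows, while net(x) returns them in columns.
manualY = NET(net, x);
toolboxY = net(x);
max(max(abs(manualY' - toolboxY))) % should be near machine precision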
2 comments
Soumitra Sitole
on 20 Apr 2022
Edited: Soumitra Sitole on 20 Apr 2022
Thanks, this also worked for a relatively deep regression network.
Shounak Mitra
on 8 Oct 2018
Unfortunately, using deepDreamImage() is not possible in this case.
0 comments
See Also
Categories
Find more on Define Shallow Neural Network Architectures in Help Center and File Exchange.