Neural Network neuron values

6 views (last 30 days)
Riley
Riley on 29 Jul 2014
Answered: Greg Heath on 13 Aug 2014
I'm interested in seeing how and when the inputs and weights are modified in a pre-trained standard patternnet classifier. For example, using the crab dataset (6 inputs, 2 layers, 2 outputs, and 10 hidden neurons), can the steps be confirmed as:
1) For each of the n hidden neurons, sum the products of the inputs i(1-6) with the input weights(n)(1-6)
2) Add the neuron-specific bias(n) to the sum
3) Normalize to the -1:1 range using mapminmax
4) For each of the 2 output neurons, sum the products of the hidden-neuron outputs with the layer-2 weights
5) Add the layer-2-specific bias.
6) Normalize again using mapminmax

Accepted Answer

Greg Heath
Greg Heath on 30 Jul 2014
Edited: Greg Heath on 1 Aug 2014
0) [ I N ] = size(input) (I = 6), [ O N ] = size(output) (O = 2)
1) Use mapminmax to obtain the normalized 6xN input matrix xn (each row mapped to the -1:1 range) from the original 6xN matrix x.
2) Add the 10-dimensional input bias column vector b1 to the product of the 10x6 input weight matrix IW with xn.
3) Apply the tansig function to the sum to get the 10-dimensional hidden node signal, h = tansig(b1 + IW*xn)
4) Obtain the normalized output yn by adding the 2-dimensional output bias column vector b2 to the product of the 2x10 output weight matrix LW with h.
5) Obtain the output y by un-normalizing yn using mapminmax.
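
A minimal sketch of steps 1)-4) in code, assuming a trained patternnet stored in net and the original 6xN input x (since the network was trained on x, fresh mapminmax statistics match its stored settings; the exact form of step 5 is revisited further down this thread):
xn = mapminmax(x);                     % 1) each input row mapped to [-1,1]
IW = net.IW{1,1}; b1 = net.b{1};       % 10x6 input weights, 10x1 bias
LW = net.LW{2,1}; b2 = net.b{2};       % 2x10 layer weights, 2x1 bias
h  = tansig(bsxfun(@plus, b1, IW*xn)); % 2)-3) hidden node signal
yn = bsxfun(@plus, b2, LW*h);          % 4) normalized output
% 5) The output step is revisited in the answers below.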
  1 comment
Riley
Riley on 30 Jul 2014
Edited: Riley on 30 Jul 2014
I've been trying without success; I think the problem lies in the reverse mapminmax. So far I have been unable to get comparable results. Normalizing the inputs by column seems to result in information loss. Also, net.b{1} is 10x1.
[x,t] = crab_dataset;
setdemorandstream(491218382)
net = patternnet(10);
[net,tr] = train(net,x,t);
nntraintool;
testY = net(x); % These are the results I expect to generate (y)
testIndices = vec2ind(testY);
IW = net.IW{1};
b1 = net.b{1}
b2 = net.b{2}
LW = net.LW{2}
x = x';
[x, PS] = mapminmax(x); %1)
x = x';
h = bsxfun(@plus, b1,IW*x) %2)
H = tansig(h) %3
yn = bsxfun(@plus, b2,(LW*h)) %4)
y = mapminmax('reverse', yn', PS) %5)


More Answers (2)

Greg Heath
Greg Heath on 1 Aug 2014
Edited: Greg Heath on 1 Aug 2014
1. THE BASIC PROBLEM IS THE DOCUMENTATION
Neither help patternnet nor doc patternnet indicates that
>> outputlayertransferfunction = net.layers{2}.transferFcn
outputlayertransferfunction = softmax
help softmax
doc softmax
type softmax
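For reference, softmax maps each column n of the layer's net input to exp(n)/sum(exp(n)), so every output is nonnegative and each column sums to 1. A quick check with made-up values:
n = [ -1 0.5; 2 1 ]  % illustrative 2x2 net input
a = softmax(n)       % same as exp(n)./repmat(sum(exp(n),1),2,1)
sum(a,1)             % each column sums to 1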
2. When debugging, it is usually best to use the data in the help and doc documentation.
Hope this helps
Thank you for formally accepting my answer
Greg
P.S. THANK YOU FOR THIS QUESTION. I DID NOT REALIZE THAT THE OUTPUT TRANSFER FUNCTION HAD BEEN CHANGED
  4 comments
Riley
Riley on 12 Aug 2014
I've since figured out this problem. Here is the working code:
close all, clear all, clc
[x,t] = crab_dataset;
[ I N ] = size(x)
[O N] = size(t)
setdemorandstream(491218382)
net = patternnet(10);
[net,tr] = train(net,x,t);
nntraintool;
testY = net(x);
testIndices = vec2ind(testY) %These are the results I expect my process to generate (y)
IW = net.IW{1,1};
b1 = net.b{1};
b2 = net.b{2};
LW = net.LW{2,1};
for i = 1:I
l = x(i,:);
xmin = min(l);
xmax = max(l);
xn(i,:) = -1+2*(l-xmin)/(xmax-xmin);
end
h = tansig(IW*xn+b1*ones(1,N));
yn = bsxfun(@plus, b2, LW*h);
Yn=softmax(yn);
Yn=vec2ind(Yn)
Greg Heath
Greg Heath on 13 Aug 2014
Good!
1. If possible, do not use commands that are already defaults; e.g., nntraintool.
2. Reserve "test" (and "val", "validate", or "validation") to refer to data division.
3. It is confusing to use the same variable on both sides of an assignment; e.g., x = x' and Yn = vec2ind(Yn).
4. It is less confusing to use lowercase i, j, k, l, m, n for loop variables.
5. A loop is unnecessary for normalization/unnormalization (see the sketch after this list).
6. It is confusing to mix uppercase and lowercase for noninteger variables. It helps to use uppercase only for cell variables and lowercase for doubles; e.g., x = cell2mat(X).
7. You never calculated the difference (mse and PctErr) between your two answers.
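
Regarding point 5, a possible loop-free version of the per-row normalization in the code above (a sketch; it assumes no input row is constant, so xmax - xmin is never zero):
xmin = min(x,[],2); % per-row minima, Ix1
xmax = max(x,[],2); % per-row maxima, Ix1
xn = 2*bsxfun(@rdivide, bsxfun(@minus, x, xmin), xmax - xmin) - 1;
A single call to mapminmax(x) produces the same result when fresh row statistics are acceptable.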



Greg Heath
Greg Heath on 13 Aug 2014
close all, clear all, clc
[x,t] = crab_dataset;
[ I N ] = size(x) % [ 6 200 ]
[O N] = size(t) % [ 2 200 ]
xt = [ x; t ];
minmaxxt = minmax(xt)
% minmaxxt = 0 1
% 7.2 23.1
% 6.5 20.2
% 14.7 47.6
% 17.1 54.6
% 6.1 21.6
% 0 1
% 0 1
% Outlier check
z = zscore(xt')' ;
minmaxz = minmax(z)
% minmaxz = -0.9975 0.9975
% -2.3983 2.1506
% -2.4243 2.8995
% -2.4449 2.1765
% -2.4536 2.3102
% -2.3156 2.2102
% -0.9975 0.9975
% -0.9975 0.9975
trueclassindices = vec2ind(t);
MSE00 = mean(var(t',1)) %0.025
% For this example ignore default trn/val/tst = 140/30/30 data division
% Ntrneq = Ntrn*(O-1), MSEtrn00 = mean(var(ttrn',0)), etc
setdemorandstream(491218382)
H = 10
Nw = (I+1)*H+(H+1)*O % 92 < Ntrneq = 140
net = patternnet(H);
[net,tr,y0,e] = train(net,x,t); % Note LHS
% y0=net(x); e = t-y0; tr contains trn/val/tst info;
NMSE = mse(e)/MSE00 % 0.04418
R2 = 1-NMSE % 0.95582
assignedindices = vec2ind(y0);
Nerr = sum(assignedindices~=trueclassindices) % 2
PctErr = 100*Nerr/N % 1
IW = net.IW{1,1}; % [ 10 6 ]
b1 = net.b{1}; % [ 10 1 ]
b2 = net.b{2}; % [ 2 1 ]
LW = net.LW{2,1}; % [ 2 10 ]
minx = repmat(min(x')',1,N);
maxx = repmat(max(x')',1,N);
xn = -1+2*(x-minx)./(maxx-minx);
h = tansig(IW*xn+b1*ones(1,N));
yn = LW*h + b2*ones(1,N);
y = softmax(yn);
check = mse(y-y0)/MSE00 % 1.8875e-31
whos IW LW b1 b2 xn h yn y
% Name Size Bytes Class
% IW 10x6 480 double
% LW 2x10 160 double
% b1 10x1 80 double
% b2 2x1 16 double
% xn 6x200 9600 double
% h 10x200 16000 double
% y 2x200 3200 double
% yn 2x200 3200 double
minmaxyny = minmax( [yn;y] )
% minmaxyny = -8.560 9.032
% -9.075 6.047
% 7.436e-7 1
% 1.502e-8 1
ind1 = find(t(1,:)==1);
ind2 = find(t(1,:)==0);
y1 = y(1,ind1)
y2 = y(1,ind2)
figure, hold on
plot( 1:N/2, y1, 'ro' )
plot( N/2+1:N, y2, 'bo' )
plot( 1:N, 0.5*ones(1,N), 'k--' )
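
As a follow-up to point 7 above (a hypothetical addition, not in the original listing), the class indices from the manual forward pass can also be compared with the network's own:
manualindices = vec2ind(y); % classes from the manual forward pass
Nerrmanual = sum(manualindices ~= assignedindices) % 0 if the two passes agree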
