How can I define Hidden layers with custom functions?
Combining traditional linear models and ANNs is usually done in two steps (fit the linear model first, then model the residuals with a nonlinear network). To avoid this model-specification error, a hybrid model with parallel linear and nonlinear components, each with its own weight, has to be created. Instead of y = L + N, it is suggested to use y = w31*L + w32*N + b3 (w31 is the weight between the linear and output layers, w32 between the nonlinear and output layers, and b3 the output bias). I have just started using MATLAB to create this model. I have tried to write code that produces the desired architecture with NARNET-based training. Here is my code:
T = tonndata(Data,false,false);
trainFcn = 'trainlm'; % Levenberg-Marquardt
feedbackDelays = 1:2;
hiddenSizes = [8 8]; % layer 1: linear component, layer 2: nonlinear component
net = narnet(feedbackDelays,hiddenSizes,'open',trainFcn);
net.biasConnect = [1; 1; 1];
net.inputConnect = [1; 1; 0];
net.layerConnect = [0 0 0; 0 0 0; 1 1 0];
net.outputConnect = [0 0 1];
net.inputWeights{2,1}.delays = 1:2;
net.layers{1}.transferFcn = 'purelin';
net.layers{1}.initFcn = 'initwb';
net.layers{2}.transferFcn = 'tansig';
net.layers{2}.initFcn = 'initnw';
net.initFcn = 'initlay';
net.performFcn = 'mse';
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
'ploterrcorr', 'plotinerrcorr'};
net.input.processFcns = {'removeconstantrows','mapminmax'};
[x,xi,ai,t] = preparets(net,{},{},T);
net.divideFcn = 'dividerand';
net.divideMode = 'time';
net.divideParam.trainRatio = 80/100;
net.divideParam.valRatio = 11/100;
net.divideParam.testRatio = 9/100;
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y)
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
view(net)
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)
netc = closeloop(net);
[xc,xic,aic,tc] = preparets(netc,{},{},T);
yc = netc(xc,xic,aic);
perfc = perform(netc,tc,yc)
[x1,xio,aio,t] = preparets(net,{},{},T);
[y1,xfo,afo] = net(x1,xio,aio);
[netc,xic,aic] = closeloop(net,xfo,afo);
[y2,xfc,afc] = netc(cell(0,5),xic,aic);
nets = removedelay(net);
[xs,xis,ais,ts] = preparets(nets,{},{},T);
ys = nets(xs,xis,ais);
stepAheadPerformance = perform(nets,ts,ys)
if (false)
genFunction(net,'myNeuralNetworkFunction');
y = myNeuralNetworkFunction(x,xi,ai);
end
if (false)
genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
x1 = cell2mat(x(1,:));
xi1 = cell2mat(xi(1,:));
y = myNeuralNetworkFunction(x1,xi1);
end
if (false)
gensim(net);
end
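The script above trains only the NARNET; the combination y = w31*L + w32*N + b3 can also be fitted after the fact, outside the network object. A minimal sketch, assuming `yLin` is a hypothetical 1-by-T vector of in-sample predictions from a separate linear model (not produced by the code above), with `y` and `t` being the NARNET outputs and targets from the script:

```matlab
% Hedged sketch: fit the combination weights y = w31*L + w32*N + b3
% by ordinary least squares.
% Assumption: yLin holds linear-model predictions aligned with the
% NARNET outputs (hypothetical variable, not defined above).
Lc = cell2mat(yLin);               % linear component predictions
Nc = cell2mat(y);                  % nonlinear (NARNET) predictions
tv = cell2mat(t);                  % targets aligned by preparets
A  = [Lc' Nc' ones(numel(tv),1)];  % design matrix [L N 1]
w  = A \ tv';                      % [w31; w32; b3] by least squares
yHybrid = w(1)*Lc + w(2)*Nc + w(3);
mseHybrid = mean((tv - yHybrid).^2)
```

This treats the combination weights as a final regression step rather than training them jointly inside the network, which is a simplification of the parallel architecture described above.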
The original idea for this kind of hybrid model was published by Ufuk Yolcu, Erol Egrioglu, Cagdas H. Aladag (2013), "A new linear & nonlinear artificial neural network model for time series forecasting". Images from this work are below. The authors used a multiplicative neuron model and particle swarm optimisation, and state that they used MATLAB for all modelling.
I think the architecture I built (picture below) should do the job. I don't know how to combine the layers using a parallel function, but that shouldn't be an issue.
1. How can I define the hidden layers with the functions (net1 and net2) from the original work?
2. Any comments on my decision to use NARNET instead of the multiplicative neuron model? (Personally, I don't see why it shouldn't work.)
3. Do you think this model (after assigning the net1 and net2 functions to the layers) will be able to capture the linear and nonlinear components of the time series, given that the current code does nothing specific for the linear component?
UPDATE. I found out that I should use either the linearlayer or newlind function to define the first hidden layer, but I don't know how to do it. Any suggestions?
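As a starting point, here is a minimal sketch of the newlind route for a standalone linear predictor (assuming Data is a 1-by-N row vector; the delays are chosen to match feedbackDelays = 1:2 above):

```matlab
% Hedged sketch: design a purely linear autoregressive predictor with
% newlind, which solves for the weights by least squares in one step.
% Assumption: Data is a 1-by-N row vector.
P = [Data(2:end-1); Data(1:end-2)];  % rows: y(t-1), y(t-2)
Tlin = Data(3:end);                  % target: y(t)
netLin = newlind(P, Tlin);           % one-step least-squares design
yLin = netLin(P);                    % linear component predictions
```

Note this builds a separate single-layer linear network; wiring it in as the first hidden layer of the NARNET above would still require copying its weights into net.IW and net.b manually.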
3 comments
Greg Heath
26 Mar 2015
No. Your original premise is incorrect.
The single hidden layer MLP is a universal approximator for bounded nonlinear functions. For example:
y = B2 + LW * tanh( B1 + IW * x ) % FITNET
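A minimal sketch of this formula in action (an assumption-laden illustration, not part of Greg's comment): after training a one-hidden-layer fitnet, the network output can be reproduced directly from the weight matrices. Input/output preprocessing is disabled here so the raw weight algebra matches net(x).

```matlab
% Sketch: a single-hidden-layer MLP computes y = B2 + LW*tanh(B1 + IW*x).
net = fitnet(8);
net.inputs{1}.processFcns = {};   % disable input preprocessing
net.outputs{2}.processFcns = {};  % disable output preprocessing
x = rand(1,50); tgt = sin(3*x);
net = train(net, x, tgt);
IW = net.IW{1,1}; B1 = net.b{1};  % input weights and hidden bias
LW = net.LW{2,1}; B2 = net.b{2};  % layer weights and output bias
yManual = B2 + LW * tanh(B1 + IW*x);  % should match net(x)
```

Here tanh is used because tansig(n) is mathematically identical to tanh(n).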
Answers (0)