
Prediction using NARX Network

6 views (last 30 days)
Unnikrishnan PC on 19 Sep 2014
Commented: Nawras Kh on 23 Feb 2020
% Neural network to create a Fibonacci series
u=[1 2 3 4 5 6 7 8 9 10]; %Input Series
y=[1 2 3 5 8 13 21 34 55 89]; % Target series
[u,us] = mapminmax(u);
[y,ys] = mapminmax(y);
y = con2seq(y);
u = con2seq(u);
d1 = [1:2];
d2 = [1:2];
narx_net = narxnet(d1,d2,10);
narx_net.divideFcn = '';
narx_net.trainParam.epochs = 1000;
narx_net.trainParam.min_grad = 1e-10;
[p,Pi,Ai,t] = preparets(narx_net,u,{},y);
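% preparets shifts the series by the chosen delays: p holds the delayed
% inputs, Pi the initial input-delay states, and t the targets y(3:end)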
% Train the Network-Open Loop
narx_net = train(narx_net,p,t,Pi);
% Simulate the Network-Open Loop
yp = sim(narx_net,p,Pi);
y_again=mapminmax('reverse',yp,ys)
%view(narx_net); %error_OL = cell2mat(yp)-cell2mat(y(3:end));
%Close narx net for future prediction
narx_net_closed = closeloop(narx_net);
[p1,Pi1,Ai1,t1] = preparets(narx_net_closed,u,{},y);
% Train the Network-Closed Loop
% narx_net_closed = train(narx_net_closed,p1,t1,Pi1);
% Simulate the Network-Closed Loop
yp1 = narx_net_closed(p1,Pi1,Ai1);
yp1_again=mapminmax('reverse',yp1,ys)
Please answer the following questions:
1. How can I make a one-step prediction without closing the loop?
2. How can I get the next 5 numbers in the series? Please provide the code if possible. I went through all your posts but could not solve it.
3. When I close the NARX net, I get the same results as the open loop, without training.
4. If I train the closed loop, the outputs deviate from the target. How can I reduce this error?
Thanks in advance. Regards
  1 comment
Nawras Kh on 23 Feb 2020
Hello, please can you help me? I need the NARX code to predict my data.


Accepted Answer

Greg Heath on 19 Sep 2014
The Fibonacci series does not result from an input/output relationship; it is autoregressive:
Either y(1:2) = [ 0 1 ] or y(1:2) = [ 1 1 ] and then
y(n+1) = y(n) + y(n-1)
Obviously this can be implemented with a NARNET WITH NO HIDDEN NODES.
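For illustration, a minimal sketch of that idea (not code from this thread; it assumes the standard narnet / preparets / closeloop workflow, with an empty hiddenSizes argument giving a network with no hidden layer):
% Sketch only: NAR net with feedback delays 1:2 and no hidden layer. The
% single output neuron sees only y(n-1) and y(n-2), which is all the
% structure the linear recurrence y(n) = y(n-1) + y(n-2) needs.
T = { 1 2 3 5 8 13 21 34 55 89 };  % Fibonacci targets, no external input
net = narnet( 1:2, [] );           % feedback delays 1:2, empty hiddenSizes
net.divideFcn = '';                % too few points for a train/val/test split
[ Xs, Xi, Ai, Ts ] = preparets( net, {}, {}, T );
net = train( net, Xs, Ts, Xi, Ai );
[ Ys, Xf, Af ] = net( Xs, Xi, Ai );            % open-loop, one-step-ahead
[ netc, Xic, Aic ] = closeloop( net, Xf, Af ); % feed outputs back
Ynext = netc( cell(0,5), Xic, Aic )            % continue 5 steps past the data
The last line iterates the closed loop five steps beyond the known series, which is the NAR analogue of question 2.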
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (1)

Greg Heath on 25 Sep 2014
% Prediction using narx :
% Sent by Unnikrishnan P.C. on 09/19/14 1:45 PM
% Neural network to create a Fibonacci series
close all, clear all, clc, plt = 0
X = { 1 2 3 4 5 6 7 8 9 10 }; % Input Series
T = { 1 2 3 5 8 13 21 34 55 89 }; % Target series
d1 = [ 1:2 ]; d2 = [ 1:2 ];
net = narxnet( d1, d2, [] );
net.divideFcn = '';
[ Xs, Xi, Ai, Ts ] = preparets( net, X, {}, T );
whos X T Xs Xi Ai Ts
% Name   Size    Bytes   Class
% Ai     1x0     0       cell  % ==> Timedelaynet
% T      1x10    1200    cell
% Ts     1x8     960     cell  % T(3:end)
% X      1x10    1200    cell
% Xi     2x2     480     cell  % { X(1:2) ; T(1:2) }
% Xs     2x8     1920    cell  % { X(3:end) ; T(3:end) }
xs = cell2mat(Xs);
xs1 = xs(1,:);
ts = cell2mat( Ts );
MSE00s = var( ts',1 ) % 789
% Open-Loop Training
rng('default')
view(net)
[ net tr Ys Es Xf Af ] = train( net, Xs, Ts, Xi, Ai);
view(net)
% Ys = net( Xs, Xi, Ai );
% Es = gsubtract(Ts, Ys)
whos Xs Ts Xi Ai Ys Es Xf Af
% Name   Size    Bytes   Class
% Af     1x0     0       cell
% Ai     1x0     0       cell
% Es     1x8     960     cell
% Ts     1x8     960     cell
% Xf     2x2     480     cell
% Xi     2x2     480     cell
% Xs     2x8     1920    cell
% Ys     1x8     960     cell
ys = cell2mat( Ys );
es = cell2mat( Es );
R2s = 1 - mse( es )/MSE00s % 1
IW = net.IW{:} % 0.023 -0.023
b = net.b{:} % 1.0632
LW = net.LW{:} % []
[ xsn xsettings ] = mapminmax(xs);
[ tsn tsettings ] = mapminmax(ts);
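% Reconstruct the open-loop output by hand from the learned weights,
% applied to the mapminmax-normalized input rows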
ysn = tansig(b(1) + IW*xsn);
plt = plt+1, figure( plt )
subplot( 211 )
title('Fibonacci Series Model')
hold on
plot( xs1, ts )
plot( xs1, ys, 'ro' )
legend('Fibonacci','Open Loop Model',2)
subplot(212)
plot( xs1, es, 'ko-' )
legend('Open Loop Model Error')
%Close narx net for future prediction
[ netc Xic Aic ] = closeloop( net, Xi, Ai );
isequalX = isequal(Xic,Xi) % 0
isequalA = isequal(Aic,Ai) % 0
[ Xc, Xic, Aic, Tc ] = preparets( netc, X, {}, T );
isequalT = isequal(Tc,Ts)
tc = ts;
MSE00c = MSE00s % 789
Yc = netc( Xc, Xic, Aic );
yc = cell2mat(Yc);
R2c = 1-mse(tc-yc)/MSE00c % 1
IW = netc.IW{:} % 0.023 -0.023
b = netc.b{:} % 1.0632
LW = netc.LW{:} % 1 1
% Train the Closed Loop Network?
[ netc trc Yc Ec Xfc Afc ] = train(netc, Xc, Tc, Xic, Aic);
IW = netc.IW{:} % 0.023 -0.023
b = netc.b{:} % 1.0632
LW = netc.LW{:} % 1 1
% Predict Future Outputs
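% The external input is continued past the original series (11:20); with the
% loop closed, predicted outputs are fed back in place of measured targets,
% so Ycf is a multi-step-ahead forecast.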
Xcf = con2seq(11:20);
Ycf = netc( Xcf , Xfc, Afc );
xcf = cell2mat( Xcf );
ycf = cell2mat( Ycf );
plt=plt+1, figure(plt)
subplot(211)
hold on
plot( xs1, ts )
plot( xs1, ys, 'ro' )
subplot(212)
hold on
plot( xs1, ts )
plot( xs1, ys, 'ro' )
plot(xcf, ycf, 'r-o')
