How can I predict future values of time series in neural network ?
3 views (last 30 days)
Ugur Can
on 4 Mar 2016
Answered: Abolfazl Nejatian
on 23 Nov 2018
I have a time series of internet traffic rates: 14772 rows and 1 column. I use NARnet from the NN Time Series Toolbox and train it with 70% of the series and test with 30%. I need the MAPE, so I split TargetSeries (the actual values from the .xlsx file) into two matrices: TrainSeries (the first 10340 values) and TestSeries (the last 4432 values), and I calculate the MAPE between TestSeries and the network's output for those last 4432 values. Finally, I need to predict future values of the time series, for example the 15000th value, but I am confused. I left the Closed Loop and Step-Ahead Prediction sections commented out because I don't know how to use them. How do I predict future values: with the Closed Loop Network, with the Step-Ahead Prediction Network, or with both? I searched the internet and found a code piece by Greg Health, but I don't know how to arrange its parameters for my problem (e.g. Xf, Af, Xs, Xi):
[ net tr ] = train( net, Xs, Ts, Xi, Ai );
[ Ys Xf Af ] = net( Xs, Xi, Ai );
Es = gsubtract(Ts,Ys);
Finally, to predict into the future M timesteps beyond the end of the target data
Xic2 = Xf;
Aic2 = Af;
Ypred = netc2( cell(1,M), Xic2, Aic2);
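As far as I understand it (I am not sure this is right), those variables would come from PREPARETS and from simulating the trained open-loop net, and netc2 would be a closed-loop copy of it, roughly:
[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, TargetSeries); % open-loop data and delay states
[net, tr] = train(net, Xs, Ts, Xi, Ai);
[Ys, Xf, Af] = net(Xs, Xi, Ai);                          % Xf, Af = final delay states
netc2 = closeloop(net);                                  % closed-loop copy for Ypred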
%%%%%%%%%%%%%%%% Here is my code :
% Solve an Autoregression Time-Series Problem with a NAR Neural Network
% Script generated by Neural Time Series app
% Created 20-Feb-2016 15:46:59
data = xlsread('input.xlsx');           % 14772x1 column of traffic rates
TargetSeries = data.';                  % transpose to a 1x14772 row vector
TargetSeries = num2cell(TargetSeries);  % convert to cell array for preparets
% Choose a Training Function
trainFcn = 'trainlm'; % Levenberg-Marquardt backpropagation.
% Create a Nonlinear Autoregressive Network
feedbackDelays = 1:17;
hiddenLayerSize = 5;
net = narnet(feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
% Setup Division of Data for Training, Validation, Testing
net.trainParam.mu_max=1.00e+18;
net.trainParam.epochs=50;
net.trainParam.max_fail=1;
net.performParam.normalization ='standard';
net.divideFcn='divideind';
%[trainInd,testInd] = divideind(14772,1:10340,10341:14772);
net.divideParam.trainInd = 1:10340;
net.divideParam.valInd = 10340:10340;
net.divideParam.testInd = 10341:14772;
[x,xi,ai,t] = preparets(net,{},{},TargetSeries);
% Split the target series for the MAPE calculation:
% first 10340 values for training, last 4432 values for testing
TrainSeries  = TargetSeries(:,1:10340);
TrainSeriess = cell2mat(TrainSeries);
TestSeries   = TargetSeries(:,10341:14772);
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
% Test the Network
y = net(TestSeries,xi,ai);
performance = perform(net,TestSeries,y);
% View the Network
view(net)
E=cell2mat(TestSeries);
M=cell2mat(y);
%Plots
%plot(M,'b*');
%hold on
%plot(E,'r*');
%legend('PredictionValues','TargetValues');
%MAPE
MAPE1 = abs(M-E);               % absolute errors
MAPE2 = abs(MAPE1./E);          % absolute percentage errors
MAPE  = (sum(MAPE2)/length(M))*100;
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
% netc = closeloop(net);
% netc.name = [net.name ' - Closed Loop'];
% view(netc)
% [xc,xic,aic,tc] = preparets(netc,{},{},TargetSeries);
% yc = netc(xc,xic,aic);
% closedLoopPerformance = perform(net,tc,yc)
% Step-Ahead Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is
% given y(t+1). For some applications such as decision making, it would
% help to have predicted y(t+1) once y(t) is available, but before the
% actual y(t+1) occurs. The network can be made to return its output a
% timestep early by removing one delay so that its minimal tap delay is now
% 0 instead of 1. The new network returns the same outputs as the original
% network, but outputs are shifted left one timestep.
% nets = removedelay(net);
% nets.name = [net.name ' - Predict One Step Ahead'];
% view(nets)
% [xs,xis,ais,ts] = preparets(nets,{},{},TargetSeries);
% ys = nets(xs,xis,ais);
% stepAheadPerformance = perform(nets,ts,ys);
2 comments
Accepted Answer
Greg Heath
on 4 Mar 2016
0. There is no lower case "L" in Heath
1. Capitals for cells, lower case for doubles
2. OL and 'o' for OpenLoop, CL and 'c' for Closed Loop
3. Search the NEWSGROUP and ANSWERS (hit counts NEWSGROUP / ANSWERS):
narnet: 40 / 165
narnet greg: 14 / 144
narnet tutorial: 8 / 38
4. Apply your code to the example data in help/doc narnet and/or one of the other example datasets in help/doc nndatasets
5. Run the example(s) with all defaults except divideblock before considering your own data with nondefault settings
Hope this helps
Thank you for formally accepting my answer
Greg
close all, clear all, clc, plt=0
T = simplenar_dataset;
t = cell2mat(T); [ I N ] = size(t) % [ 1 100 ]
vart1 = var(t,1) % MSE Reference 0.063306
% In general vart1 = mean(var(t',1))
Ntst = round(0.15*N), Nval = Ntst % 15, 15
Ntrn = N-Nval-Ntst % 70
% ASSUME no statistical differences in trn/val/tst
% subsets so that DIVIDEBLOCK can be used
trnind = 1:Ntrn; valind= Ntrn+1:Ntrn+Nval;
tstind = Ntrn+Nval+1:N;
ttrn = t(trnind); tval = t(valind); ttst=t(tstind);
plt = plt+1, figure(plt), hold on
plot(trnind,ttrn,'k','LineWidth',2)
plot(valind,tval,'b','LineWidth',2)
plot(tstind,ttst,'g','LineWidth',2)
% Plot shows no significant statistical differences
% in trn/val/tst subsets
% In general:
% A. deduce significant positive feedback delay lags, FD, from the
% autocorrelation function of ttrn (see the sketch after these notes)
% B. For MSEgoal = vart1/200, determine the smallest
% successful number of hidden nodes, H, by trial and error
% C. For 1st run use defaults except for DIVIDEBLOCK
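% --- A rough sketch of one way to carry out step A (assumes ttrn and
% Ntrn from above): estimate the autocorrelation of the standardized
% training targets in base MATLAB and keep the lags whose correlation
% exceeds an approximate 95% significance level. The resulting siglags
% are candidates for FD; the example below simply uses FD = 1:2.
zt     = (ttrn - mean(ttrn)) / std(ttrn);     % standardized training targets
maxlag = 20;                                  % illustrative maximum lag
acf    = zeros(1,maxlag);
for lag = 1:maxlag
    acf(lag) = mean( zt(1:end-lag).*zt(1+lag:end) );
end
siglags = find( abs(acf) > 1.96/sqrt(Ntrn) )  % candidate feedback delays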
FD =1:2, H = 10
neto = narnet; neto.divideFcn = 'divideblock';
[ Xo ,Xoi, Aoi, To ] = preparets( neto, {}, {}, T );
to = cell2mat(To); varto1 = var(to,1) %0.061307
[ neto tro Yo Eo Xof Aof] = train( neto , Xo, To, Xoi, Aoi );
% Yo = neto( Xo, Xoi, Aoi ); Eo = gsubtract( To, Yo );
NMSEo = mse(Eo)/varto1 % 6.3328e-09
% Use training record tro to isolate predicted future
% nontraining (i.e., val and test) outputs and performance.
% For further predictions, must use the CL configuration.
%BUG WARNING: Division indices and ratios in tro are not
% consistent with those used above and stored in neto.
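A rough sketch of those two follow-up steps, assuming the variables neto, tro, to, Yo, Xof and Aof created above, an illustrative horizon M, and the CLOSELOOP form that converts final open-loop states to closed-loop initial states (exact calling forms can vary between toolbox releases):
% Isolate the nontraining (test) outputs and performance via tro
yo       = cell2mat(Yo);
tstindo  = tro.testInd;                        % test timesteps recorded by TRAIN
NMSEtsto = mse( to(tstindo)-yo(tstindo) ) / var( to(tstindo), 1 )
% Close the loop, seeding it with the final open-loop states Xof, Aof,
% then predict M timesteps beyond the end of the target data
[ netc, Xci, Aci ] = closeloop( neto, Xof, Aof );
M = 10;                                        % illustrative horizon
Ypred = netc( cell(0,M), Xci, Aci )            % CL narnet takes no external input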
0 comments
More Answers (1)
Abolfazl Nejatian
on 23 Nov 2018
Here is my code; it predicts time series data using both deep learning and shallow learning algorithms.
Best wishes,
Abolfazl Nejatian
0 comments