How to predict the next value using a time series?

2 views (last 30 days)
WT
WT on 24 Oct 2014
Edited: WT on 29 Oct 2014
Hi, is there a way to use the Neural Network Toolbox to predict the next value in a time series?

Accepted Answer

Greg Heath
Greg Heath on 29 Oct 2014
1. Override 'dividerand' to obtain constant intervals between points. For example
net.divideFcn = 'divideblock'
2. Obtain the final states Xf,Af from training
[net,tr,Ys,Es,Xf,Af] = train(net,Xs,Ts,Xi,Ai);
%Ys = net(Xs,Xi,Ai);
%Es = gsubtract(Ts,Ys);
3. To predict points after the original data, you have to know the new initial state conditions Xinew, Ainew. Then
Ynew = net(NaN(n),Xinew,Ainew)
will yield n predictions. If Xinew,Ainew = Xf,Af, then Ynew is a gapless continuation of Ys.
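A rough end-to-end sketch of these three steps for a NAR network with no external input is below, using the closed-loop form of the net for the final continuation (the three-output form of closeloop, if your toolbox version supports it). T, FD, H and n are placeholder names, not variables from the original post:
% T is the target series as a 1xN cell array, e.g. T = simplenar_dataset;
FD = 1:4;                        % feedback delays (assumed)
H = 10;                          % hidden layer size (assumed)
net = narnet(FD,H);
net.divideFcn = 'divideblock';   % step 1: contiguous train/val/test blocks
[Xs,Xi,Ai,Ts] = preparets(net,{},{},T);
[net,tr,Ys,Es,Xf,Af] = train(net,Xs,Ts,Xi,Ai);   % step 2: keep the final states Xf,Af
% step 3: close the loop, convert the open-loop final states into
% closed-loop initial states, and run n more steps with no new data
n = 10;                          % number of future points to predict
[netc,Xic,Aic] = closeloop(net,Xf,Af);
Ynew = netc(cell(0,n),Xic,Aic);  % gapless continuation of Ys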
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 comment
WT
WT on 29 Oct 2014
Edited: WT on 29 Oct 2014
NaN(n) is not a number? So can I put the input and target variables in as dummy variables? If Xinew,Ainew = Xf,Af, do I then run Ynew = net(NaN(n),Xf,Af)? And what value do I put for n?


More Answers (2)

Greg Heath
Greg Heath on 24 Oct 2014
Try
net = narnet(1);
For details see
help narnet
doc narnet
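A minimal open-loop example along those lines, using the toolbox demo series simplenar_dataset in place of your own data, might look like this:
T = simplenar_dataset;             % 1xN cell array of scalar values
net = narnet(1:2,10);              % feedback delays 1:2, 10 hidden neurons
[Xs,Xi,Ai,Ts] = preparets(net,{},{},T);
net = train(net,Xs,Ts,Xi,Ai);
Y = net(Xs,Xi,Ai);                 % one-step-ahead predictions of the series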
Hope this helps
Thank you for formally accepting my answer
Greg
  1 comment
WT
WT on 25 Oct 2014
Sorry, I think my question is unclear. I am trying to use ntstool to predict the value with the code shown below. What I don't understand is why I can't set my target values to a dummy value (e.g. zero) and generate predicted outputs; the outputs I get will then be zero too.
The code:
% Solve an Autoregression Problem with External Input with a NARX Neural Network
% Script generated by NTSTOOL
% Created Thu Oct 23 12:36:34 SGT 2014
%
% This script assumes these variables are defined:
%
%   CPI_IN - input time series.
%   CPI_Target_OUT - feedback time series.
inputSeries = tonndata(CPI_IN,true,false);
targetSeries = tonndata(CPI_Target_OUT,true,false);

% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:4;
feedbackDelays = 1:4;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);

% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};

% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged, while
% easily customizing it for networks with differing numbers of delays, with
% open loop or closed loop feedback modes.
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);

% Setup Division of Data for Training, Validation, Testing
% The function DIVIDERAND randomly assigns target values to training,
% validation and test sets during training.
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
% The property DIVIDEMODE set to TIMESTEP means that targets are divided
% into training, validation and test sets according to timesteps.
% For a list of data division modes type: help nntype_data_division_mode
net.divideMode = 'value';  % Divide up every value
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;

% Choose a Training Function
% For a list of all training functions type: help nntrain
% Customize training parameters at: net.trainParam
net.trainFcn = 'trainlm';  % Levenberg-Marquardt

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
% Customize performance parameters at: net.performParam
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
% Customize plot parameters at: net.plotParam
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
    'ploterrcorr', 'plotinerrcorr'};

% Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);

% Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)

% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(targets,tr.trainMask);
valTargets = gmultiply(targets,tr.valMask);
testTargets = gmultiply(targets,tr.testMask);
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotregression(targets,outputs)
%figure, plotresponse(targets,outputs)
%figure, ploterrcorr(errors)
%figure, plotinerrcorr(inputs,errors)

% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
view(netc)
[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(netc,tc,yc)

% Early Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is given y(t+1).
% For some applications such as decision making, it would help to have predicted
% y(t+1) once y(t) is available, but before the actual y(t+1) occurs.
% The network can be made to return its output a timestep early by removing one delay
% so that its minimal tap delay is now 0 instead of 1. The new network returns the
% same outputs as the original network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
view(nets)
[xs,xis,ais,ts] = preparets(nets,inputSeries,{},targetSeries);
ys = nets(xs,xis,ais);
earlyPredictPerformance = perform(nets,ts,ys)



Star Strider
Star Strider on 24 Oct 2014
I would start by using ntstool.
  3 comments
Star Strider
Star Strider on 25 Oct 2014
The targets are the desired outputs for your training data. If you have no training targets, the net has nothing to adapt its weights with respect to. The correctly trained net will then produce predictions as its outputs, based on the input data.
WT
WT on 29 Oct 2014
Edited: WT on 29 Oct 2014
Then what are we supposed to do in order to do multi-step-ahead prediction? I know the code MATLAB generated only produces one-step-ahead predictions, and those results are quite accurate, but I have no idea how to do multi-step-ahead prediction. If the targets are not known, is it possible to predict using the closed-loop prediction step?
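For reference, one common way to do this with the closed-loop network netc from the generated script is sketched below. numSteps and futureInputs are placeholder names, and a NARX net still needs future values of the external input (CPI_IN here) to forecast beyond the data, so holding the last known input constant is only an assumption for illustration:
[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);
[yc,xfc,afc] = netc(xc,xic,aic);                      % run over the known data, keep the final states
numSteps = 12;                                        % how many steps beyond the data (assumed)
futureInputs = repmat(inputSeries(end),1,numSteps);   % assumed future values of the external input
yFuture = netc(futureInputs,xfc,afc);                 % multi-step-ahead forecast, no targets needed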

