Prediction of the Sine Function using Neural Networks

16 views (last 30 days)
Pedro
Pedro on 15 May 2013
Commented: Greg Heath on 25 Mar 2016
My objective is to create a neural network that is able to predict the sine function. For that I have tried several types of networks, including a feed-forward network built with the Fitting Tool and a NARX net built with the Time Series Tool.
The sine has a period of 365.
Using the Fitting Tool (default configuration, except that I give it 5 neurons):
%The input I give for training is:
input = linspace(1,270,100); % I used several variations of this
target = sin(2*pi*input/365);
%Results:      Samples   MSE        R
%Training:     70        7.23e-7    9.9999e-1
%Validation:   15        6.84e-7    9.9999e-1
%Testing:      15        3.171e-6   9.99993e-1
These results look pretty good to me.
In the next step, I try to predict the remaining part of the function using the following sample:
pred_inp=linspace(271,365,100);
pred_targ= sin(2*pi*pred_inp/365);
% Results:   Samples   MSE          R
%            100       1.33175e-0   -3.6286e-1
%And this is where it gets crazy: sometimes it gives a good prediction,
%other times it just goes down.
%It gets even worse if I try to predict for more than one period:
pred_inp=linspace(271,730,100);
I have no idea what is going wrong. Could anyone here assist me, or show me another way to do this?
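For reference, the rough script equivalent of what I do with the Fitting Tool is the following (5 hidden neurons, everything else at its defaults; the GUI may differ in some details):
% Rough script equivalent of the Fitting Tool workflow described above
input  = linspace(1,270,100);           % training abscissa (less than one period)
target = sin(2*pi*input/365);
net = fitnet(5);                        % feed-forward fitting network, 5 hidden neurons
[net,tr] = train(net,input,target);     % default 70/15/15 random data division
% Try to extrapolate beyond the training interval
pred_inp  = linspace(271,365,100);
pred_targ = sin(2*pi*pred_inp/365);
pred_out  = net(pred_inp);
pred_mse  = perform(net,pred_targ,pred_out)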

Accepted Answer

Greg Heath
Greg Heath on 16 May 2013
The rule of thumb for predicting a sinusoidal function is (I think) that you have to train on at least 1.5 periods with at least 8 points per period. If this turns out to be wrong, try training on 2 periods with 20 points per period, then back off from there.
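For example, 2 periods at roughly 20 points per period would look something like this (just a sketch using fitnet, since the question uses the Fitting Tool; all other settings left at their defaults):
% Sketch: training data covering 2 full periods at about 20 points per period
T = 365;                              % period used in the question
input  = linspace(0,2*T,40);          % 2 periods, ~20 points per period
target = sin(2*pi*input/T);
net = fitnet(5);                      % feed-forward fitting net, 5 hidden neurons
[net,tr] = train(net,input,target);
% Compare against the true sine inside and beyond the training interval
x = linspace(0,3*T,300);
plot(x,sin(2*pi*x/T),x,net(x),'.'), grid on
legend('true sine','network output')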
Hope this helps.
Greg

More Answers (1)

Pedro
Pedro on 16 May 2013
Edited: Pedro on 16 May 2013
Thank you for your answer. I tried what you said, but without success. I even trained on up to 4 periods and tested whether it could predict anything up to another 1.5 periods. The prediction fails terribly.
What else can be done?
EDIT: I just tried a NARX network trained on 4 periods, just as I had tried with the Fitting Tool before, and it managed to give me a good prediction.
But why can't the fitting network do it?
  4 comments
Pedro
Pedro on 16 May 2013
In sum, let's stick to NARX networks for prediction, right?
On that subject, I created one using the GUI and the prediction ran just fine. Now I am trying to run it from a script, but it gives me this error:
Error using network/sim (line 280)
Number of inputs does not match net.numInputs.
Error in network/subsref (line 17)
otherwise, v = sim(vin,subs{:});
Error in SinusNarxNet (line 86)
pred = net(test_in,inputStates,layerStates);
The code I am using is:
clear,clc,close all
% Solve an Autoregression Problem with External Input with a NARX Neural Network
% Script generated by NTSTOOL
% Created Thu May 16 15:31:19 CEST 2013
%
% This script assumes these variables are defined:
%
% inputs - input time series.
% targets - feedback time series.
inputs = linspace(1,730,80);
targets = sin(2*pi*inputs/365);
inputSeries = tonndata(inputs,true,false);
targetSeries = tonndata(targets,true,false);
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged, while
% easily customizing it for networks with differing numbers of delays, with
% open loop or closed loop feedback modes.
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);
% Setup Division of Data for Training, Validation, Testing
% The function DIVIDERAND randomly assigns target values to training,
% validation and test sets during training.
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand'; % Divide data randomly
% The property DIVIDEMODE set to 'value' means that each individual target
% value is assigned to the training, validation or test set.
% For a list of data division modes type: help nntype_data_division_mode
net.divideMode = 'value'; % Divide up every value
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Training Function
% For a list of all training functions type: help nntrain
% Customize training parameters at: net.trainParam
net.trainFcn = 'trainlm'; % Levenberg-Marquardt
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
% Customize performance parameters at: net.performParam
net.performFcn = 'mse'; % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
% Customize plot parameters at: net.plotParam
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);
% Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% Prediction sample
test_in = linspace(731,1200,80);
test_tr = sin(2*pi*test_in/365);
inputSeries = tonndata(test_in,true,false);
targetSeries = tonndata(test_tr,true,false);
pred = net(test_in,inputStates,layerStates);
errors = gsubtract(test_tr,pred);
performance = perform(net,test_tr,pred);
% real function - for comparison
x = linspace(1,1200,1200);
y = sin(2*pi*x/365);
figure,plot(x,y,inputs,outputs,'*r',test_in,pred,'*g'),grid;
legend('sinus','training sample','prediction');
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(targets,tr.trainMask);
valTargets = gmultiply(targets,tr.valMask);
testTargets = gmultiply(targets,tr.testMask);
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
% View the Network
%view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotregression(targets,outputs)
%figure, plotresponse(targets,outputs)
%figure, ploterrcorr(errors)
%figure, plotinerrcorr(inputs,errors)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
view(netc)
[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(netc,tc,yc)
% Early Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is given y(t+1).
% For some applications such as decision making, it would help to have predicted
% y(t+1) once y(t) is available, but before the actual y(t+1) occurs.
% The network can be made to return its output a timestep early by removing one delay
% so that its minimal tap delay is now 0 instead of 1. The new network returns the
% same outputs as the original network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
view(nets)
[xs,xis,ais,ts] = preparets(nets,inputSeries,{},targetSeries);
ys = nets(xs,xis,ais);
earlyPredictPerformance = perform(nets,ts,ys)
Greg Heath
Greg Heath on 25 Mar 2016
1. There was no attempt to find the significant auto- and cross-correlation lags (a sketch follows below).
2. With smooth curves, the minimum number of hidden nodes is equal to the number of local extrema.
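For point 1, a minimal way to look for significant lags is something like the sketch below (plain MATLAB; the approximate 95% bound of 1.96/sqrt(N) is an assumption, not something from this thread):
% Sketch: lags where the target autocorrelation exceeds an approximate
% 95% significance bound, as a guide for choosing feedbackDelays in narxnet
t = sin(2*pi*linspace(1,730,80)/365);   % target series from the script above
t = t - mean(t);
N = numel(t);
maxlag = 20;
acf = zeros(1,maxlag);
for k = 1:maxlag
    acf(k) = sum(t(1+k:end).*t(1:end-k))/sum(t.^2);  % normalized autocorrelation at lag k
end
siglags = find(abs(acf) > 1.96/sqrt(N)) % candidate feedback delays to consider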
Hope this helps
Greg
