SUGGESTIONS TO IMPROVE OPEN LOOP NARNET MODEL PERFORMANCE FOR MULTISTEP AHEAD PREDICTIONS (WIND SPEED DATASET)
Dears, I am a student working on a project to predict wind speed, and I have also been learning on MATLAB Answers how to build my model; I have been reading and fixing for almost a year now. I got the steps and ideas/suggestions mainly from Sir Greg, and by following them I get good results for a dataset that ships with MATLAB, but for my own dataset it is not working. Let me explain: the steps below are explained by Sir Greg in many posts, so I started to follow them...
% 1. Use NNCORR to determine the statistically significant input and/or
%    feedback lags to be stored in row vectors ID and/or FD with lengths
%    LID and LFD, respectively.
%    SZ: Since I am using NARnet, I will only find FD.
% 2. Use an outer loop search over a range of ~10 values for the number of
%    hidden nodes, h = Hmin:dH:Hmax.
% 3. Use an inner loop search over Ntrials ~10 RNG states (i = 1:Ntrials) that
%    determine the random initial weights and the random trn/val/tst data division.
% 4. MSE goal ~ 0.01*mean(var(target',1))  (models ~99% of the target variance)
% 5. MinGrad = MSEgoal/100
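As a concrete instance of steps 4 and 5, here is a worked sketch of mine using the open-loop training-target variance varto1 = 0.0628 that the script below computes for simplenar_dataset:
% Worked sketch of steps 4-5 (single output, so mean(var(t',1)) = varto1)
MSEgoal = 0.01*0.0628 % = 6.2800e-04, the value used later in the script
MinGrad = MSEgoal/100 % = 6.2800e-06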
All of these steps were followed with the MATLAB dataset "simplenar_dataset", as shown below, and everything went smoothly. But when I use my own dataset, the NNCORR step does not give me a usable FD: I get a huge number of significant lags (~8400), which is far too many to apply; my computer gets stuck and cannot continue with the selections. Please advise what else I should try so that my model predicts multiple steps ahead with better accuracy, as it does with the MATLAB dataset.
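One possible workaround for the huge-lag problem, before the full walkthrough below, is a minimal sketch of mine (not from the original posts): cap the maximum lag that NNCORR evaluates and use the standard ~1.96/sqrt(N) white-noise bound as an approximate 95% significance threshold instead of the RNG simulation. The cap of 200 lags and the stand-in series are assumptions to be tuned to the real data.
% Sketch (assumptions flagged): cap the NNCORR lag search for a long series
t      = sin(0.05*(1:5000)) + 0.1*randn(1,5000); % stand-in for the wind series
N      = numel(t);
zt     = zscore(t,1);
maxlag = min(200, N-1);                    % assumed cap on the lag search
acf    = nncorr(zt, zt, maxlag, 'biased'); % length 2*maxlag+1, lag 0 at maxlag+1
poslag = acf(maxlag+2:2*maxlag+1);         % correlations at lags 1..maxlag
thresh = 1.96/sqrt(N);                     % approximate 95% white-noise bound
siglag = find(abs(poslag) >= thresh);      % statistically significant lags
d      = min(find(diff(siglag)>1))         % first gap => FD = 1:d, as in the recipe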
% 1. Use NNCORR to determine the statistically significant input and/or
% feedback lags to be stored in row vectors ID and/or FD with lengths
% LID and LFD, respectively. For NARnet you need FD only
% NNCORR
close all, clear all, clc , plt = 0;
S = simplenar_dataset;
T=S(1:80); % remaining 20 for prediction and test
t = cell2mat(T) ;
[ O N ] = size(t) % [ 1 80]
zt = zscore(t,1); tzt = [ t ; zt ];
meantzt = mean( tzt' )' %[ 0.7233 ;-0.0000]
vartzt1 = var( tzt', 1)' % [0.0635 ; 1 ]
minmaxtzt = minmax(tzt)
% minmaxtzt =
% 0.1622 0.9999
% -2.2273 1.0981
Ntst = round(0.15*N), Nval = Ntst % 12, 12
Ntrn = N-Nval-Ntst % 56
trnind = 1:Ntrn; % 1 : 56
valind = Ntrn+1:Ntrn+Nval; % 57 : 68
tstind = Ntrn+Nval+1:N; % 69 : 80
ttrn = t(trnind); tval = t(valind);ttst = t(tstind);
plt = plt+1, figure(plt), hold on
plot( trnind, ttrn,'b' )
plot( valind, tval,'g' )
plot( tstind, ttst, 'r' )
legend('TRAIN','VAL','TEST')
title( ' DATASET TIMESERIES ' )
rng('default');
Ltrn = floor(0.95*(2*Ntrn-1)) % 105
imax = 100
for i = 1:imax
n = randn(1,Ntrn);
autocorrn = nncorr( n,n, Ntrn-1, 'biased');
sortabsautocorrn = sort(abs(autocorrn));
thresh95(i) = sortabsautocorrn(Ltrn);
end
% imax = 100
minthresh95 = min(thresh95) % 0.0983
medthresh95 = median(thresh95) % 0.1841
meanthresh95 = mean(thresh95) % 0.1950
stdthresh95 = std(thresh95) % 0.0515
maxthresh95 = max(thresh95) % 0.3360
sigthresh95 = meanthresh95 % 0.1950
zttrn = zscore(ttrn',1)';
autocorrttrn = nncorr( zttrn, zttrn, Ntrn-1, 'biased' );
siglag95 = find(abs(autocorrttrn(Ntrn+1:2*Ntrn-1)) ...
>= sigthresh95);
Nsiglag95 = length(siglag95) % 30 ~ 0.54*Ntrn
d = min(find(diff(siglag95)>1)) % 5
proof = siglag95(d-1:d+1) % 4 5 7
plt = plt+1; figure(plt); hold on
plot(0:Ntrn-1, -sigthresh95*ones(1,Ntrn),'b--' );
plot(0:Ntrn-1, zeros(1,Ntrn),'k');
plot(0:Ntrn-1, sigthresh95*ones(1,Ntrn),'b--' );
plot(0:Ntrn-1, autocorrttrn(Ntrn:2*Ntrn-1));
plot( siglag95, autocorrttrn(Ntrn + siglag95), 'ro' );
title('SIGNIFICANT AUTOCORRELATIONS VALUES')
to = t(d+1:N);
No = N-d % 75
Notst = round(0.15*No) % 11
Noval = Notst % 11
Notrn = No-Noval-Notst % 53
% --------------------------------------------------------------------- %
% 2. Use an outer loop search over a range of ~10 values (for number of
% hidden nodes h = Hmin:dH:Hmax )
% 3. An inner loop search over Ntrials ~10 RNG states (i=1:Ntrials) that
% determine random initial weights and random trn/val/tst data division.
% -----------------------------------------------------------------------%
% finding the best suited FD and H
H = 10 % Default
neto = narnet(1:d,H); % d=5
neto.divideFcn = 'divideblock';
[ Xo, Xoi, Aoi, To ] = preparets(neto,{},{},T);
to = cell2mat(To); varto1 = var(to,1) % 0.0628
Hub = (Notrn*O-O)/(d+O+1) % 7.4286
Hmin = 1, dH = 1, Hmax = 7
Ntrials = 5
rng('default')
j = 0
for h = Hmin:dH:Hmax
j = j+1
neto.layers{1}.dimensions = h;
for i = 1:Ntrials
neto = configure(neto,Xo,To);
[ neto, tro, Yo, Eo, Xof, Aof ] = train( neto, Xo, To, Xoi, Aoi );
%[ Yo, Xof, Aof ] = neto( Xo, Xoi, Aoi ); Eo = gsubtract( To, Yo );
NMSEo(i,j) = mse(Eo)/varto1;
NMSEtrno(i,j) = tro.best_perf/varto1;
NMSEvalo(i,j) = tro.best_vperf/varto1;
NMSEtsto(i,j) = tro.best_tperf/varto1;
end
end
plotresponse(To,Yo) % Yo here is from the last loop pass (h = Hmax, trial Ntrials)
NumH = Hmin:dH:Hmax
Rsqo = 1-NMSEo
Rsqtrno = 1-NMSEtrno
Rsqvalo = 1-NMSEvalo
Rsqtsto = 1-NMSEtsto
% NumH =
%
% 1 2 3 4 5 6 7
%
%
% Rsqo =
%
% 0.9981 0.9992 0.9990 1.0000 1.0000 1.0000 1.0000
% 0.9981 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
% 0.9981 1.0000 0.9988 1.0000 1.0000 1.0000 1.0000
% 0.9980 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
% 0.9981 0.9994 1.0000 1.0000 1.0000 1.0000 1.0000
%
%
% Rsqtrno =
%
% 0.9979 0.9991 0.9988 1.0000 1.0000 1.0000 1.0000
% 0.9979 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
% 0.9979 1.0000 0.9986 1.0000 1.0000 1.0000 1.0000
% 0.9979 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
% 0.9979 0.9993 1.0000 1.0000 1.0000 1.0000 1.0000
%
%
% Rsqvalo =
%
% 0.9986 0.9999 0.9991 1.0000 1.0000 1.0000 1.0000
% 0.9986 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
% 0.9986 1.0000 0.9989 1.0000 1.0000 1.0000 1.0000
% 0.9986 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
% 0.9986 0.9998 1.0000 1.0000 1.0000 1.0000 1.0000
%
%
% Rsqtsto =
%
% 0.9984 0.9992 0.9996 1.0000 1.0000 1.0000 1.0000
% 0.9984 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
% 0.9984 1.0000 0.9993 1.0000 1.0000 1.0000 1.0000
% 0.9984 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
% 0.9984 0.9993 1.0000 1.0000 1.0000 1.0000 1.0000
%-------------------------------------------------------------------%
% From the results above the feedback delays are FD = 1:5. For the number
% of hidden neurons we should select the smallest H that gives
% Rsqtsto > 0.995; in this case H = 1 is sufficient.
%------------------------------------------------------------------%
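A small sketch of mine to automate that selection rule, assuming the matrices NumH and Rsqtsto from the loop above are still in the workspace:
% Smallest H whose median test Rsquare over the Ntrials runs meets 0.995
medRsqtst = median(Rsqtsto,1);            % median over the Ntrials rows
Hbest = NumH(find(medRsqtst >= 0.995, 1)) % smallest adequate H, here 1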
% now using the FD & H to train the network and predict multi steps ahead
% 4. MSE goal ~ 0.01*mean(var(target',1)) (Models ~99% of target variance)
% 5. MinGrad = MSEgoal/100
%
% MSEgoal = ~ 6.2800e-04
% MinGrad = ~6.2800e-06
close all, clear all, clc , plt = 0;
S = simplenar_dataset;
T= S(1:80);
[ I N ] = size(T) % [ 1 80 ]
FD = 1:5; H = 1; % FD from NNCORR, H from the search above
neto = narnet( FD, H );
neto.divideFcn = 'divideblock';
%view(neto)
[ Xo, Xoi, Aoi, To ] = preparets( neto, {}, {}, T );
to = cell2mat(To); varto1 = var(to,1)
neto.trainParam.epochs = 1000; % default
% MSEgoal = ~ 6.2800e-04
% MinGrad = ~6.2800e-06
neto.trainParam.goal= 6.2800e-04 ;
neto.trainParam.min_grad=6.2800e-06 ;
rng( 'default' )
[ neto, tro, Yo, Eo, Xof, Aof ] = train( neto, Xo, To, Xoi, Aoi );
%[ Yo, Xof, Aof ] = neto( Xo, Xoi, Aoi )
%Eo = gsubtract( To, Yo )
%view( neto )
NMSEo = mse( Eo ) /varto1 % 0.0024
yo = cell2mat( Yo );
% plot( d+1:N, yo, 'ro', 'LineWidth', 2 )
% axis( [ 0 100 0 1.3 ] )
% legend( 'TARGET', 'OUTPUT' )
% title( 'OPENLOOP NARNET RESULTS' )
% now converted to closed loop to do multi step predictions
[ netc, Xci, Aci ] = closeloop( neto, Xoi, Aoi );
%view( netc )
[ Xc, Xci, Aci, Tc ] = preparets( netc, {}, {}, T );
isequal( Tc, To ) , tc = to ; % 1
[ Yc, Xcf, Acf ] = netc( Xc, Xci, Aci );
Ec = gsubtract( Tc, Yc );
yc = cell2mat( Yc );
NMSEc = mse(Ec) /var(tc,1) %0.0245
Tp = S(81:100);
m = 1;
n = 6;
Tf = Tp(m:n); % 6 points: 5 initial feedback states (FD = 1:5) plus 1 step
[ Xc1, Xci1, Aci1, Tp1 ] = preparets( netc, {}, {}, Tf );
[ Yc1, Xcf, Acf ] = netc( Xc1, Xci1, Aci1 );
Xc2 = cell(1,10); % 10 empty input cells => predict 10 steps beyond Tf
[ Yc2, Xcf2, Acf2 ] = netc( Xc2, Xcf, Acf );
yc2 = cell2mat(Yc2);
Tpp = cell2mat(Tp);
plt = plt+1; figure(plt), hold on
plot( yc2,'b', 'LineWidth', 2 )
plot( Tpp(7:16), 'r', 'LineWidth', 2 )
legend( 'OUTPUT','TARGET')
title( 'PREDICTION RESULTS' )
% >> yc2
% yc2 =
% 0.9985 0.4726 0.6192 0.9853 0.8055 0.3500 0.9204 0.9794 0.3134 0.7738
%>> Tpp(7:16)
%ans =
% 0.9599 0.5351 0.6340 0.9911 0.7724 0.3774 0.9389 0.9925 0.2680 0.7762
%>> Ep = gsubtract( Tpp(7:16), yc2 )
%Ep =
% -0.0386 0.0625 0.0148 0.0058 -0.0331 0.0274 0.0186 0.0131 -0.0454 0.0024
%>> mse(Ep)
%ans =
% 0.0010
All of the above are the complete steps to design a NARnet model; each dataset needs its own FD, H, MSE goal and MinGrad, and of course its own training algorithm and data-division technique. For my data I tried changing all of these parameters and ran trials with different FD and H (since NNCORR is not giving good results either), but I am not getting the desired NMSEo to continue... :( Please help...
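When NNCORR is impractical, a coarse direct search over a few candidate delay depths may be worth trying. The sketch below is my own illustration, ranking candidates by open-loop validation NMSE; the candidate list and the stand-in series are assumptions.
% Sketch: coarse search over candidate feedback-delay depths d
T = con2seq(sin(0.05*(1:2000)) + 0.1*randn(1,2000)); % stand-in for the wind data
dcand = [ 2 4 6 12 24 48 ]; % assumed candidates, e.g. hours of memory
NMSEval = zeros(1,numel(dcand));
for k = 1:numel(dcand)
    net0 = narnet(1:dcand(k), 10); % H = 10 kept at the default
    net0.divideFcn = 'divideblock';
    [ Xo, Xoi, Aoi, To ] = preparets( net0, {}, {}, T );
    rng('default')
    [ net0, tr0 ] = train( net0, Xo, To, Xoi, Aoi );
    NMSEval(k) = tr0.best_vperf / var(cell2mat(To),1); % validation NMSE
end
[ ~, kbest ] = min(NMSEval); dbest = dcand(kbest) % lowest validation NMSE wins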
Below are the results for my data; I randomly selected d = 6 (FD = 1:6) and kept H = 10 as the default.
close all, clear all, clc , plt = 0;
data_1= xlsread('G:\SQU2\Solar_Wind_Project\MAtlabCodes\data_Nov13-Oct15');
T=data_1(1:92394,2);
T=con2seq(T');
[ I N ] = size(T) % [ 1 92394 ]
d=6; FD = 1:d; H = 10; % FD chosen by trial (NNCORR impractical here), H at default
neto = narnet( FD, H );
neto.divideFcn = 'divideblock';
%view(neto)
[ Xo, Xoi, Aoi, To ] = preparets( neto, {}, {}, T );
to = cell2mat(To); varto1 = var(to,1) % 12.7740
neto.trainParam.epochs = 1000; % default
% MSEgoal = ~ 0.1277
% MinGrad = ~0.0013
neto.trainParam.goal= 0.1277 ;
neto.trainParam.min_grad=0.0013 ;
rng( 'default' )
[ neto, tro, Yo, Eo, Xof, Aof ] = train( neto, Xo, To, Xoi, Aoi );
%[ Yo, Xof, Aof ] = neto( Xo, Xoi, Aoi )
%Eo = gsubtract( To, Yo )
%view( neto )
NMSEo = mse( Eo ) /varto1 % 0.0291
% now converted to closed loop to do multi step predictions
[ netc, Xci, Aci ] = closeloop( neto, Xoi, Aoi );
%view( netc )
[ Xc, Xci, Aci, Tc ] = preparets( netc, {}, {}, T );
%[ netc, trc, Yc, Ec, Xcf, Acf ] = train( netc, Xc, Tc, Xci, Aci );
isequal( Tc, To ) , tc = to ; % 1
[ Yc, Xcf, Acf ] = netc( Xc, Xci, Aci );
Ec = gsubtract( Tc, Yc );
yc = cell2mat( Yc );
NMSEc = mse(Ec) /var(tc,1) %1.9321
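One further step worth trying when NMSEc is this poor (a sketch; the commented-out train call above already hints at it) is to retrain the closed-loop net starting from the open-loop weights:
% Sketch: retrain the closed-loop net from the open-loop weights
netc.trainParam.epochs = 1000; % closed-loop training is slower than open loop
[ netc, trc, Yc, Ec, Xcf, Acf ] = train( netc, Xc, Tc, Xci, Aci );
NMSEc2 = mse(Ec)/var(tc,1) % compare with the 1.9321 above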