neural network training in a loop
I am trying to train a neural network over several iterations, using a FOR loop to set the number of training epochs (I need this as preparation for an experiment). However, the results of such training differ from the results of the standard training process with the same number of epochs. I suspect some training settings are automatically adjusted at each iteration, but I can't find which exactly. I'd appreciate any help/clues. Here is the code that illustrates the problem:
p = rand(2,10);
t = 2*p(1,:) + p(2,:) + 3;
unet = feedforwardnet(2);
unet.divideFcn = '';
unet = configure(unet,p,t);
unet.trainParam.showWindow = 0;
mnet = unet;
mnet.trainParam.epochs = 1;
for i = 1:5
mnet = train(mnet,p,t);
end
anet = unet;
anet.trainParam.epochs = 5;
anet = train(anet,p,t);
I expected anet (trained with standard 5-epoch training) and mnet (trained 5 times with 1-epoch training) to be the same, i.e., to have the same weights in IW/LW/b, but that's not the case. Thanks, Eugene
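A minimal way to check how far the two nets actually diverge (a sketch using getwb, which returns all weights and biases of a network as a single vector; this snippet is not part of the original question):
wb_m = getwb(mnet); % all weights and biases of mnet as one vector
wb_a = getwb(anet); % same for anet
max(abs(wb_m - wb_a)) % nonzero here confirms the two runs diverge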
Answers (1)
Greg Heath
on 4 Jul 2016
1. The number of epochs needed to reach a satisfactory result depends on the random initial weights and the random data division. Therefore, to be able to reproduce previous results, ALWAYS initialize the RNG to an initial state of your choice.
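For example (a minimal sketch, not from the original answer; rng(0) is just one arbitrary seed choice), seeding the RNG before each run makes both the initial weights and the data division repeatable:
p = rand(2,10);
t = 2*p(1,:) + p(2,:) + 3;
rng(0) % fix the RNG state before training
net1 = train(feedforwardnet(2), p, t);
rng(0) % restore the same state
net2 = train(feedforwardnet(2), p, t);
isequal(getwb(net1), getwb(net2)) % should return true: identical weights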
2. Feedforwardnet is a generic net automatically called by
a. FITNET, specialized for regression and curve fitting
b. PATTERNNET, specialized for classification and pattern recognition
3. Typically, it is better to use the specialized versions AND the examples given in the help and doc documentation:
help fitnet
doc fitnet
Additional examples can be found using the command
help nndatasets
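A quick sketch of the specialized route (simplefit_dataset is one of the sample datasets listed by nndatasets; the snippet is illustrative, not from the original answer):
[x, t] = simplefit_dataset; % sample curve-fitting data
net = fitnet(10); % FITNET wraps FEEDFORWARDNET for regression
net = train(net, x, t);
y = net(x);
perf = perform(net, t, y) % mean squared error by default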
4. ALWAYS try to plot and familiarize yourself with the data.
5. For a smooth fit, it will take at least NLE hidden nodes, where NLE = Number of Local Extrema.
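For instance (a hedged illustration of the NLE rule of thumb, not part of the original post): a sine over two periods has four local extrema, so use at least four hidden nodes:
x = linspace(0, 4*pi, 200);
t = sin(x); % two periods => NLE = 4
net = fitnet(4); % at least NLE hidden nodes
net.trainParam.showWindow = 0;
net = train(net, x, t);
plot(x, t, 'b', x, net(x), 'r--') % compare target and fit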
rng(0) % Initialize the RNG (any fixed seed will do)
p = rand(2,10);
t = 2*p(1,:) + p(2,:) + 3;
plot3(p(1,:), p(2,:), t, 'o') % Plot t vs p
unet = feedforwardnet(2);
% What makes you think 2 hidden nodes is appropriate??
unet.divideFcn = ''; % Equivalent to 'dividetrain'
unet = configure(unet,p,t);
% An empty net will be automatically configured by TRAIN.
% However, for a nonempty net, TRAIN will continue from the
% existing weights. Therefore, CONFIGURE is only necessary when
% removing existing weights and reinitializing settings.
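% Illustrative contrast (not in the original post):
%   net1 = train(unet, p, t);     % trains from unet's current weights
%   net2 = train(net1, p, t);     % continues from net1's weights
%   net3 = configure(net1, p, t); % weights reinitialized from scratch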
unet.trainParam.showWindow = 0;
mnet = unet;
mnet.trainParam.epochs = 1;
for i = 1:5
mnet = train(mnet,p,t);
end
% This is not doing what you think, because several training parameters, e.g., mu, are automatically reinitialized every time TRAIN is called.
anet = unet;
anet.trainParam.epochs = 5;
anet = train(anet,p,t);
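To illustrate the mu point (a hedged sketch, not part of the original answer; it assumes the default TRAINLM algorithm): the training record tr returned by TRAIN logs mu at each epoch, so the last value can at least be carried into the next one-epoch call, though other internal state may still differ from a single 5-epoch run:
mnet = unet;
mnet.trainParam.epochs = 1;
for i = 1:5
    [mnet, tr] = train(mnet, p, t); % tr.mu logs mu at each epoch
    mnet.trainParam.mu = tr.mu(end); % resume from the last mu value
end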
Hope this helps.
Thank you for formally accepting my answer
Greg