How to fix an LSTM model whose epoch count during training differs from the value set in the code

1 view (last 30 days)
I have constructed this LSTM model for time series analysis, with parameters as follows:
numFeatures = 6;
numHiddenLayers = 100;
numResponses = 1;
Layers = [...
sequenceInputLayer(numFeatures);...
lstmLayer(numHiddenLayers, "OutputMode", "last")
fullyConnectedLayer(numResponses)
reluLayer
regressionLayer
];
miniBatchSize = 24;
Epoch = 50;
options = trainingOptions("adam",...
"ExecutionEnvironment", "auto",...
"MaxEpoch", Epoch, ...
"MiniBatchSize", miniBatchSize,...
"Plots","training-progress");
Now during training, the maximum number of epochs stays fixed at 30, even though I set it to 50 in the code. How can I fix this, and what is the cause?

Answers (1)

Ashutosh on 25 May 2023
Edited: Ashutosh on 25 May 2023
Hi,
I am assuming that you are using the R2023a release of MATLAB. In that case, your way of passing the name-value arguments ("MaxEpoch", Epoch, "MiniBatchSize", miniBatchSize, etc.) may not be valid; instead, you may have to use the Name=Value format. Refer to the MATLAB documentation for trainingOptions, specifically the Name-Value Arguments section, which notes that the comma-separated syntax is the pre-R2021a form. Note also that the option name is MaxEpochs (plural), and that its default value is 30, which is what is being applied in your case. It should work as intended if you specify the options as below:
options = trainingOptions("adam", ...
    ExecutionEnvironment="auto", ...
    MaxEpochs=Epoch, ...
    MiniBatchSize=miniBatchSize, ...
    Plots="training-progress");
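If you want to confirm the fix in isolation, here is a minimal, self-contained sketch. It uses randomly generated dummy sequences rather than your data, and the number of observations (200) and sequence length (50) are arbitrary choices, but it builds the same network and trains it with the corrected options; the training-progress plot should then report 50 epochs rather than the default 30.

% Minimal sketch with dummy data (not the original dataset) to check that
% MaxEpochs is honoured when the options are passed as Name=Value pairs.
numFeatures = 6;
numHiddenLayers = 100;
numResponses = 1;

Layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenLayers, OutputMode="last")
    fullyConnectedLayer(numResponses)
    reluLayer
    regressionLayer
    ];

% 200 dummy sequences of length 50, each paired with a scalar response
numObs = 200;                            % arbitrary number of observations
XTrain = cell(numObs, 1);
for i = 1:numObs
    XTrain{i} = rand(numFeatures, 50);   % numFeatures-by-sequenceLength
end
YTrain = rand(numObs, 1);

options = trainingOptions("adam", ...
    ExecutionEnvironment="auto", ...
    MaxEpochs=50, ...
    MiniBatchSize=24, ...
    Plots="training-progress");

net = trainNetwork(XTrain, YTrain, Layers, options);
% The Epoch counter in the training-progress window should now run to 50.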
Hope this helps!

Categories

More information about Pattern Recognition and Classification in Help Center and File Exchange.

Version

R2023a
