
LSTM Sequence to One Regression

15 views (last 30 days)
juan pedrosa on 26 Oct 2019
Commented: juan pedrosa on 7 Jun 2021
I'm trying to train an LSTM network for sequence-to-one regression, but I'm having problems with my dataset, even though I'm using the definition given by MathWorks here.
My training set is an N-by-1 cell array where N = 2,396,493 and each sequence is an 8-by-22 double.
My response set is an N-by-R matrix where N = 2,396,493 and R = 8.
I'm using a mini-batch size of 300, and when I try to train the network this is the error output:
Error using trainNetwork (line 165)
Unable to perform assignment because the size of the left side is 8-by-300 and the size of the right side is 1-by-300.
I've tried different setups for the response set, transposing it or making it an N-by-1 cell array, with no results. I did train a sequence-to-sequence network, but I think I'll get better results with a sequence-to-one network. Any advice, please?
[EDIT]
It seems that the mini-batch size is the problem (a bug?): if the mini-batch size is set to 1, training begins without issues.
Thank you for your time.
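
For concreteness, a minimal sketch of the setup described above; a small N and random data stand in for the real set, and the layer array is omitted since it was not posted:
N = 1000;                    % Real set: N = 2,396,493.
XTrain = cell(N,1);
for i = 1:N
    XTrain{i} = rand(8,22);  % Each sequence: 8 features by 22 time steps.
end
YTrain = rand(N,8);          % Responses: N-by-R with R = 8.
options = trainingOptions('adam', 'MiniBatchSize', 300);
% net = trainNetwork(XTrain, YTrain, layers, options); % fails with MiniBatchSize > 1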
  1 comment
juan pedrosa on 17 Sep 2020
It has been almost a year and the error still persists. I've lost hope in MATLAB; I'm moving to TensorFlow.


Answers (3)

shubhan vaishnav on 11 Feb 2021
Send the code.
  1 comment
juan pedrosa on 11 Feb 2021
No need, thank you! TensorFlow is awesome!



Josephine Morgenroth on 19 Apr 2021
I'm having the same issue: trying to use the sequence-to-one framework with OutputMode = 'last', with no success. I have a time series dataset with 10 features to predict 3 targets, with a total of 30 sequence/target rows. The code runs fine, but the LSTM predicts the same value for all the sequences! Has anyone seen an example where this structure was successfully used in MATLAB?
  1 comment
Michael Hesse on 11 May 2021
Hi Josephine, I'm working on the same problem. In the last couple of days I figured out that the padding option has a huge impact on training and prediction performance. In particular, in my case the setting 'SequencePaddingDirection', 'left' brought the breakthrough.
Hope it helps, Michael
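For reference, 'SequencePaddingDirection' is a name-value argument of trainingOptions; a minimal sketch of the setting Michael describes, with placeholder solver and batch size:
options = trainingOptions('adam', ...
    'MiniBatchSize', 32, ...              % placeholder value
    'SequenceLength', 'longest', ...      % pad every sequence in a batch to the longest one
    'SequencePaddingDirection', 'left');  % pad at the start, per Michael's comment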



Niccolò Dal Santo on 7 Jun 2021
Hi Juan,
It would be very helpful if you could share the architecture you want to train, since this error might be caused by a mismatch between the output size of the network and the ground-truth responses. To make sure that the network outputs the expected size (8 in your case), you can use the analyzeNetwork function to check the output of the network:
R = 8;
numFirstLstmHiddenUnits = 100;
numSecondLstmHiddenUnits = R;
layers = [sequenceInputLayer(R)
    lstmLayer(numFirstLstmHiddenUnits, 'OutputMode', 'sequence')
    lstmLayer(numSecondLstmHiddenUnits, 'OutputMode', 'last')
    regressionLayer];
% Shows that the activations of the second lstmLayer, which are the
% output size of the network, equal 8.
analyzeNetwork(layers)
In general, you should set the output size of the layer immediately preceding the regressionLayer to the expected size of the responses.
For example, if the layer preceding the regressionLayer is an lstmLayer, you should set its number of hidden units to the expected output size. The following piece of code trains a recurrent regression network with two LSTM layers. The first one has an arbitrary number of hidden units (set to 100); the second LSTM layer immediately precedes the regressionLayer, hence its number of hidden units is set to R = 8, which is the size of each output observation:
N = 1000; % Number of observations.
X = cell(1,N);
R = 8; % Input and output size.
seqLength = 22; % Sequence length.
% Create training dataset inputs.
for i = 1:N
    X{i} = rand(R,seqLength);
end
% Create training dataset responses.
Y = rand(N,R);
% The first LSTM layer can have an arbitrary number of units.
numFirstLstmHiddenUnits = 100;
% Define the hidden size of the second LSTM layer
% as the number of responses for each observation,
% since this LSTM layer is directly followed by a regressionLayer
% and that is our target output size.
numSecondLstmHiddenUnits = R;
layers = [sequenceInputLayer(R)
    lstmLayer(numFirstLstmHiddenUnits, 'OutputMode', 'sequence')
    lstmLayer(numSecondLstmHiddenUnits, 'OutputMode', 'last')
    regressionLayer];
options = trainingOptions('sgdm', ...
    'MaxEpochs', 2, ...
    'MiniBatchSize', 300, ...
    'ExecutionEnvironment', 'cpu');
% Train the network.
net = trainNetwork(X,Y,layers,options);
As an alternative, one can use a fullyConnectedLayer as the layer immediately preceding the regressionLayer, with as many neurons as the expected output size of the network. This is shown, for instance, in this example for time series forecasting. In this case the previous example would be as follows:
N = 1000; % Number of observations.
X = cell(1,N);
R = 8;
seqLength = 22;
% Create training dataset inputs.
for i = 1:N
    X{i} = rand(R,seqLength);
end
% Create training dataset responses.
Y = rand(N,R);
% Arbitrarily define the number of units for the LSTM layers.
numLstmHiddenUnits = 100;
% Set the number of neurons of the fullyConnectedLayer preceding
% the regressionLayer to the expected output size.
numFullyConnectedNeurons = R;
layers = [sequenceInputLayer(R)
    lstmLayer(numLstmHiddenUnits, 'OutputMode', 'sequence')
    lstmLayer(numLstmHiddenUnits, 'OutputMode', 'last')
    fullyConnectedLayer(numFullyConnectedNeurons)
    regressionLayer];
options = trainingOptions('sgdm', ...
    'MaxEpochs', 2, ...
    'MiniBatchSize', 300, ...
    'ExecutionEnvironment', 'cpu');
% Train the network.
net = trainNetwork(X,Y,layers,options);
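A quick sanity check after training (assuming the network above): calling predict on the cell-array input should return one row of R values per observation.
% Sanity check: one 1-by-R prediction per input sequence.
YPred = predict(net, X);
size(YPred)   % expected: 1000-by-8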
Thanks,
Niccolò
  1 comment
juan pedrosa on 7 Jun 2021
No need, thank you! TensorFlow is awesome!


Version: R2019a
