LSTM (more input time steps than hidden units): how does MATLAB handle this?

18 views (last 30 days)
Hello,
I'm wondering what happens if an input sequence has more time steps than the LSTM layer has hidden units.
For example below: what happens if my input sequences have 5 time steps but the network has only 2 hidden units in its LSTM layer? How can the system learn?
Does it just feed in the first two time steps? What happens to the last three?
Code example:
numfeatures = 3; % example: 3 features per time step
layers = [ ...
sequenceInputLayer(numfeatures)
lstmLayer(2,'OutputMode','last')
fullyConnectedLayer(1)
regressionLayer];
Can someone help?
Thanks.

Accepted Answer

Asvin Kumar on 19 May 2020
You’ve shown in your diagram that an LSTM unrolls with each cell connected to the next (except the last cell, of course). The connection between two cells carries forward a vector of data, and the length of that vector is determined by the ‘NumHiddenUnits’ property passed when calling lstmLayer. As mentioned here, the number of hidden units has nothing directly to do with the sequence length of the data. An LSTM unrolls to the length of the input signal, however long that is. People choose a hidden-unit size large enough to capture the information in the typical sequence length for their use case. Have a look at this example: there’s no mention of the sequence length whatsoever. This is a common misconception; hope that clarifies it.
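To make this concrete, here is a minimal sketch (assuming the Deep Learning Toolbox and, as an illustrative assumption, 3 features per time step): the network from the question, with only 2 hidden units, trains on a sequence that is 5 time steps long without any issue, because the layer simply unrolls over all 5 steps while carrying a length-2 hidden-state vector between them.

numfeatures = 3;                        % example: 3 features per time step
layers = [ ...
    sequenceInputLayer(numfeatures)
    lstmLayer(2,'OutputMode','last')    % hidden state is a vector of length 2
    fullyConnectedLayer(1)
    regressionLayer];

XTrain = {rand(numfeatures,5)};         % one sequence: 3 features x 5 time steps
YTrain = 1;                             % one scalar regression target
options = trainingOptions('adam','MaxEpochs',5,'Verbose',false);
net = trainNetwork(XTrain,YTrain,layers,options);   % unrolls over all 5 steps

The trained network would also accept sequences of other lengths at prediction time, which illustrates that the sequence length and ‘NumHiddenUnits’ are independent of each other.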

More Answers (0)

Categories

Find more on Sequence and Numeric Feature Data Workflows in Help Center and File Exchange.

Products


Release

R2020a
