How to improve the accuracy of an LSTM network?
Hello everyone, I hope you're fine and safe
I am working on forecasting time-series data using an LSTM network, but the forecast on the test data is very inaccurate.
Kindly find the attached code with the dataset.
Any suggestions to improve the accuracy please?
Another question: how can I modify this code to forecast a variable based on multiple variables (i.e., multivariate multi-step LSTM models; multiple input, multi-step output)?
Thanks in advance,



3 comments
MatLabMcLovinPotato
11 Nov 2020
Hey Mohamed! The chickenpox time-series example might help, as it does what you're asking about. You could experiment with the training options and layer configuration from it, then adjust to suit your needs once you're able to get past the prediction flatlining (I had the same issue, and going back to this example really helped).
https://au.mathworks.com/help/deeplearning/ug/time-series-forecasting-using-deep-learning.html
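A minimal sketch of that example's univariate setup (assuming a numeric vector `data`; the layer sizes and training options are illustrative, not tuned):

```matlab
% Standardize the series - helps training stability.
mu = mean(data);  sig = std(data);
dataStd = (data - mu) / sig;

% One-step-ahead framing: input is the value at t, target is the value at t+1.
XTrain = dataStd(1:end-1);
YTrain = dataStd(2:end);

layers = [ ...
    sequenceInputLayer(1)
    lstmLayer(128)                  % number of hidden units is a key tuning knob
    fullyConnectedLayer(1)
    regressionLayer];

options = trainingOptions('adam', ...
    'MaxEpochs',250, ...
    'InitialLearnRate',0.005, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',125, ...
    'LearnRateDropFactor',0.2, ...
    'Plots','training-progress');

net = trainNetwork(XTrain,YTrain,layers,options);
```

Remember to un-standardize predictions (`YPred*sig + mu`) before comparing them with the raw test data.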
Sinan Islam
9 May 2021
Unfortunately, MATLAB doesn't have a single example of optimizing an LSTM.
Bhavick Singh
18 Nov 2021
Hi Mohamed, I'm having the same problem as well. Were you able to find a solution?
Answers (2)
Madderla Chiranjeevi
17 Jan 2022
0 votes
Hi, I am also stuck here. Please help me if you have solved this problem; you can also mail me at chiru.madderla@gmail.com.
I would be very thankful for this.
yanqi liu
18 Feb 2022
0 votes
Yes, sir. Maybe use bilstmLayer to replace lstmLayer, and do not use predictAndUpdateState during testing.
Use randperm to shuffle the data before using numTimeStepsTrain to split it.
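A hedged sketch of the suggested layer swap (layer sizes and variable names are illustrative; `net` and `XTest` are assumed to exist from your own training/split code):

```matlab
numFeatures  = 1;                    % illustrative: one input channel
numResponses = 1;                    % illustrative: one output channel

layers = [ ...
    sequenceInputLayer(numFeatures)
    bilstmLayer(128)                 % bidirectional LSTM in place of lstmLayer
    fullyConnectedLayer(numResponses)
    regressionLayer];

% At test time, predict on the whole held-out sequence in one call instead
% of stepping through it with predictAndUpdateState:
YPred = predict(net, XTest);
```

Note that a bidirectional layer reads the whole input sequence in both directions, so this changes what the model can see at each step; whether that is appropriate depends on whether future values are available at prediction time.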
1 comment
omar
7 Aug 2025
Unfortunately, biLSTM only improves the training speed.