Are there any examples of using Transformers for Time Series?

64 views (last 30 days)
Good afternoon All,
I'm researching how to develop/train/use Transformers (similar to GPT) on S&P 500 historical data as an example. Transformers seem to be used mainly for NLP, but there are articles out there that apply them to time series. Can anyone point me in the right direction for getting started with this in MATLAB? I have done general things like convolutions, LSTMs, etc., but nothing at the transformer level with the DL Toolbox.
I found this MATLAB package (pasted below) to be somewhat useful, but it seems to lean more toward language than toward historical data points.
Another side question: is it possible to use the Deep Network Designer app to lay out a transformer architecture?
Any tip is appreciated.
Thank you,
Yeray

Accepted Answer

Philip Brown
Philip Brown on 30 May 2024
Update in 2024: there's a GitHub repo / File Exchange submission and a blog post covering the use of transformers for time series data, which I think closely matches your request.
Deep Network Designer supports transformer layers; more details in this answer.

More Answers (1)

Sandeep
Sandeep on 31 Aug 2023
Edited: Sandeep on 31 Aug 2023
Hi Yeray,
Developing and training Transformers for time series data, such as S&P 500 historical data, is indeed possible. While Transformers are widely used in NLP tasks, they can also be applied to other domains, including time series analysis.
To get started, here are a few tips (a rough MATLAB sketch of steps 3-5 follows the list):
  1. Familiarize yourself with the basics of Transformers and their architecture. Understanding concepts like self-attention, multi-head attention, and positional encoding will be helpful.
  2. Explore the MATLAB package you found, Transformer Models. Although it is more focused on language tasks, you can still adapt it for time series analysis.
  3. Preprocess your S&P 500 historical data into a suitable format for training the Transformer model. You may need to consider factors such as input window size, target prediction horizon, and normalization.
  4. Train the Transformer model on your preprocessed data, tuning hyperparameters such as the learning rate, batch size, and number of layers to achieve good performance.
  5. Evaluate the trained model's performance on validation or test data using appropriate metrics for time series analysis, such as mean squared error (MSE) or mean absolute error (MAE).
  6. Remember that developing and training Transformers for time series analysis may require some experimentation and fine-tuning to achieve the desired results.
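If it helps, here is a rough, untested sketch in MATLAB of steps 3-5 for one-step-ahead forecasting of a univariate series (daily S&P 500 closes in your case). It assumes a fairly recent Deep Learning Toolbox release (roughly R2024a or newer, since it uses selfAttentionLayer, positionEmbeddingLayer, trainnet, and minibatchpredict), and the window length, layer sizes, and training options are placeholder values you would tune:
% Toy data standing in for the real series; replace with your S&P 500 closes.
prices = cumsum(randn(2000,1)) + 100;

% Step 3: normalize and slice into input windows / next-day targets.
% (For brevity the z-score uses the whole series; in practice compute the
% statistics on the training split only.)
mu = mean(prices);  sigma = std(prices);
z  = (prices - mu) / sigma;

winLen = 60;                               % 60 past days per sample
numObs = numel(z) - winLen;
X = cell(numObs,1);                        % each cell: 1-by-winLen (channels-by-time)
Y = zeros(numObs,1);
for i = 1:numObs
    X{i} = z(i:i+winLen-1)';
    Y(i) = z(i+winLen);                    % one-step-ahead target
end

% Chronological split: last 20% held out for evaluation.
nTrain = floor(0.8*numObs);
XTrain = X(1:nTrain);      YTrain = Y(1:nTrain);
XTest  = X(nTrain+1:end);  YTest  = Y(nTrain+1:end);

% Steps 1-2: a small transformer-style encoder with learned position embeddings.
numHeads  = 4;
keyDim    = 32;
hiddenDim = 64;
layers = [
    sequenceInputLayer(1, Name="in")
    fullyConnectedLayer(hiddenDim, Name="embed")     % per-time-step projection
    positionEmbeddingLayer(hiddenDim, winLen, Name="pos")
    additionLayer(2, Name="add")                     % token + position embeddings
    selfAttentionLayer(numHeads, keyDim)
    layerNormalizationLayer
    fullyConnectedLayer(hiddenDim)
    reluLayer
    globalAveragePooling1dLayer                      % pool over the time dimension
    fullyConnectedLayer(1)];                         % next-day forecast
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph, "embed", "add/in2");  % sum data and position embeddings
net = dlnetwork(lgraph);

% Step 4: train with an MSE loss.
options = trainingOptions("adam", ...
    MaxEpochs=30, MiniBatchSize=64, InitialLearnRate=1e-3, ...
    Shuffle="every-epoch", Verbose=false, Plots="training-progress");
net = trainnet(XTrain, YTrain, net, "mse", options);

% Step 5: evaluate on the held-out window.
YPred   = minibatchpredict(net, XTest);
mseTest = mean((YPred - YTest).^2)
maeTest = mean(abs(YPred - YTest))
As a side note, you can open the architecture in Deep Network Designer with deepNetworkDesigner(lgraph) to inspect or edit it interactively.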
Hope you find it helpful. Good luck with your research!

Categories

Find more on Sequence and Numeric Feature Data Workflows in Help Center and File Exchange.

Release

R2022b
