A Suitable Machine Learning Technique to Learn Y=f(X,t)

2 views (last 30 days)
H R on 6 May 2020
Answered: Ameer Hamza on 6 May 2020
I have a black-box model that accepts an input vector X (variables) and gives three outputs as functions of time: Y1(t), Y2(t) and Y3(t). Here "t" is discrete time with a known number of time steps. The model is a simulator that predicts the output quantities over time, so Y1(t1), Y1(t2), ... are not independent. Y1(t), Y2(t) and Y3(t) may also be related to each other, but for now we can ignore that.
I have several instances (samples) of X with their corresponding outputs. Which machine learning technique can handle this learning problem and relate X with Y(t)?
I am a bit confused because I have always seen machine learning algorithms relate X to a single output Y that is not time dependent. On the other hand, time-series prediction methods only look at Y = f(t) and ignore X (i.e. the input is t and the output is Y).
Any suggestion of a specific method is highly appreciated.

Accepted Answer

Ameer Hamza on 6 May 2020
Yes, common (feedforward) neural networks are not well suited to time-series data. You can use them by defining multiple inputs (say n), each corresponding to a time-step value t(n), t(n-1), t(n-2), ..., t(1), but that is not a commonly used approach.
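As a rough illustration of that lagged-input idea (a sketch only: the data below is synthetic, and the window length, feature count and network size are assumptions, not values from your problem):
% Sketch: feedforward network on lagged inputs (Deep Learning Toolbox).
% Each training column is [X; Y(t-1); Y(t-2)] and the target is Y(t).
numSamples = 500;
X      = rand(4, numSamples);          % 4 static input variables (assumed size)
Ylag1  = rand(1, numSamples);          % stand-in for Y at t-1
Ylag2  = rand(1, numSamples);          % stand-in for Y at t-2
target = rand(1, numSamples);          % stand-in for Y at t

inputs = [X; Ylag1; Ylag2];            % stack static and lagged features
net = feedforwardnet(10);              % one hidden layer with 10 neurons
net = train(net, inputs, target);      % default Levenberg-Marquardt training
Ypred = net(inputs);                   % predictions of Y(t)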
For time-dependent data, LSTM networks are the main choice; see the MATLAB examples in the documentation.
You can also look at recurrent neural networks in general, of which LSTM is a special case.
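A minimal sequence-to-sequence regression sketch along those lines, assuming the static vector X is simply repeated at every time step so that each sample becomes a sequence (the sizes, layer choices and training options below are illustrative assumptions, not the only reasonable setup):
% Sketch: LSTM sequence-to-sequence regression (Deep Learning Toolbox).
% Assumption: each static input vector X is repeated at every time step, so
% sample i is a [numFeatures x numTimeSteps] sequence whose target is the
% [3 x numTimeSteps] output [Y1(t); Y2(t); Y3(t)].
numSamples   = 200;
numFeatures  = 4;        % length of the input vector X (assumed)
numTimeSteps = 50;       % known number of time steps (assumed)
numResponses = 3;        % Y1, Y2, Y3

XTrain = cell(numSamples, 1);
YTrain = cell(numSamples, 1);
for i = 1:numSamples
    x = rand(numFeatures, 1);                       % one sample of X
    XTrain{i} = repmat(x, 1, numTimeSteps);         % repeat X over the time steps
    YTrain{i} = rand(numResponses, numTimeSteps);   % stand-in for simulator output
end

layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(100, 'OutputMode', 'sequence')        % 100 hidden units (assumed)
    fullyConnectedLayer(numResponses)
    regressionLayer];

options = trainingOptions('adam', ...
    'MaxEpochs', 100, ...
    'InitialLearnRate', 0.005, ...
    'Plots', 'training-progress');

net = trainNetwork(XTrain, YTrain, layers, options);
YPred = predict(net, XTrain(1));    % predicted Y1..Y3 over time for sample 1
In practice you would replace the random data with your simulator samples; the rest of the workflow (cell arrays of sequences, sequence output mode, regression layer) stays the same.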

More Answers (0)
