Dynamic (Recurrent) Neural Network - Input Data Structure - Concurrent Inputs
Hi, I am using a dynamic neural network (in fact, a recurrent neural network). My input data consists of several sequences (time series). The MATLAB Help says that if we have multiple sequences, we should use a cell array whose elements are matrices. For example, this is my input data to the network: {[I1] [I2] ... [In]}, where every Ii (1<=i<=n) is a matrix of size 20x500. 20 is the input vector size of the network, and, as the Help says, I1 contains the values of all the sequences at time step 1, I2 at time step 2, and so on. Since the sequences are not all the same length, some entries have the value NaN (mostly toward the end of the cell array). My network configuration is input-hidden-output layers, with the hidden layer recurrently connected to itself.
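For concreteness, a minimal sketch of the layout described above (the number of time steps `n` and the random data are assumptions for illustration; only the 20x500 cell-element size comes from the description):

```matlab
% Sketch of the input layout described above (illustrative data).
n = 100;                      % number of time steps (assumed value)
inputSeq = cell(1, n);        % one cell element per time step
for t = 1:n
    % 20 input elements x 500 concurrent sequences at time step t;
    % in the real data, shorter sequences are padded with NaN.
    inputSeq{t} = rand(20, 500);
end
```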
Here is the problem: when I train the network with the input set explained above, it takes a very long time (after 2-3 hours, maybe one epoch or less!).
Where is the problem? Am I doing something wrong, or ...? Please answer. Thank you in advance!
Answers (3)
Lucas García
on 26 Aug 2011
What are your inputs? What is the architecture of your network?
By doing {[I1] [I2] ... [In]}, where each Ii is a 20x500 matrix, you are presenting 500 concurrent sequences to the network, where each sequence consists of n time steps of 20 elements each. That is a huge problem to solve...
I am wondering whether, if you have 20 time series as inputs to the network, your input should instead be {[I1] [I2] ... [In]}, with each [Ii] being a 20x1 vector containing one element of each time series at time i.
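A sketch of the arrangement suggested here, assuming 20 time series of length n (the data and variable names are illustrative; `con2seq` is the Neural Network Toolbox helper for this conversion):

```matlab
% 20 time series stored as rows of a 20xn matrix (illustrative data)
n = 500;
series = rand(20, n);
% con2seq converts the matrix into a 1xn cell array, where each
% cell inputSeq{i} is a 20x1 vector: one sample of every series at time i
inputSeq = con2seq(series);
```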
Lucas García
on 27 Aug 2011
Then I guess you are solving a big problem here. Recurrent networks take much longer to train than static feedforward networks. In your case, the architecture of the network is not too big, but your input to the problem is. It is a good idea to try different training algorithms and see how the network behaves. For example, in the case of Levenberg-Marquardt backpropagation (trainlm), MATLAB has to invert a huge matrix. That algorithm is considered a good choice for medium-size problems; in your case, however, it has to be done 500 times per time step. So I would try different algorithms: conjugate gradient methods are generally much faster, e.g. traincgb, trainscg, etc.
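A sketch of how one might compare training functions on a layer-recurrent network (a minimal example under assumed toy data; `layrecnet`, `train`, and the training-function names are from the Neural Network Toolbox):

```matlab
% Illustrative comparison of training algorithms (dummy data, assumed sizes)
X = con2seq(rand(20, 200));   % input sequence: 20 elements per time step
T = con2seq(rand(1, 200));    % target sequence

for fcn = {'trainlm', 'trainscg', 'traincgb'}
    net = layrecnet(1, 10);          % fresh recurrent net, 10 hidden neurons
    net.trainFcn = fcn{1};           % select the training algorithm
    net.trainParam.epochs = 50;      % keep the comparison short
    [net, tr] = train(net, X, T);    % tr records per-epoch performance
end
```

Re-creating the network inside the loop ensures each algorithm starts from fresh initial weights, so the timing comparison is fair.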