Building a feedback loop in a deep neural network
Simon Hagmeyer
on 26 Sep 2022
Commented: Simon Hagmeyer on 19 Sep 2023
Dear Ladies and Gentlemen,
in MATLAB I use the deep learning functionality that lets you build a neural network from different types of layers, e.g. in conjunction with a custom output layer.
What I'm trying to do is implement a feedback loop in the manner of a classical recurrent neural network. Unfortunately, I can only find the gated recurrent layers GRU and LSTM in the list of deep learning layers (https://de.mathworks.com/help/deeplearning/ug/list-of-deep-learning-layers.html).
Ideally, I would like to span a feedback loop not only within one layer but across two layers. I have tried to build such a feedback connection using the concatenationLayer in the Deep Network Designer, but a delay block is missing for this, so the Designer's analyze function already flags the graph as invalid.
Hence my question: apart from the GRU and LSTM layers, is there a way to build a feedback loop using MATLAB's deep learning functionality?
0 comments
Accepted Answer
Aditya
on 30 Aug 2023
Hi Simon,
I understand that you are looking for alternative ways of building feedback loops.
Yes, in addition to the GRU and LSTM layers, you can build a feedback loop using MATLAB's Deep Learning Toolbox. Note, however, that the toolbox does not ship a plain (Elman-style) recurrent layer, and names such as sruLayer or gfuLayer are not built-in functions. What the toolbox does support is defining your own recurrent layer: a custom layer class can declare state parameters that are fed back from one time step to the next, which is exactly the feedback loop you are after (see "Define Custom Recurrent Deep Learning Layer" in the documentation).
Such a custom layer then drops into a layer array like any built-in layer. In the example below, simpleRecurrentLayer stands for a hypothetical custom layer of this kind, sketched further down:
layers = [
    sequenceInputLayer(inputSize)
    simpleRecurrentLayer(inputSize, hiddenSize) % hypothetical custom layer, sketched below
    fullyConnectedLayer(outputSize)
    softmaxLayer
    classificationLayer];
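For concreteness, here is a minimal sketch of what such a custom layer could look like. The class name simpleRecurrentLayer, the two-argument constructor, and the weight initialization are illustrative choices of mine, not toolbox API; the state mechanism itself (a properties (State) block, predict returning the updated state, and a resetState method) follows the documented custom recurrent layer pattern:
classdef simpleRecurrentLayer < nnet.layer.Layer & nnet.layer.Formattable
    % Hypothetical Elman-style recurrent layer (a sketch, not a toolbox class):
    % h_t = tanh(Wx*x_t + Wh*h_(t-1) + b). The hidden state h is fed back
    % from one time step to the next, which is the feedback loop in question.

    properties
        NumHiddenUnits
    end

    properties (Learnable)
        InputWeights      % Wx: NumHiddenUnits-by-inputSize
        RecurrentWeights  % Wh: NumHiddenUnits-by-NumHiddenUnits
        Bias              % b:  NumHiddenUnits-by-1
    end

    properties (State)
        HiddenState       % carried across calls until resetState is called
    end

    methods
        function layer = simpleRecurrentLayer(inputSize, numHiddenUnits)
            % Sizes are fixed at construction to keep the sketch short.
            layer.NumHiddenUnits = numHiddenUnits;
            layer.InputWeights = 0.01*randn(numHiddenUnits, inputSize);
            layer.RecurrentWeights = 0.01*randn(numHiddenUnits, numHiddenUnits);
            layer.Bias = zeros(numHiddenUnits, 1);
            layer.HiddenState = zeros(numHiddenUnits, 1);
        end

        function [Z, hiddenState] = predict(layer, X)
            % X is a formatted dlarray with dimensions "CBT".
            miniBatchSize = size(X, finddim(X, "B"));
            numTimeSteps = size(X, finddim(X, "T"));

            hiddenState = layer.HiddenState;  % expands over the batch below
            Z = zeros(layer.NumHiddenUnits, miniBatchSize, numTimeSteps, "like", X);

            for t = 1:numTimeSteps
                Xt = stripdims(X(:, :, t));   % C-by-B slice at time step t
                hiddenState = tanh(layer.InputWeights*Xt ...
                    + layer.RecurrentWeights*hiddenState + layer.Bias);
                Z(:, :, t) = hiddenState;     % output and feedback share h_t
            end

            Z = dlarray(Z, "CBT");
        end

        function layer = resetState(layer)
            layer.HiddenState = zeros(layer.NumHiddenUnits, 1);
        end
    end
end
Regarding feedback across two layers: the layer graph itself must stay acyclic, so one way to emulate that is to put both processing stages inside one such custom layer and keep the intermediate result in an additional state property.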
Other recurrent variants, such as a gated feedback unit, can be implemented in exactly the same way: only the state update inside the custom layer changes, while the surrounding layer array stays the same as above.
A custom recurrent layer of this kind lets you model temporal dependencies and create feedback loops within your network architecture, and you can experiment with different state updates and architectures to find the best model for your specific task.
It's worth noting that the built-in recurrent layers of the Deep Learning Toolbox remain lstmLayer, gruLayer, and bilstmLayer; anything beyond those goes through the custom layer mechanism described above.
Remember to adjust the inputSize, hiddenSize, and outputSize parameters according to the dimensions of your data and the desired network architecture.
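If you want to sanity-check a layer like this before training, one option is to push a formatted dlarray through it directly. The sizes below are placeholders:
% Smoke test with placeholder sizes: 12 channels, batch of 8, 10 time steps.
layer = simpleRecurrentLayer(12, 20);
X = dlarray(randn(12, 8, 10), "CBT");
[Z, h] = predict(layer, X);
size(Z)   % 20-by-8-by-10: one hidden vector per time step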
More Answers (0)