how can I make a deep learning custom layer to take multiple inputs like addition/concatenation?

2 views (last 30 days)
I am implementing an attention mechanism for image segmentation. One step in attention is to take the dot product of the attention coefficients and the input feature map; however, the MATLAB Neural Network Toolbox does not support such an operation yet. What I tried was to modify the "Addition" layer and the relevant functions into a new class/functions that handle the dot product of two inputs, and I added those functions to the path. It kind of works: it generates an "attention layer" that can take two inputs. However, when I train the network, I get the following error:
Error using nnet.internal.cnn.analyzer.NetworkAnalyzer/propagateSizes (line 223): Index exceeds array bounds.
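For reference, in releases from R2019a onward, custom layers can declare multiple inputs directly by setting `NumInputs`, which avoids patching the internal Addition classes. A minimal sketch of an element-wise multiplication (attention gating) layer might look like this; the class name is illustrative:

```matlab
% Hypothetical custom layer that multiplies two inputs element-wise.
% Requires R2019a or later, where custom layers support multiple inputs.
classdef elementwiseMultiplicationLayer < nnet.layer.Layer
    methods
        function layer = elementwiseMultiplicationLayer(name)
            layer.Name = name;
            layer.NumInputs = 2;  % declare two inputs; wire them with connectLayers
            layer.Description = "Element-wise multiplication of two inputs";
        end

        function Z = predict(~, X1, X2)
            % Hadamard product of attention coefficients and feature map
            Z = X1 .* X2;
        end
    end
end
```

In a layerGraph, the second input is then connected explicitly, e.g. `connectLayers(lgraph, 'attn', 'mult/in2')`.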
  2 comments
Markus Wagner on 2 Jan 2019
I have the same problem. I want to implement a custom hidden layer and a custom regression layer with two inputs, like the addition/concatenation layer, to build up a VAE network.
After adapting the Addition layer class and using it in a layerGraph, it failed with an error:
Error using nnet.internal.cnn.util.validateLayersForLayerGraph (line 20)
Expected input to be one of these types:
nnet.cnn.layer.Layer
Instead its type was latentLayer.
Hanxx Sakura on 17 Jan 2019
Edited: Hanxx Sakura on 17 Jan 2019
I think there is no official way to implement a module with multiple inputs, but I finally accomplished it, inspired by Addition.m and AdditionLayer.m, as follows:
  1. Implement an internal layer (e.g., myAdd) like the "Addition" class: define the variables and the forward/backward functions.
  2. Since the internal layer cannot be used in a layerGraph directly, wrap it in an external class (e.g., myAddLayer), as in "AdditionLayer.m".
  3. Create an object of your module by calling myAddLayer().
hope this can help~
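The wrapping approach above relies on undocumented internal classes. In later releases (R2019a and onward), a documented custom layer with `NumInputs = 2` can be dropped into a layerGraph directly. A hypothetical sketch of wiring such a two-input layer into a graph (here `myAddLayer` is assumed to be a user-defined `nnet.layer.Layer` subclass with two inputs; all layer names are illustrative):

```matlab
% Assumes myAddLayer is a custom nnet.layer.Layer subclass with NumInputs = 2.
layers = [
    imageInputLayer([28 28 1], 'Name', 'input')
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv1')
    reluLayer('Name', 'relu1')
    myAddLayer('add')        % two-input custom layer; relu1 feeds add/in1
];
lgraph = layerGraph(layers);

% Add a parallel branch and connect it to the layer's second input.
lgraph = addLayers(lgraph, ...
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv2'));
lgraph = connectLayers(lgraph, 'relu1', 'conv2');
lgraph = connectLayers(lgraph, 'conv2', 'add/in2');
```

The key point is that each input of a multi-input layer is addressed as `'layerName/inN'` when calling `connectLayers`.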


Answers (1)

Maksym Tymchenko on 10 Mar 2023
Since release R2022b, MATLAB directly supports the attention mechanism that you are trying to implement.
Check the documentation of the attention function.
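As a quick illustration, the dlarray-based `attention` function introduced in R2022b computes scaled dot-product attention directly; a minimal sketch, with all sizes chosen for illustration only:

```matlab
% Minimal sketch of the R2022b attention function (sizes are illustrative).
numChannels = 8;        % must be divisible by numHeads
numObservations = 1;
sequenceLength = 10;
numHeads = 2;

% Queries, keys, and values as formatted dlarrays: channel x batch x time.
Q = dlarray(rand(numChannels, numObservations, sequenceLength), 'CBT');
K = dlarray(rand(numChannels, numObservations, sequenceLength), 'CBT');
V = dlarray(rand(numChannels, numObservations, sequenceLength), 'CBT');

% Multihead scaled dot-product attention.
Y = attention(Q, K, V, numHeads);
```

This removes the need for a hand-written dot-product layer in attention-based segmentation networks.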
