Are there any options to resize/replicate the matrices/vectors between layers of a deep network?
In a deep learning network, I have two branches operating on the same input layer. In one branch I have a fully connected layer whose output is 1x1xN. In the other branch I have a convolutional layer that produces a two-dimensional feature map, say PxQxS. To proceed with further convolutions after combining them, I need to concatenate the outputs of these branches by repeating the N-dimensional vector to form PxQxN, so that I end up with a PxQx(N+S) array. Is there any way to replicate a vector into a matrix between deep network layers, analogous to the repmat() function?
In other words, is there any way to concatenate two layers of different width and height by bringing them to a common size inside a deep network?
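For concreteness, a rough layerGraph sketch of the intended structure (sizes and layer names are placeholders):

```matlab
% Rough sketch of the two branches with placeholder sizes
% (P = Q = 14, S = 8, N = 16). The missing piece is a layer that
% replicates the 1x1xN fully connected output to PxQxN so that it
% can be depth-concatenated with the PxQxS convolutional output.
P = 14; Q = 14; S = 8; N = 16;

lgraph = layerGraph(imageInputLayer([P Q 3], 'Name', 'input'));

% Convolutional branch: PxQxS output.
lgraph = addLayers(lgraph, convolution2dLayer(3, S, ...
    'Padding', 'same', 'Name', 'conv_branch'));

% Fully connected branch: 1x1xN output.
lgraph = addLayers(lgraph, fullyConnectedLayer(N, 'Name', 'fc_branch'));

lgraph = addLayers(lgraph, depthConcatenationLayer(2, 'Name', 'concat'));

lgraph = connectLayers(lgraph, 'input', 'conv_branch');
lgraph = connectLayers(lgraph, 'input', 'fc_branch');
lgraph = connectLayers(lgraph, 'conv_branch', 'concat/in1');
% 'fc_branch' cannot be connected to 'concat/in2' directly, because its
% output is only 1x1xN; a repmat-like layer is needed in between.
```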
Answers (1)
Delprat Sebastien
25 Jan 2020
I wrote a custom reshape layer for that purpose. Read the custom layer documentation; it is very simple. There is, however, one big limitation: custom layers cannot change the dlarray format. That means you need a convolution layer between the fully connected layer (whose output format is SB) and your reshape layer. The convolution layer outputs SSCB data, which you can then reshape.
Source: MathWorks Support
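Along those lines, a minimal sketch of such a custom layer (the class name, properties, and target size are hypothetical; depending on your release you may also need to implement a backward method):

```matlab
classdef replicateToSpatialLayer < nnet.layer.Layer
    % Custom layer that replicates a 1x1xCxN activation to HxWxCxN,
    % analogous to repmat(), so it can be depth-concatenated with a
    % convolutional feature map of the same spatial size.
    properties
        OutputHeight
        OutputWidth
    end
    methods
        function layer = replicateToSpatialLayer(h, w, name)
            layer.OutputHeight = h;
            layer.OutputWidth  = w;
            layer.Name         = name;
            layer.Description  = "Replicate 1x1xC input spatially to " ...
                + h + "x" + w + "xC";
        end
        function Z = predict(layer, X)
            % X is expected as 1x1xCxN (e.g. the output of a 1x1
            % convolution placed after the fully connected layer, as
            % suggested above). Replicate along the spatial dimensions.
            Z = repmat(X, [layer.OutputHeight, layer.OutputWidth, 1, 1]);
        end
    end
end
```

The layer would then sit between the 1x1 convolution and a depthConcatenationLayer, e.g. `addLayers(lgraph, replicateToSpatialLayer(P, Q, 'replicate'))`.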