Size of the input layer is different from the expected input size

24 views (last 30 days)
MR
MR on 17 Jun 2022
Commented: MR on 24 Jun 2022
Hi everyone,
I want to combine a feedforward net with 3 features (3x1) with an RNN with 2 time-varying features (each having 252 observations). Say I want to concatenate both networks into a single feedforward layer. No matter how I specify the dimension in the concatenation layer (4, 3, 2, or 1), I always get the error message in the app designer: "Size of the input layer is different from the expected input size". I also tried adding another feedforward layer after the GRU layer, but nothing worked. The network structure I have set up is as follows:
%Create Layer Graph
lgraph = layerGraph();
%Add Layer Branches
tempLayers = [
    sequenceInputLayer(2,"Name","sequence")
    gruLayer(200,"Name","gru_1")
    gruLayer(200,"Name","gru_2")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
    featureInputLayer(3,"Name","featureinput")
    fullyConnectedLayer(128,"Name","fc_1")
    fullyConnectedLayer(200,"Name","fc_4")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
    concatenationLayer(4,2,"Name","concat")
    fullyConnectedLayer(200,"Name","fc_2")
    fullyConnectedLayer(10,"Name","fc_3")];
lgraph = addLayers(lgraph,tempLayers);
% clean up helper variable
clear tempLayers;
%Connect Layer Branches
%Connect all the branches of the network to create the network graph.
lgraph = connectLayers(lgraph,"gru_2","concat/in2");
lgraph = connectLayers(lgraph,"fc_4","concat/in1");
%Plot Layers
plot(lgraph);
Any comment or feedback is highly appreciated.

Accepted Answer

Ben
Ben on 20 Jun 2022
You're trying to concatenate the output of "gru_2" with the output of "fc_4". However "gru_2" outputs sequence data, and "fc_4" doesn't. There are probably 2 things to try depending on your task:
  1. If your target data is not sequences, you can set the OutputMode of gruLayer to "last" to only output the last hidden state. This should be able to concatenate with the output of "fc_4" along dimension 1.
  2. If your target data are sequences, you could concatenate the output of "fc_4" to each sequence element of the output of "gru_2".
An example of 1:
layers = [
    sequenceInputLayer(2)
    gruLayer(200,OutputMode="last")
    concatenationLayer(1,2,"Name","cat")
    fullyConnectedLayer(10,"Name","output")];
lgraph = layerGraph(layers);
lgraph = lgraph.addLayers([featureInputLayer(3); fullyConnectedLayer(200,"Name","fc")]);
lgraph = lgraph.connectLayers("fc","cat/in2");
% This is fine for dlnetwork. For trainNetwork you will need an output layer:
analyzeNetwork(lgraph,"TargetUsage","dlnetwork");
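For completeness, a minimal sketch of the trainNetwork-compatible variant of the same graph. This is my assumption, not part of the accepted answer: it treats the task as 10-class classification and appends softmax and classification output layers (for regression you would use a regressionLayer instead):

```matlab
% Sketch: trainNetwork variant of example 1 (assumed classification task).
layers = [
    sequenceInputLayer(2)
    gruLayer(200,OutputMode="last")
    concatenationLayer(1,2,"Name","cat")
    fullyConnectedLayer(10,"Name","output")
    softmaxLayer                              % assumed: 10-class classification
    classificationLayer];                     % required output layer for trainNetwork
lgraph = layerGraph(layers);
lgraph = addLayers(lgraph,[featureInputLayer(3); fullyConnectedLayer(200,"Name","fc")]);
lgraph = connectLayers(lgraph,"fc","cat/in2");
analyzeNetwork(lgraph);  % default TargetUsage is "trainNetwork"
```

Note that training this two-input graph with trainNetwork requires a release that supports multiple-input networks there (see the comments below regarding R2021b vs. R2022a).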
For 2. it's a little harder: you need a custom layer to repmat the "fc_4" output over the sequence dimension. The shortest way to do this is probably with functionLayer, as follows:
concatLayer = functionLayer(@concatToSequence,"Formattable",true,"Name","cat");
layers = [
    sequenceInputLayer(2)
    gruLayer(200)
    concatLayer
    fullyConnectedLayer(10,"Name","Output")];
lgraph = layerGraph(layers);
lgraph = lgraph.addLayers([featureInputLayer(3); fullyConnectedLayer(200,"Name","fc")]);
lgraph = lgraph.connectLayers("fc","cat/in2");
analyzeNetwork(lgraph,"TargetUsage","dlnetwork");
function z = concatToSequence(x,y)
% assume x is sequence and y is not
% x is CBT and y is CB
y = repmat(y,[1,1,size(x,3)]);
% apply labels to y - i.e. add the T label.
y = dlarray(y,"CBT");
z = cat(1,x,y);
end
  6 comments
MR
MR on 24 Jun 2022
Then that is why trainNetwork does not work: I am running MATLAB R2021b. Good to know, thanks for the pointer; I could have looked at the release notes earlier.
Now everything works for the dlnetwork approach and both ways for the minibatchqueue work for me. Again, thank you so much Ben for your outstanding help!
MR
MR on 24 Jun 2022
Hi Ben, I just had a chance to check your first approach on R2022a via trainNetwork and subsequently via the Deep Learning App. It worked perfectly fine. Thank you again, I really appreciate your help.


More Answers (0)

