Connecting concatenationLayer error
Hello everyone. I have an issue: in the following code, I can't connect the concatenationLayer 'concat' to featureAttention and temporalAttention. Would you please help?
Error Message
Caused by:
Layer 'concat': Unconnected input. Each layer input must be connected to the output of another layer.
numFeatures = size(XTrain, 2);
numClasses = numel(categories(YTrain));

% Feature-Level Attention
featureAttention = [
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
    ];

% Temporal Attention (not used for the Iris dataset, but included for completeness)
temporalAttention = [
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
    ];

% Combine into Hierarchical Attention
hierarchicalAttention = [
    featureInputLayer(numFeatures, 'Name', 'input_features') % Input layer for features
    featureAttention
    temporalAttention
    concatenationLayer(1, 2, 'Name', 'concat') % Concatenate feature and temporal attention outputs
    ];
Accepted Answer
Matt J
on 2 Feb 2025
Edited: Matt J
on 2 Feb 2025
Use connectLayers to make the connections programmatically, or make them manually in deepNetworkDesigner.
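For reference, the basic programmatic pattern looks like this (a minimal sketch with hypothetical layer names 'in' and 'fc', separate from the network in the question):

% Minimal connectLayers sketch (hypothetical layer names)
lgraph = layerGraph();
lgraph = addLayers(lgraph, featureInputLayer(4, 'Name', 'in'));
lgraph = addLayers(lgraph, fullyConnectedLayer(8, 'Name', 'fc'));
lgraph = connectLayers(lgraph, 'in', 'fc');   % connect source 'in' to destination 'fc'

Adding layers one at a time with addLayers leaves them unconnected, so connectLayers can then wire up an arbitrary graph.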
2 comments
Matt J
on 2 Feb 2025
Edited: Matt J
on 3 Feb 2025
% Feature-Level Attention Block (Encapsulated)
featureAttention = networkLayer([
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
    ], 'Name', 'feature_attention_block');

% Temporal Attention Block (Encapsulated)
temporalAttention = networkLayer([
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
    ], 'Name', 'temporal_attention_block');

% Add the layers one at a time so the graph starts with NO connections
% (layerGraph(layers) would connect a layer array in series)
hierarchicalAttention = layerGraph();
hierarchicalAttention = addLayers(hierarchicalAttention, featureInputLayer(numFeatures, 'Name', 'input_features'));
hierarchicalAttention = addLayers(hierarchicalAttention, featureAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, temporalAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, concatenationLayer(1, 2, 'Name', 'concat_attention'));

% Connect the layers (destination names must match the layer names above)
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'feature_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'temporal_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'feature_attention_block', 'concat_attention/in1');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'temporal_attention_block', 'concat_attention/in2');
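Once the connections are made, you can confirm that no input is left unconnected, e.g. (a quick check, not part of the original answer):

plot(hierarchicalAttention)             % visualize the graph and its edges
analyzeNetwork(hierarchicalAttention)   % reports any remaining unconnected inputs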