Combining Feature and Sequence Data in Datastores
Hello, I am trying to use deep learning to classify events output from an image processing algorithm my lab uses. I do not necessarily want to use an image processing network: I'm a novice at DL, and the image sequences we receive are movies of 512 pixels x 512 pixels x 3 channels x 302 time points, containing hundreds to thousands of small events I want to analyze. So I opted to apply deep learning to the output of the processing algorithm instead.
My data comes back as a single value through time for each channel (3x302), and that is loaded as the sequence input. I want to use an LSTM to analyze these in tandem. To relate them to the feature data that the image processing algorithm outputs (holistic measurements that correspond to the event studied across all 3 channels), I get a 1x13 vector used as the feature input. To reduce dimensionality so the two can be related, I used the 'last' option for the LSTM's OutputMode so that the concatenation works, and I set the concatenation dimension to 1. MATLAB's Deep Learning Network Analyzer detects no issues in the network (see below).

To load multiple inputs into the network, I know I will need to use datastores. Since the data are output by the image processing algorithm and are already in the workspace after some pre-processing, I opted to use arrayDatastore objects. I have provided example datasets with random numbers below.
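A sketch of how such datastores might be assembled (random data; the variable names and N are made up, and the sizes follow the description above, so treat this as an assumption rather than my exact code):

```matlab
% Hypothetical example data: N observations, each with a 3x302 sequence,
% a 1x13 feature vector, and one of 3 class labels (fc_1 has 3 outputs).
N = 100;
seqCell = arrayfun(@(~) rand(3, 302), (1:N)', 'UniformOutput', false);
featMat = rand(N, 13);
labels  = categorical(randi(3, N, 1));

% 'OutputType','same' makes each read return one sequence matrix in a cell
% directly, rather than wrapping it in an extra layer of cell array.
dsSeq   = arrayDatastore(seqCell, 'OutputType', 'same');
dsFeat  = arrayDatastore(featMat);
dsLabel = arrayDatastore(labels);

% Each read of the combined datastore should yield a 1x3 cell row:
% {sequence, features, label}, matching the two network inputs plus response.
dsTrain = combine(dsSeq, dsFeat, dsLabel);
preview(dsTrain)
```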
When I try to train my network using the trainNetwork function, I get the following error:
"Error using trainNetwork
Error during read from datastore.
Caused by:
Error using horzcat
Dimensions of arrays being concatenated are not consistent.
data = horzcat(data{:});"
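For context on where this fails: as far as I understand, trainNetwork horzcats each read of the combined datastore into one row, so every read must return a consistently shaped 1-by-3 cell {sequence, features, response}. A quick way to inspect that (a sketch, assuming the combined datastore is named dsTrain):

```matlab
% Hypothetical check on a combined datastore named dsTrain (assumed name).
% Each read should be a 1x3 cell: {3x302 sequence, 1x13 features, label}.
sample = preview(dsTrain);
size(sample)                                    % expect [1 3]
cellfun(@class, sample, 'UniformOutput', false) % spot extra cell nesting
```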
I can train a network on the sequence data and on the feature data individually, which makes me think the issue is with the datastores. I thought the problem might be the sequence data loading, so I read that padding might be required and that a transform on the sequence datastore might be needed. However, if I call readall after performing the following transform, I get this error:
transformed_datastore_sequence = transform(datastore_sequence, ...
    @(x) padsequences(x, 2, 'Direction', 'both', 'Length', 20, ...
    'PaddingValue', 'symmetric', 'UniformOutput', false));
readall(transformed_datastore_sequence)
Invalid transform function defined on datastore.
The cause of the error was:
Error using padsequences
Input sequences must be numeric or categorical arrays.
Error in @(x)padsequences(x,2,'Direction','both','length',20,'PaddingValue','symmetric','UniformOutput',false)
Error in matlab.io.datastore.TransformedDatastore/applyTransforms (line 723)
data = ds.Transforms{ii}(data);
Error in matlab.io.datastore.TransformedDatastore/read (line 235)
[data, info] = ds.applyTransforms(data, info);
Error in matlab.io.datastore.TransformedDatastore/readall (line 300)
data{end+1} = read(copyds); %#ok<AGROW>
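My current suspicion: if the arrayDatastore was created with the default 'OutputType' of 'cell', each read wraps the sequence cell in another cell, so padsequences receives a cell of cells instead of a cell of numeric matrices. A possible fix would be to unwrap one level inside the transform (a sketch, unverified; it assumes that nesting is the actual cause):

```matlab
% Sketch, assuming each read x is a cell whose elements each wrap one
% sequence in another cell; [x{:}] removes that extra layer so
% padsequences receives a cell array of numeric matrices, as it requires.
padFcn = @(x) padsequences([x{:}], 2, 'Direction', 'both', ...
    'Length', 20, 'PaddingValue', 'symmetric', 'UniformOutput', false);
transformed_datastore_sequence = transform(datastore_sequence, padFcn);
preview(transformed_datastore_sequence)
```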
Any guidance would be appreciated. I think it's close; I just may not fully understand how the data need to be preprocessed before being combined into a datastore.
Below are my layer definitions and the options I pass to trainNetwork. For reference, I am running MATLAB R2023b:
lgraph = layerGraph();
tempLayers = [
sequenceInputLayer(3,"Name","input")
lstmLayer(256,"Name","lstm","OutputMode","last")
reluLayer("Name","relu")
fullyConnectedLayer(180,"Name","fc_2")
fullyConnectedLayer(13,"Name","fc")
flattenLayer("Name","flatten")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
featureInputLayer(13,"Name","featureinput")
fullyConnectedLayer(13,"Name","fc_3")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
concatenationLayer(1,2,"Name","concat")
fullyConnectedLayer(3,"Name","fc_1")
softmaxLayer("Name","softmax")
classificationLayer("Name","classification")];
lgraph = addLayers(lgraph,tempLayers);
clear tempLayers;
lgraph = connectLayers(lgraph,"flatten","concat/in1");
lgraph = connectLayers(lgraph,"fc_3","concat/in2");
my_options = trainingOptions('adam',...
'MaxEpochs', 12,...
'MiniBatchSize', 300,...
'SequencePaddingValue', 5,...
'ExecutionEnvironment','gpu',...
'Shuffle', 'every-epoch',...
'InitialLearnRate', 0.01,...
'LearnRateSchedule','piecewise',...
'LearnRateDropFactor', 0.2,...
'LearnRateDropPeriod', 5,...
'Verbose', true,...
'Plots', 'training-progress');