
Training a TCN model to predict a Continuous Variable

12 views (last 30 days)
Isabelle Museck on 3 Jun 2024
Commented: Isabelle Museck on 4 Jun 2024
Hello there, I am trying to build a TCN model to predict a continuous variable, similar to the example here: https://www.mathworks.com/help/deeplearning/ug/sequence-to-sequence-classification-using-1-d-convolutions.html#SeqToSeqClassificationUsing1DConvAndModelFunctionExample-11. I have time series data in which I am using 3 input features (accelerometer measurements in the x, y, and z directions), but instead of classifying an activity, I am trying to estimate/predict a continuous variable. My predictors/input features are stored in a 10x1 cell array named "IMUdata" (each cell is one of the 10 trials), with each cell containing a 540x3 double holding the accelerometer data from that trial. The target continuous variable I am trying to predict is similarly stored in a 10x1 cell array named "Predvar", with each cell containing a 540x1 double. The code I have been trying so far is:
numFeatures = 3;
numFilters = 64;
filterSize = 5;
dropoutFactor = 0.005;
numBlocks = 4;

net = dlnetwork;
layer = sequenceInputLayer(numFeatures,Normalization="rescale-symmetric",Name="input");
net = addLayers(net,layer);
outputName = layer.Name;

for i = 1:numBlocks
    dilationFactor = 2^(i-1);
    layers = [
        convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal",Name="conv1_"+i)
        layerNormalizationLayer
        spatialDropoutLayer(Name="spat_drop_"+i,Probability=dropoutFactor)
        convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal")
        layerNormalizationLayer
        reluLayer
        spatialDropoutLayer(Name="spat_drop2_"+i,Probability=dropoutFactor)
        additionLayer(2,Name="add_"+i)];

    % Add and connect layers.
    net = addLayers(net,layers);
    net = connectLayers(net,outputName,"conv1_"+i);

    % Skip connection.
    if i == 1
        % Include convolution in first skip connection.
        layer = convolution1dLayer(1,numFilters,Name="convSkip");
        net = addLayers(net,layer);
        net = connectLayers(net,outputName,"convSkip");
        net = connectLayers(net,"convSkip","add_" + i + "/in2");
    else
        net = connectLayers(net,outputName,"add_" + i + "/in2");
    end

    % Update layer output name.
    outputName = "add_" + i;
end

layers = fullyConnectedLayer(1,Name="fc");
net = addLayers(net,layers);
net = connectLayers(net,outputName,"fc");

options = trainingOptions("adam", ...
    MaxEpochs=60, ...
    MiniBatchSize=1, ...
    InputDataFormats="CTB", ...
    Plots="training-progress", ...
    Metrics="rmse", ...
    Verbose=0);

net = trainnet(IMUdata,Predvar,net,"mse",options)
However, I am getting an error from the trainnet function saying:
Error setting data statistics of layer "input".
nnet.cnn.layer.SequenceInputLayer>iAssertValidStatistics (line 341)
Expected input to be of size 3x1, but it is of size 541x1.
Is this because of the way my input data is stored? I'm not sure how to fix this error, or whether this is the correct way to build a TCN model to predict a continuous variable. Any help is greatly appreciated!
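In case it helps reproduce the issue, data with the same layout as mine can be generated with random values like this (only the shapes match my real data):
% Synthetic stand-in for my data: 10 trials, each with 540 time steps of
% 3 accelerometer channels and a 540x1 continuous target.
numTrials = 10;
numSteps = 540;
IMUdata = cell(numTrials,1);
Predvar = cell(numTrials,1);
for k = 1:numTrials
    IMUdata{k} = randn(numSteps,3);   % 540x3 accelerometer data (x,y,z)
    Predvar{k} = randn(numSteps,1);   % 540x1 continuous target
end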

Answers (1)

Ganesh on 4 Jun 2024
I understand that you are trying to train a "Seq2Seq model" that predicts the sequence as a regression model, as opposed to classifying it as shown in the example.
From your description of the data, you are using a 10x1 cell array, with each row containing a different set of data points. This problem is dissimilar to the example you have been following, which is why you are getting the error.
To elaborate, the example you have been following uses a 1-D convolutional network. Its data is a 1x1 cell array, and the input to the network is the contents of that single cell, so the prediction at the output depends only on that one cell of data. Your code also works fine when doing the same, i.e. mapping the first entry of "IMUdata" to the corresponding entry of "Predvar". You may use the following code for further clarity:
s = load("HumanActivityTrain.mat");
YTest{1} = rand(1,length(XTest{1})); % Initializing Random Variables to YTest for a regression problem
numfeatures = 3;
numFilters = 64;
filterSize = 5;
droupoutFactor = 0.005;
numBlocks = 4;
layer = sequenceInputLayer(numfeatures,Normalization="rescale-symmetric",Name="input");
net = dlnetwork(layer);
outputName = layer.Name;
for i = 1:numBlocks
dilationFactor = 2^(i-1);
layers = [
convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal",Name="conv1_"+i)
layerNormalizationLayer
spatialDropoutLayer(droupoutFactor,Name="spat_drop_"+i)
convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal")
layerNormalizationLayer
reluLayer
spatialDropoutLayer(droupoutFactor,Name="spat_drop2_"+i)
additionLayer(2,Name="add_"+i)];
% Add and connect layers.
net = addLayers(net,layers);
net = connectLayers(net,outputName,"conv1_"+i);
% Skip connection.
if i == 1
% Include convolution in first skip connection.
layer = convolution1dLayer(1,numFilters,Name="convSkip");
net = addLayers(net,layer);
net = connectLayers(net,outputName,"convSkip");
net = connectLayers(net,"convSkip","add_" + i + "/in2");
else
net = connectLayers(net,outputName,"add_" + i + "/in2");
end
% Update layer output name.
outputName = "add_" + i;
end
layers = [
fullyConnectedLayer(1)];
net = addLayers(net,layers);
net = connectLayers(net,outputName,"fc");
options = trainingOptions("adam", ...
MaxEpochs=60, ...
miniBatchSize=1, ...
InputDataFormats="CTB", ...
Plots="training-progress", ...
Metrics="rmse", ...
Verbose=0);
net = trainnet(XTest,YTest,net,"mse",options);
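As a quick follow-up, once training finishes you can run the network forward on the same sequence to check the shape of the regression output (a small sketch; XTest{1} is assumed to be a channels-by-time matrix as in the example data):
% Forward pass on the example sequence to inspect the output shape.
X = dlarray(XTest{1},"CT");   % channels x time
YPred = predict(net,X);       % 1 x T output, one predicted value per time step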
If this is how you intend the code to work, then you can retrain the network on each row of "IMUdata" in turn, and it will act as a transfer-learning style model.
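A minimal sketch of what that trial-by-trial retraining loop could look like, assuming "IMUdata" and "Predvar" are the 10x1 cell arrays described in the question and "options" is defined as above:
% Retrain on each trial in turn, warm-starting from the weights learned
% on the previous trials (a sketch, not a tuned training setup).
for trial = 1:numel(IMUdata)
    Xtrial = IMUdata(trial);   % 1x1 cell with a 540x3 double
    Ttrial = Predvar(trial);   % 1x1 cell with a 540x1 double
    net = trainnet(Xtrial,Ttrial,net,"mse",options);
end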
However, from your question, I see that you would like to predict on all 10 rows at a time and generate outputs for each trial. The input data is then not of the form 3-by-T but of the form 10-by-3-by-T, where T is the number of time steps.
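To illustrate that batched layout (a sketch assuming the cell arrays from the question; the dimension ordering below follows the "CTB" format used in the training options):
% Stack the 10 trials into one channels-by-time-by-batch array (3 x 540 x 10).
numTrials = numel(IMUdata);
Xall = zeros(3,540,numTrials);
for k = 1:numTrials
    Xall(:,:,k) = IMUdata{k}';   % transpose each 540x3 trial to 3x540
end
size(Xall)   % 3 540 10, i.e. C x T x B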
I would recommend looking at 2-D convolutional layers for this problem and refactoring your code accordingly to fit your requirements.
Hope this helps!
1 comment
Isabelle Museck on 4 Jun 2024
Hello there, I appreciate your response. I am trying to train the model on the data from all 10 trials (so each row in the IMUdata cell). How would I adjust my code to build a transfer learning model that is trained on data from all 10 trials? Or would it be more proper to put the data from all 10 trials into one large table and train the TCN model on all the data at once?


