Out of memory while training
layers = [ ...
    imageInputLayer([768 1024 3])        % full-resolution RGB input
    convolution2dLayer(5,20)             % 5x5 convolution, 20 filters
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(2)               % two classes (one per subfolder)
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', ...
    'MaxEpochs',20, ...
    'InitialLearnRate',1e-4, ...
    'Verbose',false, ...
    'Plots','training-progress');

% Labelled datastore: one subfolder per class under datasets/train
imds = imageDatastore('datasets/train', ...
    'IncludeSubfolders',true, ...
    'LabelSource','foldernames');

% Resize every image to the network input size on read
inputSize = [768 1024];
imds.ReadFcn = @(loc)imresize(imread(loc),inputSize);

% 51 images per label go to training, the rest to the test set
numTrainingFiles = 51;
[imdsTrain,imdsTest] = splitEachLabel(imds,numTrainingFiles,'randomize');

net = trainNetwork(imdsTrain,layers,options);
save('net.mat','net');
The code above trains a model on a set of images. I point it at the folder "datasets/train", which contains two subfolders of training images. The problem is that training fails with an out-of-memory error even though there are only about 100 images in total. When I train on a single subfolder, it runs flawlessly. The error may be caused by the following lines:
inputSize = [768 1024];
imds.ReadFcn = @(loc)imresize(imread(loc),inputSize);
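As a rough back-of-the-envelope check (assuming single-precision, 4-byte activations), each resized image and its first convolutional activation are already sizeable, so many images held at once add up quickly:

% Approximate per-image memory at the first layers, assuming single precision (4 bytes)
inputMB = 768*1024*3*4  / 2^20;   % resized RGB input: 9 MB per image
convMB  = 764*1020*20*4 / 2^20;   % 5x5 conv, 20 filters, no padding: ~59 MB per image
fprintf('input %.1f MB, conv activations %.1f MB per image\n', inputMB, convMB);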
Answers (1)
Gaurav Garg
on 20 Nov 2019
Hi,
Consider passing 'MiniBatchSize' and 'ExecutionEnvironment' as arguments to trainingOptions.
With 'MiniBatchSize' you specify how many images are processed per training iteration; the default is 128, so try a much smaller value such as 10 in your case to reduce memory usage.
With 'ExecutionEnvironment' you can set where training runs: 'cpu', 'gpu', or 'auto' (train on a GPU if one is available, otherwise the CPU).
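As a minimal sketch of those suggestions (the mini-batch size of 10 is just the value proposed above, not a tuned setting), the options from the question could be written as:

options = trainingOptions('sgdm', ...
    'MaxEpochs',20, ...
    'InitialLearnRate',1e-4, ...
    'MiniBatchSize',10, ...              % fewer images per iteration -> far less activation memory (default is 128)
    'ExecutionEnvironment','auto', ...   % train on a GPU if one is available, otherwise fall back to the CPU
    'Verbose',false, ...
    'Plots','training-progress');

The rest of the training script can stay the same; cutting the mini-batch from the default 128 to 10 shrinks the per-iteration activation memory by roughly a factor of 13.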