trainnet function - randomness and repeatability

Hello all,
I'm testing the trainnet function. When I run the same script multiple times, the training outputs are slightly different, even though I have removed every source of randomness I'm aware of (random label splitting, batch shuffling, etc.).
So may I ask: does anybody know why the outcomes are slightly different each time? The basic steps of my script are below.
training_imds = imageDatastore(Training_data_folder,"IncludeSubfolders",true,"LabelSource","foldernames");
%-----------------------
% Training data split
% only part of the training dataset is used for training
% training_imdsVal is not used
[training_imds_Train,training_imdsVal] = splitEachLabel(training_imds,0.3);
training_imds = training_imds_Train;
%-----------------------
% Training process - train - val data split
[training_imds_Train,training_imds_Val] = splitEachLabel(training_imds,0.9);
training_imds_Train_au = augmentedImageDatastore([imHeight imWidth],training_imds_Train);
training_imds_Val_au = augmentedImageDatastore([imHeight imWidth],training_imds_Val);
layers = [
    imageInputLayer([imHeight imWidth 3])   % image size and RGB (=3)
    convolution2dLayer(20,20)
    reluLayer()
    maxPooling2dLayer(3)
    fullyConnectedLayer(2)
    softmaxLayer()
];
options = trainingOptions("sgdm", ...
    Metrics="accuracy", ...
    InitialLearnRate=0.000001, ...
    ValidationData=training_imds_Val_au, ...
    MiniBatchSize=128, ...
    ValidationFrequency=25, ...
    ValidationPatience=5, ...
    MaxEpochs=1, ...
    LearnRateSchedule="piecewise", ...
    LearnRateDropPeriod=5, ...
    ExecutionEnvironment="cpu");
trained_net = trainnet(training_imds_Train_au,layers,"crossentropy",options);
When I run this multiple times, I get different outcomes, e.g.:
Iteration    Epoch    TimeElapsed    LearnRate    TrainingLoss    ValidationLoss    TrainingAccuracy    ValidationAccuracy
_________    _____    ___________    _________    ____________    ______________    ________________    __________________
        0        0       00:00:10        1e-06                            1.4484                                    81.864
        1        1       00:00:10        1e-06          5.8723                              60.156
       25        1       00:01:31        1e-06          1.3652         0.40944              91.406                  97.229
       50        1       00:03:17        1e-06         0.12455       1.051e-09              99.219                     100
       55        1       00:03:43        1e-06         0.16857               0              98.438                     100
Training stopped: Max epochs completed
Iteration    Epoch    TimeElapsed    LearnRate    TrainingLoss    ValidationLoss    TrainingAccuracy    ValidationAccuracy
_________    _____    ___________    _________    ____________    ______________    ________________    __________________
        0        0       00:00:09        1e-06                            7.5433                                    51.637
        1        1       00:00:10        1e-06          8.6849                              41.406
       25        1       00:01:27        1e-06          0.9964         0.14055               93.75                  99.118
       50        1       00:03:03        1e-06         0.62306        0.022228              96.094                  99.748
       55        1       00:03:23        1e-06         0.12455        0.009824              99.219                  99.874
Training stopped: Max epochs completed

Accepted Answer

Steven Lord

What happens when you reset the state or seed of the random number generator before each attempt to train the network? Let's choose an arbitrary seed value and generate some numbers.
rng(42)
x1 = rand(1, 5);
If we reset the seed to the same value, the generator starts in the same place and generates the same numbers.
rng(42)
x2 = rand(1, 5);
isequal(x1, x2) % Same values, down to the last bit
ans = logical
1
But continuing to draw from the generator does not reproduce the values from the freshly reset state.
x3 = rand(1, 5);
isequal(x1, x3) % No, x3 contains different values
ans = logical
0
You may have removed the randomness you can see in your code, but I believe the network's learnable parameters are still initialized internally with random starting values before training.
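Applied to the script in the question, a minimal sketch would be to call rng with the same fixed seed immediately before each training run (reusing the datastores, layers, and options already defined there; the seed value 42 is arbitrary):

```matlab
% Seed the generator right before training so that the random weight
% initialization (and the default one-time shuffle) draws the same
% random sequence on every run.
rng(42)
trained_net_a = trainnet(training_imds_Train_au,layers,"crossentropy",options);

rng(42)  % same seed again -> same initial Learnables, same shuffle order
trained_net_b = trainnet(training_imds_Train_au,layers,"crossentropy",options);
```

With ExecutionEnvironment="cpu", as in your options, the two runs should then produce matching training tables.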

1 comment

Oldrich (about 4 hours ago)
Thank you very much for the clarification. I was not familiar with these random generator settings.


More Answers (2)

Matt J (about 22 hours ago)
Edited: Matt J (about 22 hours ago)

0 votes

In addition to random initialization of the Learnables, as mentioned by @Steven Lord, you are using the default Shuffle setting, which performs a random reordering of the training inputs once at the beginning of the training process.
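For completeness, a sketch of how that reordering could be disabled via the Shuffle option of trainingOptions (only the options relevant to this point are shown; the rest would stay as in the question):

```matlab
% Shuffle="never" keeps the training data in datastore order, removing
% the one-time random reordering that the default Shuffle="once" performs.
options = trainingOptions("sgdm", ...
    Shuffle="never", ...
    ExecutionEnvironment="cpu");
```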

1 comment

Oldrich (about 4 hours ago)
Thank you for your comment - you are right about shuffling.


Oldrich (about 4 hours ago)

0 votes

When I applied the suggestions of @Steven Lord and @Matt J (resetting the random generator at the beginning of each run and removing shuffling entirely), I received the same results every time.
@Steven Lord @Matt J Thank you both very much for clarifying these network training details.

1 comment

Matt J (about 2 hours ago)
If you reset the random number generator, turning off shuffling should make no difference.


Version: R2025b

Asked: 9 Apr 2026 at 12:03
Commented: about 11 hours ago
