Setting the best training / validation ratio in a Neural Network

Jose Marques on 4 May 2018
Edited: Greg Heath on 16 Jan 2019
I am using a neural network for regression, holding out 10% of the data for testing. How can I set the ratios of the training and validation sets?
  3 comments
Greg Heath on 16 Jan 2019
FYI: the default ratios are 0.7/0.15/0.15.
Do you have a specific reason for not accepting them?
Greg
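For reference, a minimal sketch (assuming the fitnet workflow from the Deep Learning Toolbox) of where those default ratios live on the network object, so they can be inspected or changed:

```matlab
% Sketch: inspecting the default data-division settings of a fitnet network.
net = fitnet(10);            % 10 hidden nodes
net.divideFcn                % 'dividerand' by default (random division)
net.divideParam.trainRatio   % 0.70
net.divideParam.valRatio     % 0.15
net.divideParam.testRatio    % 0.15
```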
Jose Marques on 16 Jan 2019
Thanks, guys!
I am comparing different regression algorithms (neural networks, SVM, regression trees, ensemble trees), so I am now using 30% of the samples as the test set for all algorithms.
To subdivide the remaining 70% for the neural networks, I use 56% for training and 14% for cross-validation. Do you think this is a good option?
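One way this 56/14/30 scheme could be realized in MATLAB (a sketch, assuming the Deep Learning Toolbox and its example dataset): hold out 30% of the samples externally, then let the network divide the remaining 70% as 80/20, which corresponds to 56% and 14% of the full set:

```matlab
% Sketch: 30% external test set, then 56%/14% train/validation of all data.
[x, t] = simplefit_dataset;            % example data (1-by-N inputs/targets)
n = size(x, 2);
idx = randperm(n);
nTest = round(0.30 * n);
testIdx  = idx(1:nTest);               % 30% held out for all algorithms
trainIdx = idx(nTest+1:end);           % remaining 70% for the network

net = fitnet(10);
net.divideParam.trainRatio = 0.80;     % 0.80 * 70% = 56% of all samples
net.divideParam.valRatio   = 0.20;     % 0.20 * 70% = 14% of all samples
net.divideParam.testRatio  = 0;        % test set is held out externally
net = train(net, x(:,trainIdx), t(:,trainIdx));

yTest = net(x(:,testIdx));             % evaluate on the held-out 30%
NMSE  = mse(t(testIdx) - yTest) / mse(t(testIdx) - mean(t(testIdx)))
```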


Answers (1)

Greg Heath on 16 Jan 2019
Edited: Greg Heath on 16 Jan 2019
1. ALWAYS START WITH 10 DESIGNS USING THE MATLAB DEFAULT!
2. Then evaluate the results to determine what to modify.
3. For regression the default is FITNET. So, look at the codes in
help fitnet
and
doc fitnet
4. They are the same:
[ x, t ] = simplefit_dataset;
H = 10;              % number of hidden nodes
net = fitnet(H);
net = train(net, x, t);
view(net)
y = net(x);
perf = perform(net, t, y)
5. Since I don't trust "perform", I add a normalized mean-square-error calculation, which typically ranges from 0 to 1:
NMSE = mse(t-y)/mse(t-mean(t)) % 0 <= NMSE <= 1
6. Search using
Greg NMSE
7. This is related to the familiar Rsquare (coefficient of determination) used in elementary statistics
(see any encyclopedia):
Rsquare = 1 - NMSE
8. If successful, the next step is to try to obtain good results with a smaller number of hidden nodes,
H < 10
9. Otherwise, increase H.
10. I have a jillion examples in both the NEWSGROUP and ANSWERS.
PS: This format sucks.
Greg
  2 comments
Jose Marques on 16 Jan 2019
Edited: Jose Marques on 16 Jan 2019
Greg,
thanks for your kindness. Your answers are always helpful.
I have some questions:
- Why don't you trust the function 'perform'? I just realized that I have my own function to calculate the MSE.
- I created a function to try to optimize these hyperparameters:
% Number of executions of 'calculate_error_NN'. In each execution,
% different samples are drawn for training and testing.
num_executions = 10;
% Variables to optimize
hidden1 = optimizableVariable('hidden1',[1,20],'Type','integer');
hidden2 = optimizableVariable('hidden2',[1,20],'Type','integer');
hidden3 = optimizableVariable('hidden3',[1,20],'Type','integer');
func_trein = optimizableVariable('func',{'trainlm','trainbfg','trainscg','traincgp'},'Type','categorical');
% Objective function to be minimized by Bayesian optimization.
% Note: the objective must use the fields of my_struct rather than fixed
% values, otherwise bayesopt evaluates the same point every time.
func = @(my_struct) calculate_error_NN(my_struct.hidden1, my_struct.hidden2, ...
    my_struct.hidden3, char(my_struct.func), num_executions);
results = bayesopt(func, [hidden1,hidden2,hidden3,func_trein], ...
    'Verbose',1, ...
    'MaxObjectiveEvaluations',1000, ...
    'MaxTime',100000, ...
    'PlotFcn','all');
Do you think this is a good approach? Which hyperparameters should I optimize?
Thanks a lot!
Greg Heath on 16 Jan 2019
Edited: Greg Heath on 16 Jan 2019
I SEE NO REASON FOR ITS EXISTENCE!
My approach is as simple as possible. Typically, I accept all defaults except a double for loop over a non-overfitting number of Hidden nodes and 10 or (RARELY!) 20 sets of random initial weights for each value of H.
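The double loop described above might be sketched like this (the maximum H, the number of trials, and the use of training-set NMSE as the selection score are illustrative assumptions, not Greg's exact code):

```matlab
% Sketch: loop over hidden-node counts H and random weight initializations,
% keeping the network with the lowest NMSE. Each call to fitnet/train
% starts from freshly randomized initial weights.
[x, t] = simplefit_dataset;
Hmax = 10; Ntrials = 10;
bestNMSE = Inf; bestH = NaN;
for H = 1:Hmax
    for trial = 1:Ntrials
        net = fitnet(H);
        net = train(net, x, t);
        y = net(x);
        NMSE = mse(t - y) / mse(t - mean(t));
        if NMSE < bestNMSE
            bestNMSE = NMSE; bestH = H; bestNet = net;
        end
    end
end
fprintf('Best H = %d, NMSE = %.4g\n', bestH, bestNMSE);
```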
I have posted jillions of examples in BOTH comp.soft-sys.matlab and ANSWERS.
HOPE THIS HELPS.
GREG

