how to use cross-validation in fitrgp

4 views (last 30 days)
Amend on 12 Jan 2017
Answered: Don Mathis on 23 Mar 2017
I find that there are two places in fitrgp() where we can do cross-validation:
  • cvgprMdl = fitrgp(x,y,'KernelFunction','squaredexponential','Holdout',0.25);
  • gprMdl = fitrgp(x,y,'KernelFunction','squaredexponential', ...
        'OptimizeHyperparameters','auto','HyperparameterOptimizationOptions',struct('Holdout',0.25));
I don't clearly understand the difference between the 'Holdout' option used in these two places.
Thank you.

Answers (1)

Don Mathis on 23 Mar 2017
Briefly: The first command specifies a holdout proportion for fitting a single model. The second command specifies the holdout proportion used inside the objective function of a Bayesian Optimization.
In more detail:
Your first command trains a single model on 75% of the dataset and outputs a "RegressionPartitionedModel". This contains the trained model in cvgprMdl.Trained{1}. You can get its holdout loss by doing:
loss = kfoldLoss(cvgprMdl)
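For example, a minimal sketch of this first workflow (x and y here are made-up synthetic data, not from the original question):
rng(0);                               % for reproducibility
x = linspace(0,10,200)';              % toy predictor
y = sin(x) + 0.2*randn(size(x));      % toy response
% 'Holdout',0.25 trains one GP on a random 75% of the rows
cvgprMdl = fitrgp(x,y,'KernelFunction','squaredexponential','Holdout',0.25);
trainedGP = cvgprMdl.Trained{1};      % the single trained compact model
holdoutMSE = kfoldLoss(cvgprMdl)      % MSE on the held-out 25%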
Your second command runs a Bayesian optimization in which 30 models are fit, each on the same 75% of the dataset, using different hyperparameters. The optimization searches for the hyperparameters that minimize the holdout loss on the remaining 25%. After the optimization completes, a final model is fit to 100% of the dataset using the optimal hyperparameters. The returned object is a "RegressionGP".
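A rough sketch of this second workflow, reusing the same synthetic x and y as above (variable names are illustrative):
% Each objective evaluation fits a GP on the same 75% split and scores it on the other 25%
gprMdl = fitrgp(x,y,'KernelFunction','squaredexponential', ...
    'OptimizeHyperparameters','auto', ...
    'HyperparameterOptimizationOptions',struct('Holdout',0.25));
% The returned object is a RegressionGP refit on 100% of the data
results = gprMdl.HyperparameterOptimizationResults;   % BayesianOptimization object
bestPoint = results.XAtMinObjective;                  % best hyperparameters found
yPred = predict(gprMdl,x);                            % predictions from the final model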
