How do Bayesian optimization and cross-validation work?

6 views (last 30 days)
Dimitri on 23 Aug 2019
Commented: Don Mathis on 25 Jan 2021
Hello,
I was wondering how exactly the hyperparameter optimization works in this example: Example. The default setting is 5-fold cross-validation, but the output is a normal RegressionSVM and not a RegressionPartitionedSVM. Here is how I understand the process; please give me feedback.
Let's consider the first step of the hyperparameter optimization. The algorithm chooses an initial hyperparameter setting and learns a model on 4/5 of the data. It then evaluates the performance on the remaining 1/5 of the data. What happens next? Is this hyperparameter setting used again, with another model learned on a different 4/5 of the data? After 5 iterations you would have 5 objective function values, which are then used to calculate the loss? This loss would be the final loss for the first hyperparameter setting, and the whole procedure is repeated 30 times?

Accepted Answer

Don Mathis on 23 Aug 2019
Edited: Don Mathis on 23 Aug 2019
In each iteration of the optimization, fitrsvm is called with 5-fold cross-validation, using a particular vector of hyperparameters. This results in a RegressionPartitionedSVM. Then the kfoldLoss method is called on that object, obtaining the loss for that vector of hyperparameters. That loss value is printed in the command-line display in the "Objective" column for that iteration.
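A rough sketch of what one such iteration amounts to, written out by hand (assuming predictor data X and response Y; the BoxConstraint/KernelScale/Epsilon values below are placeholders standing in for whatever candidate vector the optimizer picked for that iteration):
% Cross-validated fit for one candidate hyperparameter vector
cvMdl = fitrsvm(X, Y, 'KFold', 5, ...
    'BoxConstraint', 1.5, 'KernelScale', 0.8, 'Epsilon', 0.1);
% cvMdl is a RegressionPartitionedSVM holding the 5 fold models
objective = kfoldLoss(cvMdl)   % the value shown in the "Objective" column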
In the next iteration, a new vector of hyperparameters is chosen, and the process is repeated.
Finally, after 30 iterations (by default), the "best" hyperparameter vector is chosen, and a final model is trained on the entire dataset using those hyperparameters, without cross-validation. That final RegressionSVM model is returned.
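In code, the whole procedure corresponds roughly to a single call like the one below (again assuming predictor data X and response Y; the option values simply mirror the defaults described above):
% Bayesian optimization: each of the 30 evaluations runs the 5-fold
% cross-validation described above for one candidate hyperparameter vector
mdl = fitrsvm(X, Y, 'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', ...
    struct('KFold', 5, 'MaxObjectiveEvaluations', 30));
% mdl is a plain RegressionSVM retrained on all the data with the best
% hyperparameters found; it is not a RegressionPartitionedSVM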
  3 comments
Sinan Islam on 23 Jan 2021
@Don Mathis I optimized a model with cross-validation, but now I don't know how to print the cross-validation error. Using kfoldLoss on the optimized model does not work.
Don Mathis on 25 Jan 2021
You need to call crossval() on the model, which will return a partitioned model, then call kfoldLoss on that.
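A minimal sketch, assuming mdl is the optimized RegressionSVM returned by fitrsvm:
cvMdl = crossval(mdl, 'KFold', 5);   % returns a RegressionPartitionedSVM
cvError = kfoldLoss(cvMdl)           % cross-validation loss (MSE by default)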


More Answers (0)

Release: R2019a