How do Bayesian optimization and cross-validation work together?
Hello,
I was wondering how exactly the hyperparameter optimization works in this example: Example. The default setting is 5-fold cross-validation, yet the output is a normal RegressionSVM and not a RegressionPartitionedSVM. Below is how I understand the process; please give me feedback.
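To be concrete, this is the kind of call I mean (a sketch paraphrasing the documentation example; `X` and `Y` are placeholders for the predictor and response data):

```matlab
% Hyperparameter optimization with the default 5-fold cross-validation
Mdl = fitrsvm(X, Y, 'OptimizeHyperparameters', 'auto', ...
              'HyperparameterOptimizationOptions', struct('KFold', 5));
class(Mdl)   % 'RegressionSVM', not 'RegressionPartitionedSVM'
```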
Let's consider the first step of the hyperparameter optimization. The algorithm chooses an initial hyperparameter setting and trains a model on 4/5 of the data, then evaluates its performance on the remaining 1/5. What happens next? Is the same hyperparameter setting used again, with another model trained on a different 4/5 of the data? After 5 such iterations you have 5 objective-function values, which are averaged to compute the loss? That averaged loss is the final loss for the first hyperparameter setting, and this whole procedure is then repeated 30 times? See the sketch below for how I picture it.
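In code, my understanding looks roughly like this (only a sketch of my mental model, not the actual bayesopt implementation; `proposeNextPoint` is a hypothetical stand-in for however the Gaussian-process model picks the next setting):

```matlab
rng default
for iter = 1:30                                    % 30 objective evaluations (default)
    params = proposeNextPoint();                   % hypothetical: next hyperparameter setting
    cvMdl  = fitrsvm(X, Y, ...
                     'BoxConstraint', params.BoxConstraint, ...
                     'KernelScale',   params.KernelScale, ...
                     'Epsilon',       params.Epsilon, ...
                     'KFold', 5);                  % trains 5 models, each on 4/5 of the data
    objective(iter) = kfoldLoss(cvMdl);            % average loss over the 5 held-out folds
end
% Presumably the best setting found is then used to train one final model
% on all of the data, which would explain why the output is a RegressionSVM.
```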