Tuning hyperparameters using Bayesian optimisation
Hi all,
I have configured an artificial neural network using 'fitnet' and need to tune the hyperparameters of the network, since it is performing rather poorly. I am still rather new to MATLAB and this is all new to me. Any help with understanding the Bayesian optimisation process in basic terms would be greatly appreciated. I am potentially looking to use the 'bayesopt' function but don't understand how it works.
Answers (1)
Katja Mogalle
on 6 May 2024
Hi Luke,
In simple terms, Bayesian optimization is an algorithm that helps you choose the best hyperparameters, i.e. the settings that define the structure or training circumstances of a neural network. You typically define a set of values for the algorithm to explore (e.g. network size, activation functions, training parameters), and Bayesian optimization takes care of figuring out which combinations of parameters to try in order to reach an optimal outcome (the objective function is typically the cross-validation loss).
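To make that loop concrete, here is a minimal, self-contained sketch of bayesopt minimizing a toy one-variable function (the function itself is purely illustrative, not a network):

```matlab
% Toy illustration of Bayesian optimization with bayesopt
% (requires Statistics and Machine Learning Toolbox).
xvar = optimizableVariable("x", [-5 5]);        % search range for one parameter
objFcn = @(p) (p.x - 2)^2 + sin(5*p.x);         % function we want to minimize
results = bayesopt(objFcn, xvar, ...
    "MaxObjectiveEvaluations", 15, "Verbose", 0, "PlotFcn", []);
results.XAtMinObjective    % best value of x found so far
```

For a neural network, the objective function would instead train the model with the candidate hyperparameters and return its cross-validation loss.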
Using the bayesopt function is one approach (it is the most flexible, but perhaps also the most difficult to use). However, depending on the complexity and size of your task, I can propose two other directions which might be easier for getting started:
A) If you think a smaller, less complex network can solve the task, try fitcnet (for classification) or fitrnet (for regression) from the Statistics and Machine Learning Toolbox. Both functions have built-in support for Bayesian hyperparameter optimization. For example, with the OptimizeHyperparameters option, the software attempts to minimize the cross-validation loss (error) by varying parameters such as the activation functions, layer sizes, preprocessing options, and more. These examples might help you get started:
- Improve Neural Network Classifier Using OptimizeHyperparameters
- Customize Neural Network Classifier Optimization
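As a hedged starting point, a sketch of that built-in route (using the fisheriris dataset that ships with the toolbox; the option values here are just one reasonable choice):

```matlab
% Sketch: let fitcnet tune its own hyperparameters via Bayesian optimization
load fisheriris          % built-in example data
X = meas;                % 150x4 numeric predictors
Y = species;             % class labels

rng("default")           % for reproducibility
Mdl = fitcnet(X, Y, ...
    "OptimizeHyperparameters", "auto", ...   % tunes activations, layer sizes, Lambda, ...
    "HyperparameterOptimizationOptions", ...
    struct("MaxObjectiveEvaluations", 30, "ShowPlots", false));

% Estimated generalization error of the tuned model
cvloss = kfoldLoss(crossval(Mdl))
```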
B) Alternatively, if you need a bigger and more flexible network architecture or you are working with image data, it might be better to start in the Deep Learning area. Here I can recommend the route via the Experiment Manager app, for example:
Or if, in the end, you need more control over the Bayesian optimization settings, here is an example using the bayesopt function: https://www.mathworks.com/help/deeplearning/ug/deep-learning-using-bayesian-optimization.html .
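If you go that route, here is a sketch of what calling bayesopt directly can look like for a fitcnet model (the tuned variables and their ranges are illustrative assumptions, not a recommendation):

```matlab
% Sketch: tune layer size and regularization of fitcnet with bayesopt directly
load fisheriris
X = meas; Y = species;

vars = [optimizableVariable("layerSize", [5 50], "Type", "integer")
        optimizableVariable("lambda", [1e-5 1e-1], "Transform", "log")];

% Objective: 5-fold cross-validation loss for one hyperparameter combination
objFcn = @(p) kfoldLoss(crossval( ...
    fitcnet(X, Y, "LayerSizes", p.layerSize, "Lambda", p.lambda), "KFold", 5));

results = bayesopt(objFcn, vars, "MaxObjectiveEvaluations", 20, "Verbose", 0);
best = bestPoint(results)     % best hyperparameters found
```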
Hope this helps.