How to do a grid search to optimize sigma using MATLAB?
Diver on 28 Jun 2015
Commented: Dariush Javani on 12 May 2023
Hi,
I'm new to SVM, so please take this into account when you answer the question.
I'm trying to classify the IRIS data using MATLAB. I chose fitcecoc because it can classify multiple features and classes.
Next, I need to search for the best value for sigma. My understanding is that I need to do something called a "grid search". However, I have no clue how to do a grid search using MATLAB.
Please note that since I'm new to MATLAB, I might be asking the wrong questions in the first place.
0 comments
Accepted Answer
Walter Roberson on 29 Jun 2015
Edited: Walter Roberson on 17 May 2017
firstparam = [1, 2, 3.3, 3.7, 8, 21];  % list of places to search for the first parameter
secondparam = linspace(0, 1, 20);      % list of places to search for the second parameter
[F, S] = ndgrid(firstparam, secondparam);
fitresult = arrayfun(@(p1, p2) fittingfunction(p1, p2), F, S);  % run a fitting on every pair, fittingfunction(F(J,K), S(J,K))
[minval, minidx] = min(fitresult(:));  % minimize over the whole grid; minidx is a linear index
bestFirst = F(minidx);
bestSecond = S(minidx);
Now the fitting was best at the values bestFirst and bestSecond.
It is common to have a range of values for each parameter; in that case you use linspace() to sample within the range. The number of points you ask for from linspace() determines how fine a grid you search.
When you are searching something that should be somewhat smooth, you can use a coarse grid to determine the general area to search, and then refine the search in the area that looks most promising. I gave an example of code for that in http://uk.mathworks.com/matlabcentral/answers/222803-how-to-fit-6-curves-simultaneously-to-solve-for-2-unknowns
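Applied to the original question, here is a minimal sketch of the same pattern (assuming a Gaussian/RBF kernel SVM, where sigma corresponds to templateSVM's KernelScale; the BoxConstraint grid is just an assumed second parameter so the shape matches the skeleton above, and cross-validation loss serves as the fitting function):
load fisheriris                          % the IRIS data: meas (features), species (labels)
sigmas = logspace(-2, 2, 10);            % candidate values for sigma (KernelScale)
boxes  = logspace(-2, 2, 10);            % candidate values for BoxConstraint (assumed second parameter)
[SG, BX] = ndgrid(sigmas, boxes);
cvloss = @(sg, bx) kfoldLoss(fitcecoc(meas, species, ...
    'Learners', templateSVM('KernelFunction', 'gaussian', ...
                            'KernelScale', sg, 'BoxConstraint', bx), ...
    'CrossVal', 'on', 'KFold', 5));      % 5-fold cross-validation loss at one grid point
losses = arrayfun(cvloss, SG, BX);       % fit at every grid point (100 models here, so this takes a while)
[bestLoss, bestIdx] = min(losses(:));    % linear index of the best grid point
bestSigma = SG(bestIdx);
bestBox   = BX(bestIdx);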
7 comments
Walter Roberson on 5 May 2023
p1 is "this specific value out of all of the possible values of the first parameter"
p2 is "this specific value out of all of the possible values of the second parameter"
Generally speaking, with optimization there are a few different possibilities:
- the permitted values for the parameters might be continuous -- for example not just 0.1 or 0.2 but also all representable values in between. In such a case you can use algorithms based on calculating or estimating Jacobians to figure out the best direction to move; or
- the permitted values for the parameters might be discrete but the function might be fairly smooth. In such a case, knowing that a particular combination of parameters results in a high value can hint that there is no point in searching "near" that combination; the information from one location can be used to guide the search, resulting in shorter searches than would otherwise be the case; or
- the permitted values for the parameters might be discrete, but the function might be substantially irregular, so knowing the value at one location might not give you any useful information about the value at nearby locations. In this case, you might have to just test all combinations of parameters and pick the combination that worked best.
Sometimes, in the case where a discrete function is fairly smooth but the number of possible combinations of parameters is not "too big", it can be easier to just check every combination instead of bothering to be "smart" about which combinations to examine.
The code I posted above is the outline for "just test all combinations" for two parameters. You can use very similar techniques if you have additional parameters. The more parameters you have, the faster the number of combinations grows, so testing all combinations can become impractical as the number of parameters increases.
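For example, a minimal sketch of the same pattern extended to three parameters (fittingfunction is the same placeholder as in the answer above, and the grids are assumed):
firstparam  = linspace(0, 1, 10);
secondparam = linspace(0, 1, 10);
thirdparam  = linspace(0, 1, 10);
[F, S, T] = ndgrid(firstparam, secondparam, thirdparam);  % 10^3 = 1000 combinations
fitresult = arrayfun(@(p1, p2, p3) fittingfunction(p1, p2, p3), F, S, T);
[minval, minidx] = min(fitresult(:));                     % linear index into the 3-D grid
bestFirst  = F(minidx);
bestSecond = S(minidx);
bestThird  = T(minidx);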
Testing all combinations is also sometimes used when the range of computed values is pretty similar but it is important to get the absolute best combination. For example, it might not be difficult to find a combination of parameters that makes a chemical process 99.1% efficient, but there might be one combination that makes it 99.12% efficient. For a lot of purposes the easily obtained 99.1% version might be quite good enough, but in some cases the extra trouble of controlling the process to exactly 99.12% might be worth it. If 99.1% is "good enough" then an algorithm such as particle swarm might be "good enough", but from time to time testing every combination is worth it.
Dariush Javani on 12 May 2023
Thank you Walter for your comprehensive response.
More Answers (1)
Don Mathis on 24 Apr 2021
Nowadays you can just do this:
load fisheriris
bestModel = fitcecoc(meas, species, 'OptimizeHyperparameters','auto')
2 comments
krishna Chauhan on 19 Sep 2022
@Don Mathis So what does it signify here? Only cross validation as "one vs one", since that is the only parameter we can set before training? Please guide.
Don Mathis on 19 Sep 2022
The 'OptimizeHyperparameters' argument tells fitcecoc to use Bayesian optimization to optimize some of the hyperparameters. The doc page below explains which hyperparameters are optimized. Grid search is simpler than Bayesian optimization, but it is usually slower at finding the best parameters when multiple parameters are being optimized.
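If you specifically want a grid search rather than Bayesian optimization, you can request it through the 'HyperparameterOptimizationOptions' argument; a minimal sketch (the field names below follow the documented options struct, while the grid resolution and fold count are assumptions):
load fisheriris
opts = struct('Optimizer', 'gridsearch', ...   % exhaustive grid instead of the default bayesopt
              'NumGridDivisions', 10, ...      % grid points per numeric hyperparameter (assumed)
              'KFold', 5);                     % cross-validation folds for the objective (assumed)
bestModel = fitcecoc(meas, species, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', opts);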