SVM and KNN hyperparameter optimization
I am attempting to optimize KNN and SVM classifiers with any optimization algorithm other than Naive Bayes.
Can anyone help, please?
Answers (1)
Alan Weiss
1 Jul 2022
Is this what you are looking for?
Alan Weiss
MATLAB mathematical toolbox documentation
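(The page linked in this answer is not preserved here; it presumably refers to the built-in Bayesian hyperparameter optimization in Statistics and Machine Learning Toolbox. A minimal sketch of that approach, assuming a predictor matrix X and a label vector Y:)
% Sketch (assumption): automatic hyperparameter optimization, which uses
% Bayesian optimization by default, for both classifiers in the question.
rng default % for reproducibility
svmModel = fitcsvm(X,Y,'OptimizeHyperparameters','auto'); % tunes BoxConstraint and KernelScale
knnModel = fitcknn(X,Y,'OptimizeHyperparameters','auto'); % tunes NumNeighbors and Distance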
2 comments
Alan Weiss
4 Jul 2022
To use a different algorithm you would have to run a different solver. I am not at all sure of the benefit of using a different algorithm, and I am not familiar with the Bat algorithm. Really, what do you expect to get that is better?
To use ga (genetic algorithm), you need a Global Optimization Toolbox license. To minimize the cross-validation error, you might want to fix the partition, and then use ga to minimize the error over several parameter settings. Something like this:
rng default
n = numel(Y); % number of observations (X is the predictor matrix, Y the class labels)
c = cvpartition(n,'KFold',5); % Fix a partition so every evaluation uses the same folds
% I assume that you want to optimize over x(1)=BoxConstraint and x(2)=KernelScale
lb = [1/10,1/10]; % Somewhat arbitrary bounds
ub = [10,10];
% Minimize the cross-validation loss
fun = @(x)kfoldLoss(fitcsvm(X,Y,'CVPartition',c,'BoxConstraint',x(1),'KernelScale',x(2)));
[sol,fval] = ga(fun,2,[],[],[],[],lb,ub);
% Now train the model on the optimal parameters
model = fitcsvm(X,Y,'BoxConstraint',sol(1),'KernelScale',sol(2));
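The question also asks about KNN; a parallel sketch (an assumption, not part of the original answer) tunes NumNeighbors with ga over the same fixed partition, using ga's integer-constraint argument:
% Sketch (assumption): tune x(1) = NumNeighbors for KNN with ga
lbK = 1; % at least one neighbor
ubK = 30; % somewhat arbitrary upper bound
funK = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x(1)));
[solK,fvalK] = ga(funK,1,[],[],[],[],lbK,ubK,[],1); % final argument marks variable 1 as integer
knnGAModel = fitcknn(X,Y,'NumNeighbors',solK(1)); % retrain on the full data with the best k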
But again, before you do this, I believe you should think about what you expect to get that is better than the automatic hyperparameter optimization using Bayesian optimization.
Alan Weiss
MATLAB mathematical toolbox documentation