# SVM and KNN hyperparameter optimization

11 views (last 30 days)
Ehsan Altayef on 30 Jun 2022
Commented: Alan Weiss on 4 Jul 2022
I am attempting to optimize the hyperparameters of KNN and SVM classifiers with an optimization algorithm other than Bayesian optimization.

Alan Weiss on 1 Jul 2022
Is this what you are looking for?
Alan Weiss
MATLAB mathematical toolbox documentation
Alan Weiss on 4 Jul 2022
To use a different algorithm you would have to run a different solver. I am not at all sure of the benefit of using a different algorithm, and I am not familiar with the Bath algorithm. Really, what do you expect to get that is better?
To use ga (genetic algorithm), you need a Global Optimization Toolbox license. To minimize the cross-validation error, you might want to fix the partition, and then use ga to minimize the error over the parameter settings. Something like this:
rng default
n = numel(Y); % Number of observations
c = cvpartition(n,'KFold',5); % Fix a partition
% I assume that you want to optimize over x(1)=BoxConstraint and x(2)=KernelScale
lb = [1/10,1/10]; % Somewhat arbitrary bounds
ub = [10,10];
% Minimize the cross-validation loss (note the capital L in kfoldLoss)
fun = @(x)kfoldLoss(fitcsvm(X,Y,'CVPartition',c,'BoxConstraint',x(1),'KernelScale',x(2)));
[sol,fval] = ga(fun,2,[],[],[],[],lb,ub);
% Now train the final model on the full data with the optimal parameters
model = fitcsvm(X,Y,'BoxConstraint',sol(1),'KernelScale',sol(2));
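If it helps to see what ga is doing under the hood, here is a minimal, language-agnostic sketch of the same idea in Python using only the standard library. The `ga_minimize` function and the toy quadratic loss are my own illustrative stand-ins, not MATLAB's ga: in practice the objective would be the fixed-partition cross-validation loss above, and the two decision variables would play the roles of BoxConstraint and KernelScale.

```python
import random

def ga_minimize(fun, lb, ub, pop_size=30, generations=50, mutation=0.1, seed=0):
    """Minimal real-coded genetic algorithm: tournament selection,
    uniform crossover, Gaussian mutation clipped to the bounds.
    This is a didactic sketch, not MATLAB's ga."""
    rng = random.Random(seed)
    dim = len(lb)
    # Random initial population inside the box [lb, ub]
    pop = [[rng.uniform(lb[i], ub[i]) for i in range(dim)] for _ in range(pop_size)]
    best, best_f = None, float("inf")
    for _ in range(generations):
        scores = [fun(ind) for ind in pop]
        for ind, s in zip(pop, scores):
            if s < best_f:
                best, best_f = ind[:], s
        def tournament():
            a, b = rng.randrange(pop_size), rng.randrange(pop_size)
            return pop[a] if scores[a] < scores[b] else pop[b]
        new_pop = []
        for _ in range(pop_size):
            p1, p2 = tournament(), tournament()
            # Uniform crossover: pick each coordinate from either parent
            child = [p1[i] if rng.random() < 0.5 else p2[i] for i in range(dim)]
            # Gaussian mutation, clipped back into bounds
            for i in range(dim):
                if rng.random() < mutation:
                    child[i] += rng.gauss(0, 0.1 * (ub[i] - lb[i]))
                    child[i] = min(max(child[i], lb[i]), ub[i])
            new_pop.append(child)
        pop = new_pop
    return best, best_f

# Toy stand-in for the cross-validation loss, with its minimum at (1, 2);
# in the real problem this would be the kfoldLoss of a fixed cvpartition.
loss = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
sol, fval = ga_minimize(loss, lb=[0.1, 0.1], ub=[10.0, 10.0])
```

The key point, mirrored from the MATLAB snippet, is that the partition is fixed outside the objective, so every candidate parameter vector is scored on the same folds and the GA compares like with like.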
But again, before you do this, I believe you should think about what you expect to get that is better than the automatic hyperparameter optimization using Bayesian optimization.
Alan Weiss
MATLAB mathematical toolbox documentation