Help requested in using fitcsvm()
UserJ
on 9 May 2018
Commented: UserJ on 12 May 2018
Hi!
I am trying to use fitcsvm() to train an SVM. Previously, I was using LibSVM, and I know from those results that the best kernel for my problem is the RBF kernel. Now I want to find the kernel parameters. For this, I am using the following code:
opts = struct('Optimizer','bayesopt','ShowPlots',true,'Repartition',true);
svmmod = fitcsvm(ftTrn,CLTrn,'KernelFunction','rbf','OutlierFraction',0.05,...
    'OptimizeHyperparameters','auto','HyperparameterOptimizationOptions',opts);
% ftTrn: training data; CLTrn: corresponding class labels
1) Is this code right for my purpose?
2) svmmod contains the SVM trained on the entire training data or on a subset (on a fold used for determining the best values for the kernel parameters)?
3) Are there any other parameters I can tweak for improving the classification performance?
0 comments
Accepted Answer
Don Mathis
on 11 May 2018
Edited: Don Mathis on 11 May 2018
(1) Yes, that's right. In that case it will optimize BoxConstraint and KernelScale.
(2) svmmod contains the SVM trained on the entire training data, using the best hyperparameters found. The 5-fold cross-validated misclassification rate was used as the objective function during the optimization.
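(Editor's note, not from the original answer: a hedged sketch of how to inspect what the optimizer found. When fitcsvm is called with 'OptimizeHyperparameters', the returned model carries the optimization record in its HyperparameterOptimizationResults property, a BayesianOptimization object.)

```matlab
% Sketch: inspect the Bayesian optimization results attached to svmmod.
% Assumes svmmod was trained with 'OptimizeHyperparameters' as above.
results = svmmod.HyperparameterOptimizationResults;
results.XAtMinObjective   % hyperparameter values at the best observed point
results.MinObjective      % best 5-fold misclassification rate observed
```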
(3) You can optimize more variables. You can find out which hyperparameters are eligible like this:
>> h = hyperparameters('fitcsvm',ftTrn,CLTrn)
h =
  5×1 optimizableVariable array with properties:
    Name
    Range
    Type
    Transform
    Optimize
>> h.Name
ans =
    'BoxConstraint'
ans =
    'KernelScale'
ans =
    'KernelFunction'
ans =
    'PolynomialOrder'
ans =
    'Standardize'
And then you can optimize additional hyperparameters like this:
svmmod=fitcsvm(ftTrn,CLTrn,'KernelFunction','rbf','OutlierFraction',0.05,...
'OptimizeHyperparameters',{'BoxConstraint','KernelScale','Standardize'},'HyperparameterOptimizationOptions',opts)
Because you're fixing the kernel function, the 'KernelFunction' and 'PolynomialOrder' hyperparameters are not relevant, so 'Standardize' ends up being the only additional hyperparameter.
One more note: Since you're now optimizing 3 variables, you might want to run the optimization longer, say 60 evaluations:
opts=struct('Optimizer','bayesopt','ShowPlots',true, 'Repartition',1, 'MaxObjectiveEvaluations',60);
svmmod=fitcsvm(ftTrn,CLTrn,'KernelFunction','rbf','OutlierFraction',0.05,...
'OptimizeHyperparameters',{'BoxConstraint','KernelScale','Standardize'},'HyperparameterOptimizationOptions',opts)
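(Editor's note, not from the original answer: a hedged sketch of one way to sanity-check the final model. Since svmmod is a full ClassificationSVM trained on all the data, you can cross-validate it separately and compare the loss with the optimizer's estimate.)

```matlab
% Sketch: independently estimate the generalization error of the
% returned model using standard crossval/kfoldLoss calls.
cvmod = crossval(svmmod,'KFold',5);  % 5-fold cross-validated copy
loss  = kfoldLoss(cvmod)             % mean misclassification rate
```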
2 comments
Don Mathis
on 11 May 2018
Yet one more note: If you've got some time on your hands, why not let it try other kernel functions, too?
opts=struct('Optimizer','bayesopt','ShowPlots',true, 'Repartition',1, 'MaxObjectiveEvaluations',60);
svmmod=fitcsvm(ftTrn,CLTrn,'OutlierFraction',0.05,...
'OptimizeHyperparameters','all','HyperparameterOptimizationOptions',opts)
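(Editor's note, not from the original comment: a hedged sketch of how to see which kernel the optimizer settled on. A trained ClassificationSVM stores its kernel settings in the KernelParameters property.)

```matlab
% Sketch: after optimizing over 'all', check which kernel was chosen.
svmmod.KernelParameters.Function  % e.g. 'gaussian', 'linear', 'polynomial'
svmmod.KernelParameters.Scale     % the kernel scale actually used
```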
More Answers (0)