File Exchange


Simple Machine Learning Algorithms for Classification

version 1.1 (5.47 KB) by Jingwei Too
Simple and easy to implement. The machine learning algorithms include KNN, SVM, LDA, NB, RF and DT.


Updated 24 Oct 2020


This toolbox contains six widely used machine learning algorithms:
(1) K-nearest Neighbor (KNN)
(2) Support Vector Machine (SVM)
(3) Decision Tree (DT)
(4) Discriminant Analysis Classifier (DA)
(5) Naive Bayes (NB)
(6) Random Forest (RF)

The "Main" script shows examples of how to use these machine learning programs with the benchmark data set.

The displayed results include:
(1) Accuracy for each fold in k-fold cross-validation
(2) Average accuracy over k-folds
(3) Confusion matrix
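The workflow the toolbox automates can be sketched with MATLAB's own built-ins. The following is a minimal k-fold cross-validation example using `fitcknn` and `cvpartition` from the Statistics and Machine Learning Toolbox on the bundled `fisheriris` data; it does not use the toolbox's wrapper functions (e.g. `jKNN`), whose exact signatures are not shown on this page.

```matlab
% Minimal sketch: 5-fold cross-validation with a built-in KNN classifier.
load fisheriris                          % demo data: meas (features), species (labels)
k  = 5;
cv = cvpartition(species, 'KFold', k);   % stratified k-fold partition
acc = zeros(k, 1);
for i = 1:k
    Xtr = meas(training(cv, i), :);  ytr = species(training(cv, i));
    Xte = meas(test(cv, i), :);      yte = species(test(cv, i));
    mdl  = fitcknn(Xtr, ytr, 'NumNeighbors', 5);
    pred = predict(mdl, Xte);
    acc(i) = mean(strcmp(pred, yte));    % per-fold accuracy
end
fprintf('Fold accuracies: %s\n', mat2str(acc', 3));
fprintf('Average accuracy: %.3f\n', mean(acc));
```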


Comments and Ratings (15)

Chang hsiung

Amit DOegar

Nice work, we appreciate it. In random forest, when the features are numeric and the response is categorical, an error occurs in confusionmat:
Error using confusionmat (line 71)
G and GHAT need to be the same type.
Error in jRF (line 28)
con=confusionmat(ytest,pred);
Kindly advise.
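The error above means `confusionmat` received its two arguments in different classes (e.g. one categorical, one cell array of char). A common fix, assuming the `ytest`/`pred` variables from the error message, is to cast both to the same type before the call:

```matlab
% confusionmat requires G and GHAT to have the same class. Casting both
% label vectors to categorical makes the types match regardless of
% whether they started as categorical, cellstr, or string arrays.
con = confusionmat(categorical(ytest), categorical(pred));
```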

djim djim

I have got an error:
Undefined function 'fitcknn' for input arguments of type 'double'.

Error in jKNN (line 12)

Error in Main (line 18)

Could you help?
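"Undefined function 'fitcknn'" usually means the Statistics and Machine Learning Toolbox is not installed or licensed, since `fitcknn` ships with that toolbox. A quick way to check from the MATLAB command line:

```matlab
% Check whether the Statistics and Machine Learning Toolbox is available.
ver('stats')                            % lists the toolbox if installed
license('test', 'Statistics_Toolbox')   % returns 1 if a license is available
which fitcknn                           % prints the function's path, if found
```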

Financeo Putra

Excuse me, is there a way to display the confusion matrix for each fold?

Financeo Putra

Thank you very much for your answer, Jingwei. I hope you don't mind if I ask another question. I already display sensitivity and specificity values using classperf. My questions are: how can I display the sensitivity and specificity values for each fold, and is it possible to display the confusion matrix for each fold? Thank you very much for your reply.

Jingwei Too

Dear Financeo Putra,
If you do 5-fold cross-validation, the data is split into 80% training and 20% testing, and the test is performed 5 times. However, dividing the data into 80% training and 20% testing and then performing 10-fold validation is not possible, since 10-fold cross-validation splits the data into 90% training and 10% testing and performs the test 10 times.
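For the per-fold question above, one way to obtain a confusion matrix (and, for a two-class problem, sensitivity and specificity) in every fold is to drive the loop yourself with `cvpartition`. This sketch uses placeholder names `X` (features) and `y` (labels) and a built-in classifier rather than the toolbox's wrappers:

```matlab
% Per-fold confusion matrix, sensitivity, and specificity (two classes).
% X and y are placeholders for your feature matrix and label vector.
cv = cvpartition(y, 'KFold', 5);
for i = 1:cv.NumTestSets
    mdl  = fitcknn(X(training(cv, i), :), y(training(cv, i)));
    pred = predict(mdl, X(test(cv, i), :));
    C    = confusionmat(y(test(cv, i)), pred);   % confusion matrix of fold i
    sens = C(1,1) / sum(C(1,:));                 % sensitivity (first class)
    spec = C(2,2) / sum(C(2,:));                 % specificity (second class)
    fprintf('Fold %d: sensitivity %.3f, specificity %.3f\n', i, sens, spec);
end
```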

Financeo Putra

Thank you for your reply, Jingwei, I really appreciate it. What I mean is: is it possible to divide the data into 80% training and 20% testing and then perform 10-fold validation with your code? I already tried with holdout, but it only performs one test. In other words, can I do 10-fold validation on a fixed data partition? Thanks for your answer, I really appreciate it.

Jingwei Too

Dear Financeo Putra,
My source codes are only applicable when you wish to apply k-fold cross-validation. Moreover, cross-validation gives you more comprehensive results than the hold-out method.

Financeo Putra

Excuse me, could you tell me how to split the data into training and testing sets (80:20) with your code?
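For reference, an 80/20 holdout split (separate from the k-fold cross-validation the toolbox performs) can be done with `cvpartition`; `X` and `y` are placeholder names for your data:

```matlab
% Simple stratified 80/20 holdout split with cvpartition.
cv     = cvpartition(y, 'HoldOut', 0.2);       % 20% held out for testing
Xtrain = X(training(cv), :);  ytrain = y(training(cv));
Xtest  = X(test(cv), :);      ytest  = y(test(cv));
```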

fatma yasar

How to plot LDA? Please include code.

Jingwei Too

I do not provide a program for plotting. You need to use a scatterplot or other relevant code.
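Following the scatterplot suggestion above, a minimal sketch of visualizing class separation for an LDA-style classifier with `gscatter`, using the built-in `fisheriris` data as a stand-in for your own:

```matlab
% Fit a linear discriminant model and scatter-plot two features by class.
load fisheriris
mdl = fitcdiscr(meas, species);            % linear discriminant classifier
gscatter(meas(:,1), meas(:,2), species);   % points colored by class label
xlabel('Feature 1'); ylabel('Feature 2');
title('Class separation of the input data');
```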

Rasool Reddy

good code, thanks

Tee Wei Hown

MATLAB Release Compatibility
Created with R2018a
Compatible with any release
Platform Compatibility
Windows macOS Linux
