Naive Bayes
Avishek Dutta on 7 Jun 2012
Commented: Abrham Debasu on 18 Nov 2014
Hi all,
As a beginner, I am quite confused about the NaiveBayes classifier.
The examples I see look like this:
O1 = NaiveBayes.fit(meas,species);
C1 = O1.predict(meas);
cMat1 = confusionmat(species,C1)
or
nbGau= NaiveBayes.fit(meas(:,1:2), species);
nbGauClass= nbGau.predict(meas(:,1:2));
The function always takes two inputs. My task is to compare different classification methods on IMU data. I have seven scenarios such as walk, run, stair-up, etc. I have a sample dataset, which is an extract of my full training data (extracted randomly from the plot by selecting 2 points), i.e., data from all 7 scenarios combined.
Using classify(sample,training,group) I get good results, but with NaiveBayes I see no option to pass this sample data as input.
Am I missing something basic?
Please Help.
Avishek
Accepted Answer
Ilya
on 7 Jun 2012
"training" is used to train the classifier (pass it to the FIT method), and "sample" is used to test the classifier's performance on data not used for training (pass it to the PREDICT method). For example:
O1 = NaiveBayes.fit(training,group);
C1 = O1.predict(sample);
cMat1 = confusionmat(sampleGroup,C1);
where sampleGroup is an array of true class labels for the predictor matrix in "sample".
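As a quick check, overall accuracy can then be read off the confusion matrix; a minimal sketch using the variables from the snippet above:

```matlab
% Correct predictions lie on the diagonal of the confusion matrix
accuracy = sum(diag(cMat1)) / sum(cMat1(:))
```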
More Answers (1)
Ilya
on 7 Jun 2012
If you use ClassificationTree, introduced in R2011a, you can use the same syntax with FIT and PREDICT. If you use classregtree, use the EVAL method to predict.
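For instance, a minimal sketch of the classregtree workflow, assuming the Fisher iris variables meas and species from the original question:

```matlab
load fisheriris                          % meas: 150x4 predictors, species: class labels
t = classregtree(meas, species);         % grow a classification tree
labels = eval(t, meas);                  % EVAL predicts labels for a predictor matrix
cMat = confusionmat(species, labels)     % confusion matrix (here, on the training data)
```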
You can run methods(obj) and properties(obj) on any object, such as classregtree or NaiveBayes, to see a list of all its methods and properties. For instance,
methods(t)
would give you a long list with EVAL on it.
Ilya
on 7 Jun 2012
If you care only about classification accuracy, use any classifier you like and measure its accuracy by cross-validation or on an independent test set. Ensemble techniques such as TreeBagger, introduced in R2009a, or fitensemble, introduced in R2011a, tend to be very powerful and versatile (they work on data of many kinds); all are in Statistics Toolbox. You can also try k-NN classification in Statistics Toolbox. There are neural nets in Neural Network Toolbox and SVM in Bioinformatics Toolbox.
If you need a classifier with an interpretable structure, go with something simple such as LDA, Naive Bayes, or a decision tree.
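A minimal sketch of estimating accuracy by 10-fold cross-validation with NaiveBayes, using the Fisher iris data as a stand-in for the IMU features and scenario labels:

```matlab
load fisheriris                                   % meas: predictors, species: labels
cp = cvpartition(species, 'kfold', 10);           % stratified 10-fold partition
correct = 0;
for i = 1:cp.NumTestSets
    trIdx = cp.training(i);                       % logical index of training fold
    teIdx = cp.test(i);                           % logical index of held-out fold
    nb = NaiveBayes.fit(meas(trIdx,:), species(trIdx));
    pred = nb.predict(meas(teIdx,:));
    correct = correct + sum(strcmp(pred, species(teIdx)));
end
accuracy = correct / numel(species)               % cross-validated accuracy estimate
```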