Why is accuracy zero?

sun rise on 8 Aug 2021
Commented: Walter Roberson on 27 Aug 2021
function [result] = multisvm(TrainingSet,GroupTrain,TestSet)
%Models a given training set with a corresponding group vector and
%classifies a given test set using an SVM classifier according to a
%one vs. all relation.
%
%This code was written by Cody Neuburger cneuburg@fau.edu
%Florida Atlantic University, Florida USA...
%This code was adapted and cleaned from Anand Mishra's multisvm function
%found at http://www.mathworks.com/matlabcentral/fileexchange/33170-multi-class-support-vector-machine/
GroupTrain = GroupTrain';
u = unique(GroupTrain);
numClasses = length(u);
%TestSet = TestSet';
%TrainingSet = TrainingSet';
result = zeros(length(TestSet(:,1)),1);
%build models
for k = 1:numClasses
    %Vectorized statement that binarizes Group
    %where 1 is the current class and 0 is all other classes
    G1vAll = (GroupTrain == u(k));
    models{k} = fitcsvm(TrainingSet,G1vAll);
end
%classify test cases
for j = 1:size(TestSet,1)
    for d = 1:numClasses
        if (predict(models{d},TestSet(j,:)))
            break;
        end
    end
    result(j) = d;
    %--------------------------------
end
%disp(result);
%disp(GroupTrain);
load Group_Test
Group_Test1 = Group_Test1;
%disp(Group_Test1);
%Accuracy = mean(Group_Test1==result)*100;
%fprintf('Accuracy = %f\n', Accuracy);
%fprintf('error rate = %f\n ', length(find(result ~= Group_Test1 ))/length(Group_Test1'));
c = 0;
for j = 1:size(TestSet,1)
    if Group_Test1(j) == result(j)
        c = c + 1;
    end
end
acc = c/100
end
2 comments
DGM on 9 Aug 2021
I'm assuming that the equality test in the if statement in the screenshot is never true. If these are floating point numbers, that's entirely possible.
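If the compared values really were floating point rather than integer class labels, the usual workaround is a tolerance-based comparison; a minimal sketch (the tolerance value here is an assumption, not something from the thread):
tol = 1e-6;                                       % assumed tolerance; adjust to the data
c = sum(abs(Group_Test1(:) - result(:)) < tol);   % count matches within tolerance instead of exact ==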
sun rise on 18 Aug 2021
The error is in the predict statement.


Accepted Answer

Walter Roberson on 9 Aug 2021
result(j) is going to be a class number, an integer represented in double precision.
GroupTrain is not necessarily an integer class number at all, and is not necessarily consecutive from 1 even if it is an integer. All we know is that it is a datatype that unique() can be applied to and that == comparisons work for.
For example, if GroupTrain contains 10, 20, 30, then u = unique(GroupTrain) would be 10, 20, 30, and the training loop would build one model for whether the class was 10, then one for whether it was 20, and so on. The classification loop then calls predict() for each model, and when the prediction is non-zero it records the class index rather than u() indexed at the class index. So the predictions might be perfect, but 1, 2, 3 would be recorded, and those would not match the 10, 20, 30 labels of the classes.
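A minimal sketch of the fix, keeping the classification loop from the question and assuming u = unique(GroupTrain) holds the actual class labels: record u(d) instead of the loop index d, so result ends up in the same label space as Group_Test1.
for j = 1:size(TestSet,1)
    for d = 1:numClasses
        if (predict(models{d},TestSet(j,:)))
            break;
        end
    end
    result(j) = u(d);   % store the class label itself, not the index into u
end
With labels rather than indices in result, a comparison such as mean(Group_Test1 == result)*100 compares like with like.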
21 comments
Walter Roberson on 26 Aug 2021
I let the code run for about 40 hours. By then it was using 132 gigabytes of memory. I got tired of it and canceled it; it really needs a rewrite.
Walter Roberson on 27 Aug 2021
I changed the imresize() to [400,100] in both places, and reran HOG_NEW, which ran without problem.
I then re-ran HOG_NEW. After about 2 hours I asked it to pause; about half an hour later it did pause, having just completed building the first classification tree out of 937. Estimated time to build the trees is therefore roughly
days(hours(2.5) * 937)
ans = 97.6042
which is more than 3 months.
I then asked it to apply that classification tree to all of the training data, which took about 20 minutes. That adds another
days(hours(1/3) * 937)
ans = 13.0139
So you should expect your code to take more than 4 months to run. Longer, since your system is slower than mine.
There is not much you can do to speed up building the classification trees... though possibly dropping the members of each class after that class is trained might help. Results would probably be less robust.
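A rough sketch of that idea, assuming the same one-vs-all training loop as in the question (rows of TrainingSet are samples, GroupTrain holds their labels): after the model for class u(k) is trained, drop that class's rows before training the next model. This is purely a speed heuristic and, as noted above, may make the models less robust.
remainingX = TrainingSet;
remainingY = GroupTrain;
for k = 1:numClasses
    G1vAll = (remainingY == u(k));              % current class vs. everything still remaining
    models{k} = fitcsvm(remainingX, G1vAll);    % train on the shrinking training set
    remainingX = remainingX(~G1vAll, :);        % discard the class that was just trained
    remainingY = remainingY(~G1vAll);
end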


More Answers (1)

sun rise on 20 Aug 2021
