Neural Network testing issues
I am a beginner with neural networks, working on a project in which I am supposed to recognize a person's face. I have developed the code and trained the network with the NN Toolbox on a specific database (5 people, each with 10 samples).
During testing, when I provide one of the pictures from the database the network was trained on, it works correctly (the output for the specific person is 1 or 0.99).
My problem is that when I provide a random picture with the same dimensions, the network never gives a zero output for all five faces.
Accepted Answer
Greg Heath
18 Jul 2013
Bottom line:
You don't have enough data to adequately characterize a space of dimension 10,304.
50 data points can span, AT MOST, 49 dimensions.
Use feature extraction to drastically reduce input dimensions.
Try searching the NEWSGROUP and ANSWERS for ways to extract features from face images.
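One common approach (not the only one) is PCA, i.e. "eigenfaces". A minimal sketch, assuming the 10304 x 50 input matrix discussed in this thread (the random matrix and the choice k = 20 are placeholders for illustration):

```matlab
% Sketch: PCA ("eigenfaces") feature extraction for a 10304 x 50 input matrix.
X  = rand(10304, 50);              % stand-in for the real face-image columns
mu = mean(X, 2);
Xc = X - repmat(mu, 1, size(X,2)); % center each pixel across the samples
% Economy-size SVD: with 50 samples there are at most 49 meaningful components
[U, S, ~] = svd(Xc, 'econ');
k = 20;                            % hypothetical choice; tune on validation data
features = U(:, 1:k)' * Xc;        % k x 50 matrix: a much smaller net input
```

The columns of `features` replace the raw 10304-pixel columns as the network input, shrinking the input dimension from 10304 to k.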
Hope this helps.
Greg
More Answers (6)
Greg Heath
12 Jul 2013
Are you using patternnet? If not, what?
What is the size of your input matrix, target matrix and number of hidden nodes?
What defaults have you overridden?
Your target columns should be 5-dimensional unit vectors with 1 unity component and the rest zeros.
trueclass = vec2ind(target) % target = ind2vec(trueclass)
Your output columns should be 5-dimensional consistent estimates of class posterior probability unit vectors, with the row index of the largest component indicating the assigned class. It is not necessary that the rows have unity sums. However, if you wish to quote posterior probabilities, just divide each output column by its sum.
assignedclass = vec2ind(output)
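A tiny sketch of that encoding; the labels and the fake output matrix below are made up purely for illustration:

```matlab
% Hypothetical class labels for 7 samples, 5 classes
trueclass = [1 2 3 4 5 1 2];
target    = full(ind2vec(trueclass));   % 5 x 7 one-hot target matrix
% Stand-in for the trained net's output (in practice: output = net(x))
output = target + 0.05*rand(size(target));
assignedclass = vec2ind(output);        % row index of largest component per column
% Column-normalize if you want posterior probability estimates
posteriors = bsxfun(@rdivide, output, sum(output,1));
```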
The most common reasons for poor performance on nontraining data are
1. The training samples are nonuniformly distributed among the classes
2. Nontraining samples from one or more classes are not sufficiently characterized by the training samples
3. The net is overfit with too many weights that are not sufficiently constrained by the training samples.
Hard to say more without more info and/or posted code.
Hope this helps.
Thank you for formally accepting my answer
Greg
Mehrukh Kamal
12 Jul 2013
Greg Heath
16 Jul 2013
Edited: 16 Jul 2013
This code doesn't help w.r.t. your problem. It just shows that you read in the input matrix and created the target matrix. One of many simpler approaches:
close all, clear all, clc
input_data = [];                        % will become a 35 x 45 matrix
target     = zeros(5,45);               % 5 classes x 45 samples
d1         = ones(1,9);                 % 9 samples per class
for i = 1:5
    for j = 1:9
        g = rand(35,1);                 % stand-in for one sample's feature vector
        input_data = [ input_data g ];  % append as a new column
    end
    k = 1 + 9*(i-1);
    target(i, k:k+8) = d1;              % class i owns columns k..k+8
end
How can I help with the neural net code if you don't show any?
Greg Heath
16 Jul 2013
What...no validation set? How did you avoid memorization by overtraining an overfit net?
[I N] = size(input_data) = [10304 50]
[O N] = size(target) = [5 50]
Neq = prod(size(target)) = N*O = 250
==> Even if you used all of your data for training, you would only have 250 training equations to estimate (for H = 10 hidden nodes)
Nw = (I+1)*H + (H+1)*O = 103,050 + 55 = 103,105
unknown weights!
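Those counts can be checked directly in MATLAB (H = 10 hidden nodes assumed, as above):

```matlab
I = 10304;  N = 50;  O = 5;  H = 10;   % dims from this thread; H = 10 assumed
Neq = N*O                   % 250 training equations
Nw  = (I+1)*H + (H+1)*O     % 103105 unknown weights
ratio = Nw/Neq              % ~412 weights per equation: hopelessly underdetermined
```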
You have to do some serious input feature extraction to reduce your input dimension to something much more reasonable.
See the comp.ai.neural-nets FAQ and posts regarding overfitting.
Even if you drastically reduce the input dimension, you should probably consider f-fold stratified cross-validation. For example, for each fold use, for each class, 8 for training, 1 for validation and 1 for testing. To get precise estimates of error rate you will have to make many runs. Keep track of running estimates of means and standard deviations of the validation and test error estimates. Do not include runs that do not converge to a reasonable estimate. If you plot the results (e.g., mean+/-stdv), you should see what is a reasonable threshold for exclusion.
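One fold of that 8/1/1-per-class split can be sketched as follows; the column layout (5 classes x 10 samples, stored class-by-class) is assumed for illustration:

```matlab
% One fold of stratified cross-validation: per class, 8 train / 1 val / 1 test.
% Assumes 5 classes x 10 samples, stored class-by-class in columns 1..50.
trainInd = [];  valInd = [];  testInd = [];
for c = 1:5
    cols = (c-1)*10 + randperm(10);    % shuffle the 10 samples of class c
    trainInd = [trainInd cols(1:8)];
    valInd   = [valInd   cols(9)];
    testInd  = [testInd  cols(10)];
end
% With patternnet, fix the division manually:
% net.divideFcn           = 'divideind';
% net.divideParam.trainInd = trainInd;
% net.divideParam.valInd   = valInd;
% net.divideParam.testInd  = testInd;
```

Repeating this over many random shuffles gives the runs whose mean and standard deviation of error rates you track.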
Hope this helps.
Thank you for formally accepting my answer
Greg