
EyePair recognition is not working

1 view (last 30 days)
sergio gomez guillen on 15 Jun 2021
I'm working on a project to recognize the eyes, mouth, nose and forehead in thermal recordings, so I can track them and save data, but the eye-pair detector doesn't find anything even if I change the MergeThreshold value. Do you have any idea what I should do to fix it? I thought about pre-processing the video and increasing the contrast to make it clearer for the algorithm, but I don't really know how to implement that in MATLAB, and maybe there is an easier way (I've added a rough sketch of what I had in mind at the end of the post). One of the main troubles with my code is the file extension: it's .ats or .seq (they are both quite similar), but most functions cannot read these kinds of files. If you need more information, ask me, and sorry if I did something wrong posting this, it's my first time. Thank you so much.
I'm using the Viola-Jones algorithm. I've added the script in case it's necessary; sorry about the code, it's quite messy but it's a work in progress.
%% bring in the first frame from the file
clear all
clc
%Detect objects using Viola-Jones Algorithm
%To detect Face
v = FlirMovieReader('faceRecognitionA8300sc-000003.ats')
[frameE, metadataE] = step(v);
frame1E = im2double(frameE);
frame2E = imadjust(frame1E);
figure, imshow(frame2E)
% vid = videoinput('winvideo',1);
% src = getselectedsource(vid);
% preview(vid);
% src.Contrast = 200;
%I CAN'T GET IT TO RECOGNIZE THE EYES: THE IMAGE IS BLACK AND WHITE AND
%VERY BRIGHT, SO EITHER LOWER THE CONTRAST OR CHANGE THE BASE CODE
eyeDetector = vision.CascadeObjectDetector('EyePairBig');
eyeDetector.MergeThreshold= 5;
%eyeDetector.MinSize=[ 100,100];
bboxE=step(eyeDetector,frame2E);
%% detect the eye pair on the first frame
for i = 1:size(bboxE,1)
% the detection already ran above, so each returned box is just drawn here
% bboxE2 = [bboxE(:,1)-2, bboxE(:,2)-10, bboxE(:,3), bboxE(:,4)+10];
rectangle('Position',bboxE(i,:),'LineWidth',4,'LineStyle','-','EdgeColor','b');
end
title('Eyes Detection');
hold off;
%Eyes= imcrop(frame2E,bboxE(i,:));
%Eyes=imcrop(frame2E,bboxE);
%figure,imshow(Eyes);
%figure,imshow(frame2E);
dispFrameE = insertObjectAnnotation(frame2E, 'rectangle', bboxE, 'eyes');
figure, imshow(dispFrameE)
%% Detection and Tracking
% Capture and process video frames from the video in a loop to detect and
% track a face. The loop will run until the video player
% window is closed.
% setup video player
videoPlayer = vision.DeployableVideoPlayer('Location', [11 45]);
videoPlayer.FrameRate = 60;
videoPlayer.Size = 'Full-screen';
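% KLT point tracker: points whose forward-backward tracking error exceeds 2 pixels are dropped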
pointTracker = vision.PointTracker('MaxBidirectionalError', 2);
runLoop = true;
numPtsE = 0;
frameCountE = 0;
while ~isDone(v) && runLoop
% Get the next frame.
videoFrameE = im2double(step(v));
videoFrameE = imadjust(videoFrameE);
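% the thermal frame is already a single channel, so it is used directly as the grayscale image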
videoFrameGrayE = videoFrameE;
frameCountE = frameCountE + 1;
%% new part: per-frame detection / tracking starts here
if numPtsE < 10
% Detection mode.
bboxE=step(eyeDetector,videoFrameGrayE);
if ~isempty(bboxE)
% Find corner points inside the detected region.
pointsE = detectMinEigenFeatures(videoFrameGrayE, 'ROI', bboxE(1, :));
% Re-initialize the point tracker.
xyPointsE = pointsE.Location;
numPtsE = size(xyPointsE,1);
release(pointTracker);
release(videoPlayer);
initialize(pointTracker, xyPointsE, videoFrameGrayE);
% Save a copy of the points.
oldPointsE = xyPointsE;
% Convert the rectangle represented as [x, y, w, h] into an
% 4-by-2 matrix of [x,y] coordinates of the four corners. This
% is needed to be able to transform the bounding box to display
% the orientation of the face.
bboxPointsE = bbox2points(bboxE(1, :));
% Convert the box corners into the [x1 y1 x2 y2 x3 y3 x4 y4]
% format required by insertShape.
bboxPolygonE = reshape(bboxPointsE', 1, []);
% Display a bounding box around the detected face.
videoFrameE = insertShape(videoFrameE, 'Polygon', bboxPolygonE, 'LineWidth', 3);
% Display detected corners.
videoFrameE = insertMarker(videoFrameE, xyPointsE, '+', 'Color', 'red');
end
else
% Tracking mode.
[xyPointsE, isFoundE] = step(pointTracker, videoFrameGrayE);
visiblePointsE = xyPointsE(isFoundE, :);
oldInliersE = oldPointsE(isFoundE, :);
numPtsE = size(visiblePointsE, 1);
if numPtsE >= 10
% Estimate the geometric transformation between the old points
% and the new points.
[xform, oldInliersE, visiblePointsE] = estimateGeometricTransform(...
oldInliersE, visiblePointsE, 'similarity', 'MaxDistance', 4);
% Apply the transformation to the bounding box.
bboxPointsE = transformPointsForward(xform, bboxPointsE);
% Convert the box corners into the [x1 y1 x2 y2 x3 y3 x4 y4]
% format required by insertShape.
bboxPolygonE = reshape(bboxPointsE', 1, []);
% Display a bounding box around the face being tracked.
videoFrameE = insertShape(videoFrameE, 'Polygon', bboxPolygonE, 'LineWidth', 3);
% Display tracked points.
videoFrameE = insertMarker(videoFrameE, visiblePointsE, '+', 'Color', 'red');
% Reset the points.
oldPointsE = visiblePointsE;
setPoints(pointTracker, oldPointsE);
end
end
% Display the annotated video frame using the video player object.
step(videoPlayer, videoFrameE);
% Check whether the video player window has been closed.
runLoop = isOpen(videoPlayer);
end
I can't attach the video file because it's too big, even compressed.
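Edit: since I can't share the recording, here is a rough sketch of the two things I had in mind, in case someone can tell me whether they make sense. I haven't verified that either of them actually fixes the detection. mat2gray, adapthisteq, im2uint8 and VideoWriter are standard MATLAB / Image Processing Toolbox functions, 'EyePairBig' and 'EyePairSmall' are the models that ship with vision.CascadeObjectDetector, and the file name is just the one from my script above.
% Sketch 1: normalize the raw thermal counts and boost local contrast
% before running the detector, then try both eye-pair models with a lower
% MergeThreshold (the default is 4; lower values are more permissive).
v = FlirMovieReader('faceRecognitionA8300sc-000003.ats');
[frame, ~] = step(v);

frameNorm = mat2gray(double(frame));   % scale the counts to [0,1]
frameEq   = adapthisteq(frameNorm);    % adaptive histogram equalization (CLAHE)

models = {'EyePairBig', 'EyePairSmall'};
bboxE = [];
for k = 1:numel(models)
    det = vision.CascadeObjectDetector(models{k});
    det.MergeThreshold = 2;
    bboxE = step(det, frameEq);
    if ~isempty(bboxE)
        fprintf('Eye pair found with the %s model\n', models{k});
        break
    end
end

if isempty(bboxE)
    disp('Still no eye pair detected on the first frame.');
else
    figure, imshow(insertObjectAnnotation(frameEq, 'rectangle', bboxE, 'eyes'));
end
And this is how I thought I could dump the .ats / .seq recording to a plain 8-bit .avi, so the standard video functions can read it afterwards (also untested end to end):
% Sketch 2: convert the FLIR recording to a grayscale AVI so that
% VideoReader and the other toolbox functions can open it.
v = FlirMovieReader('faceRecognitionA8300sc-000003.ats');
w = VideoWriter('faceRecognition.avi', 'Grayscale AVI');
open(w);
while ~isDone(v)
    [frame, ~] = step(v);
    frame8 = im2uint8(mat2gray(double(frame)));   % scale each frame to 8 bits
    writeVideo(w, frame8);
end
close(w);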

Answers (0)

Version

R2019a
