Motion analysis in captured video
I have a camera mounted on a car capturing video. The car starts and stops frequently, and I want to figure out when it stops and starts. There is enough disturbance in the environment (uneven ground, even a breeze) to cause large frame-to-frame changes, so simple frame differencing does not give me what I need; I have already tried it. There are also no visual markers to distinguish when the car stops. This does not need to be online analysis; I can process the video offline.
Any ideas to try would be greatly appreciated.
Answers (1)
  Ayush
 on 19 Jul 2025
        Hi Sushma, 
I understand you are trying to separate motion caused by the camera (ego-motion) from motion in the environment.
Since you have mentioned that naive frame differencing isn't working for you, here are some more advanced methods you can try:
1. Feature-based motion estimation: You can track stable features (e.g. using ORB or SIFT) across frames and estimate camera motion from how those features shift.
Here is MATLAB pseudo-code for this approach:
video = VideoReader('your_video.mp4');
prevFrame = readFrame(video);
prevGray = rgb2gray(prevFrame);
motionVec = []; % per-frame camera translation magnitude
while hasFrame(video)
    currFrame = readFrame(video);
    currGray = rgb2gray(currFrame);
    % Detect and describe ORB features in both frames
    pointsPrev = detectORBFeatures(prevGray);
    [featuresPrev, validPrev] = extractFeatures(prevGray, pointsPrev);
    pointsCurr = detectORBFeatures(currGray);
    [featuresCurr, validCurr] = extractFeatures(currGray, pointsCurr);
    % Match features between consecutive frames
    indexPairs = matchFeatures(featuresPrev, featuresCurr);
    matchedPrev = validPrev(indexPairs(:,1));
    matchedCurr = validCurr(indexPairs(:,2));
    % Estimate an affine transform from the matched points
    [tform, inlierIdx] = estimateGeometricTransform2D(matchedPrev, matchedCurr, 'affine');
    % Translation magnitude: x and y translation live in row 3 of tform.T
    motionMag = norm(tform.T(3, 1:2));
    % Store the motion magnitude for this frame pair
    motionVec(end+1) = motionMag; %#ok<AGROW>
    prevGray = currGray;
end
% Smooth and threshold
motionVecSmooth = movmean(motionVec, 5);
stopped = motionVecSmooth < threshold; % choose threshold based on your data
You can read more about "detectORBFeatures" here in the official documentation: https://www.mathworks.com/help/vision/ref/detectorbfeatures.html
You can read more about "matchFeatures" here in the official documentation: https://www.mathworks.com/help/vision/ref/matchfeatures.html
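Once you have the smoothed motion signal, you still need to turn the stopped/moving flag into concrete stop and start frame indices. Here is a minimal sketch of one way to do that (the threshold value is a placeholder you must tune on your own data, e.g. by plotting motionVecSmooth and picking a value that separates the two regimes):

threshold = 0.5; % example value only; tune on your data
stopped = motionVecSmooth < threshold;
% Rising/falling edges of the stopped flag mark the transitions
d = diff([false, stopped, false]);
stopFrames  = find(d == 1);  % frames where the car stops
startFrames = find(d == -1); % frames where it starts moving again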
2. Background stabilization: You can use a video stabilization algorithm to model the ego-motion and then inspect the residual motion.
Here is MATLAB pseudo-code for this method, for your reference:
video = VideoReader('your_video.mp4');
% Block matcher configured to take the reference frame as a second input
% and to return per-block displacements in complex form (dx + 1i*dy)
motionEstimator = vision.BlockMatcher('ReferenceFrameSource', 'Input port', ...
    'BlockSize', [16 16], ...
    'OutputValue', 'Horizontal and vertical components in complex form');
% Initialize previous frame
prevFrame = readFrame(video);
prevGray = rgb2gray(prevFrame);
motionVec = [];
while hasFrame(video)
    currFrame = readFrame(video);
    currGray = rgb2gray(currFrame);
    % Estimate per-block motion between the two frames
    motion = motionEstimator(prevGray, currGray);
    % Average displacement magnitude over all blocks
    meanMotion = mean(abs(motion(:)));
    motionVec(end+1) = meanMotion; %#ok<AGROW>
    prevGray = currGray;
end
% Smooth and threshold
motionVecSmooth = movmean(motionVec, 5);
stopped = motionVecSmooth < threshold; % choose threshold based on your data
3. Deep learning-based optical flow: You can use deep learning models trained to estimate dense motion or ego-motion between frames, for example RAFT or FlowNet2.
You can read more about the RAFT algorithm and its usage in MATLAB in the official documentation: https://www.mathworks.com/help/vision/ref/opticalflowraft.html
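As a rough sketch of how RAFT could slot into the same pipeline (please verify the exact object name, release requirements, and any required model add-on against the documentation page above; the per-frame averaging here is my assumption, not an official recipe):

video = VideoReader('your_video.mp4');
flowModel = opticalFlowRAFT; % may require a support-package download
motionVec = [];
while hasFrame(video)
    frame = readFrame(video);
    flow = estimateFlow(flowModel, frame); % dense optical flow field
    % Use the mean flow magnitude as a per-frame motion score
    motionVec(end+1) = mean(flow.Magnitude(:)); %#ok<AGROW>
end

You can then apply the same movmean smoothing and thresholding as in the other two methods.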
Hope this helps!