How to project a 3D point cloud onto the image frame using projectLidarPointsOnImage?
Hi everyone,
I'm new to the MATLAB forum, and I have a question about the function projectLidarPointsOnImage. I've written a script based on the MathWorks example at https://ch.mathworks.com/help/lidar/ref/projectlidarpointsonimage.html#d123e19970 , but I do not get any results in the imPts matrix. The code runs, but the matrix of projected points is empty! I'm wondering why...
Here is the situation:
I've got 2 lidar scans of a geological scene at t1 and t2, and 2 images of the same scene taken with a camera at the same t1 and t2. The aim is to produce a raster image of the 3D points as they appear projected onto the image frame.
The center points of the lidar and the camera are the same, equal to the (0,0,0) point of the lidar scans' coordinate system. The camera was placed in the same position as the scanner, so I'm assuming that the camera and the lidar had the same view angle (no translation or rotation transformation required). For this reason I supposed that calibrating the lidar with the camera (https://ch.mathworks.com/help/lidar/lidarcameracalibration.html?s_tid=CRUX_lftnav) wasn't a mandatory step...
Am I correct, or do I have to calibrate the sensors before using the projection function?
Do you have any idea what the source of the problem in the code is?
All ideas are welcome :-)
Here is my script
%% Define pointCloud object
data = readmatrix('R1_aligned_cut_translated.txt');
x = data(:,1);
y = data(:,2);
z = data(:,3);
xyzPoints = [x y z];
ptCloud = pointCloud(xyzPoints);
%% Define camera intrinsics
% Define Focal length
%F is the focal length in world units, typically millimeters
F = 24;
% [sx, sy] are the number of pixels per world unit in the x and y direction respectively
% 24x36 mm sensor, sensor resolution 3840x5760. Height: x = 3840 px / 24 mm --> 160 px/mm
% Width: y = 5760 px / 36 mm --> 160 px/mm
sx = 160;
sy = 160;
% fx and fy are in pixels
% Camera focal length, specified as a two-element vector, [fx, fy].
%
fx=F.*sx;
fy=F.*sy;
focalLength = [fx fy];
% Define principalPoint.
% Optical center of camera, specified as a two-element vector, [cx,cy], in pixels.
principalPoint = [1920 2880];
% Define imageSize.
% Image size produced by the camera, specified as a two-element vector, [mrows,ncols].
imageSize = [3840 5760];
intrinsics = cameraIntrinsics(focalLength,principalPoint,imageSize);
%% Define lidar-to-camera rigid transformation
% Creates a default rigid3d object that corresponds to an identity transformation
tform = rigid3d;
%% Project the point cloud onto the image frame
imPts = projectLidarPointsOnImage(ptCloud,intrinsics,tform);
%% Display
figure
% img1 = imread('imgR1.tif');
% imshow(img1)
hold on
plot(imPts(:,1),imPts(:,2),'.','Color','r')
hold off
3 comments
Adam Danz
on 24 Mar 2021
Thanks Céline, I played around with it for a while but ran out of time. I wonder if the intrinsics are off.
Perhaps someone else with more experience with the Lidar Toolbox will respond. If not, you can always contact tech support.
Answers (1)
Paola Donis Noriega
on 28 Oct 2025
Hi Céline,
To use the projectLidarPointsOnImage function, you need the transformation from the lidar frame to the camera frame. This transformation is obtained through lidar-camera calibration. Even if the sensors are mounted close together, the lidar frame and the camera frame are not necessarily aligned, which is why lidar-camera calibration is needed. Before using projectLidarPointsOnImage, you can use the Lidar Camera Calibrator app to estimate the tform input needed to project points from the point cloud onto the image.
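Even if the two sensors really share the same origin, there is usually still a rotation between the two frames: lidar data is often expressed as X forward / Y left / Z up, while the camera frame in MATLAB is X right / Y down / Z forward. With an identity tform, the lidar Z value is treated as the camera depth, so most points end up behind the camera or outside the image bounds and imPts comes back empty. Below is a minimal sketch, assuming those axis conventions (they must be verified for the actual sensors), of building the rotation by hand and checking how many points land in front of the camera. A proper calibration with the Lidar Camera Calibrator app (opened with the lidarCameraCalibrator command) would replace this hand-written guess with an estimated rotation and translation.
% Assumed axis conventions (verify against your own data!):
%   lidar : X forward, Y left, Z up
%   camera: X right, Y down, Z forward
R = [ 0 -1  0;   % camera X = -lidar Y
      0  0 -1;   % camera Y = -lidar Z
      1  0  0];  % camera Z =  lidar X
% rigid3d stores the rotation in post-multiply (row-vector) form, hence the transpose.
tform = rigid3d(R', [0 0 0]);
% Quick sanity check: how many points have positive depth in the camera frame?
camPts = transformPointsForward(tform, ptCloud.Location);
nnz(camPts(:,3) > 0)
% Re-project with the non-identity transform.
imPts = projectLidarPointsOnImage(ptCloud, intrinsics, tform);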
0 comments