Fusing camera to lidar not working correctly outside the lidarCameraCalibrator app
6 views (last 30 days)
I'm trying to fuse a camera and a lidar sensor using MATLAB's Lidar Toolbox. For the calibration I used the lidarCameraCalibrator app with the camera and lidar files you can find in the Google Drive folder below (square size 250 mm; no padding; extended ROI). The calibration works just fine, and the resulting colored point cloud looks as I expected:
However, when I use the exported tform with the fuseCameraToLidar or bboxCameraToLidar function, I get incorrect results:
It seems like the rotation applied to the point cloud is simply wrong. The projectLidarPointsOnImage function works just fine. In the following Google Drive folder you can find my calibration data, the resulting calibration matrices, and a script that demonstrates the incorrect functions:
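For reference, here is a rough sketch of the failing calls (the variable names are placeholders; tform and intrinsics stand for the objects exported from the app):

% img and ptCloud are one camera frame and the matching lidar scan.
ptCloudColored = fuseCameraToLidar(img, ptCloud, intrinsics, tform);
pcshow(ptCloudColored)   % the colors end up rotated / in the wrong place

% The projection onto the image, by contrast, lines up correctly:
imPts = projectLidarPointsOnImage(ptCloud, intrinsics, tform);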
Thank you very much for your help!
2 comments
Moritz Rumpf
on 26 Apr 2023
Did you try the projectLidarPointsOnImage function?
I have similar problems, but it works with this function.
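For example, as a quick sanity check (variable names assumed):

% Project the lidar points into the image with the lidar-to-camera tform;
% if the overlay lines up, the calibration itself is fine.
imPts = projectLidarPointsOnImage(ptCloud, intrinsics, tform);
figure; imshow(img); hold on
plot(imPts(:,1), imPts(:,2), ".", "MarkerSize", 2)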
Xiao Lin
on 13 Dec 2023
Please check the details of the functions. While projectLidarPointsOnImage uses a lidar-to-camera tform, the fusion functions use a camera-to-lidar tform. Also, the fusion functions only accept a rigidtform3d object as the tform input; passing a rigid3d object produces an incorrect fusion result. rigid3d is no longer recommended in recent MATLAB releases.
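A sketch of what this means in practice, assuming tformOld is the rigid3d exported by an older calibration session:

% rigid3d stores its rotation in the postmultiply convention, so transpose
% it when rebuilding the premultiply rigidtform3d the newer functions expect.
lidarToCam = rigidtform3d(tformOld.Rotation', tformOld.Translation);

% projectLidarPointsOnImage takes a lidar-to-camera transform, while
% fuseCameraToLidar and bboxCameraToLidar take camera-to-lidar, so invert it.
camToLidar = invert(lidarToCam);
ptCloudColored = fuseCameraToLidar(img, ptCloud, intrinsics, camToLidar);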
Answers (1)
Maneet Kaur Bagga
on 31 Aug 2023
As per my understanding, the incorrect results can come from the following causes:
1) The transformation object passed to the fuseCameraToLidar function is constructed incorrectly. Reproducing the code provided in the Google Drive folder, the tform is created as a 3-D rigid transformation object, but printing its parameters shows only Rotation and Translation properties, whereas a correctly created object should contain a (d+1)-by-(d+1) transformation matrix, where d = 3 is the dimensionality of the transformation. The first possible error is therefore in creating the rigidtform3d object: the dimensionality is missing and the object is defined incorrectly.
2) Also, after defining the rotation and translation parameters separately and recreating the object with rigidtform3d(R, translation) from the values stored in the cameraCalibration_Session_3.mat file, we pass it as a rigidtform3d object as follows:
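A sketch of that reconstruction (the field names inside the .mat file are assumptions):

% Load the saved calibration session and pull out the stored values.
s = load("cameraCalibration_Session_3.mat");
R = s.tform.Rotation;              % 3-by-3 rotation stored in the file
translation = s.tform.Translation; % 1-by-3 translation stored in the file

% Recreate the transform; rigidtform3d validates its rotation input and,
% with these values, rejects R as an invalid rotation matrix.
tform = rigidtform3d(R, translation);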
After running the program we get the following output: the rotation matrix built from the values in the .mat file is reported as invalid.
Here R is the rotation matrix and translation is the second argument of rigidtform3d, as defined in the provided file.
3) The third possible cause of the error is radial distortion in the provided image.
To remove the radial distortion from the image, you can use the undistortImage function; see the following documentation:
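As a rough sketch (reusing the img and intrinsics placeholders from above):

% Undistort the camera frame with the calibrated intrinsics before fusing.
imgUndistorted = undistortImage(img, intrinsics);
ptCloudColored = fuseCameraToLidar(imgUndistorted, ptCloud, intrinsics, tform); % tform: camera-to-lidar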
Additionally, you may also refer to the documentation for the functions used above:
fuseCameraToLidar function:
Project Lidar Points on an Image:
Creating a rigidtform3d object:
Thank You!
Maneet Bagga
0 comments