Qu Cao
MathWorks
Followers: 2 Following: 0
I'm an Automated Driving and Mapping Engineer at MathWorks and a Mechanical Engineer by education. DISCLAIMER: Any advice or opinions posted here are my own and in no way reflect those of MathWorks.
Statistics
RANK: 525 of 295,495
REPUTATION: 146
CONTRIBUTIONS: 0 Questions, 72 Answers
ANSWER ACCEPTANCE: 0.00%
VOTES RECEIVED: 20
RANK: of 20,240
REPUTATION: N/A
AVERAGE RATING: 0.00
CONTRIBUTIONS: 0 Files
DOWNLOADS: 0
ALL TIME DOWNLOADS: 0
RANK: of 153,991
CONTRIBUTIONS: 0 Problems, 0 Solutions
SCORE: 0
NUMBER OF BADGES: 0
CONTRIBUTIONS: 0 Posts
CONTRIBUTIONS: 0 Public Channels
AVERAGE RATING:
CONTRIBUTIONS: 0 Highlighted Topics
AVERAGE LIKES:
Feeds
In stereocalibration, is the relationship between the 'R and T output as PoseCamera2' and the actual camera position the same, or does the sign of x in T reverse?
Sorry for the confusion. We will update our documentation to be more specific about the meaning of PoseCamera2. PoseCamera2 is t...
6 months ago | 0
| accepted
detectSIFTFeatures only working for uint8
Use im2double: I = imread('cameraman.tif'); points = detectSIFTFeatures(im2double(I))
8 months ago | 1
estworldpose giving different answers on each run.
estworldpose is a RANSAC-based method. You may want to set the random seed before running the function to get consistent results...
about 1 year ago | 0
| accepted
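As a minimal sketch of that suggestion (imagePoints, worldPoints, and intrinsics are placeholders for your own data):
rng(0);                                            % fix the RANSAC seed so results are repeatable
worldPose = estworldpose(imagePoints, worldPoints, intrinsics);
disp(worldPose.Translation)                        % camera location in the world frame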
Replacing vision.GeometricTransformEstimator call
https://www.mathworks.com/matlabcentral/answers/521519-what-function-replaced-vision-geometrictransformestimator
more than 1 year ago | 0
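For reference, a hedged sketch of the replacement workflow described in that link, assuming matchedPoints1 and matchedPoints2 come from your own feature matching step (estgeotform2d needs a recent release; estimateGeometricTransform2D is the older equivalent):
rng(0);                                            % make the RANSAC-based estimate repeatable
tform = estgeotform2d(matchedPoints1, matchedPoints2, "projective");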
I have two camera parameters from stereoParams. Which one should I choose for Stereo Visual SLAM application? Or do I just get their mean values?
Usually, the focal lengths of the two cameras are the same. You can use either one.
more than 1 year ago | 0
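A minimal sketch, assuming stereoParams is the stereoParameters object from the Stereo Camera Calibrator and your release exposes the Intrinsics property:
intrinsics = stereoParams.CameraParameters1.Intrinsics;   % intrinsics of camera 1
% stereoParams.CameraParameters2.Intrinsics is typically nearly identical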
Defining Feature detection area.
You can specify the ROI that you want to extract features from.
more than 1 year ago | 0
| accepted
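For example, a minimal sketch using the 'ROI' option of the feature detectors (the region values below are placeholders):
I = imread('cameraman.tif');
roi = [50 50 100 100];                       % [x y width height] in pixels
points = detectSURFFeatures(I, 'ROI', roi);  % detect features only inside the ROI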
unit of translation result from estrelpose function
As the documentation of estrelpose says, the function calculates the camera location up to an unknown scale. This is because you...
almost 2 years ago | 0
| accepted
Difficulties in obtaining good results with the ORB-SLAM2 algorithm in MATLAB.
Thank you for posting the question. In general, tuning the hyperparameters for a visual SLAM system can be hard and requires a...
almost 2 years ago | 1
| accepted
vSLAM: vSLAM algorithm is very sensitive to hyperparameters Issue?
You have found the nature of the SLAM problem. Yes, the visual SLAM system is sensitive to hyperparameters, which usually need to be tun...
almost 2 years ago | 0
| accepted
How to construct stereoParameters with intrinsic and extrinsic matrix?
poseCamera2 essentially transforms camera 2 to camera 1. If you have “the translation and rotation from camera1 to camera 2”, (le...
about 2 years ago | 1
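A hedged sketch of that inversion, assuming R (3-by-3) and t (1-by-3) describe the camera 1 to camera 2 transformation, cameraParams1 and cameraParams2 are existing cameraParameters objects, and your release accepts a rigidtform3d PoseCamera2 (older releases take the rotation and translation separately):
tform1to2 = rigidtform3d(R, t);          % camera 1 -> camera 2
poseCamera2 = invert(tform1to2);         % pose of camera 2 in camera 1's frame
stereoParams = stereoParameters(cameraParams1, cameraParams2, poseCamera2);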
Object 3D world coordinates from multiple images
You will need a stereo camera to give you the actual dimensions of 3-D objects. Alternatively, if you know the size of an object...
about 2 years ago | 0
creating a bag of features for new image set for monocular SLAM
The bag-of-features data may not work for the KITTI dataset because it was trained using a small amount of image data. You may w...
about 2 years ago | 1
| accepted
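A rough sketch of retraining the bag of visual words on your own images; the folder path and the custom extractor function name are placeholders for your own code:
imds = imageDatastore('pathToYourKITTIImages');
bag = bagOfFeatures(imds, 'CustomExtractor', @helperORBFeatureExtractorFunction);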
The Premultiply Convention in Geometric Transformations does not support C/C++ code generation?
Thank you for reporting this. There is a bug in the documentation. All the geometric transformation objects with the premultiply...
about 2 years ago | 0
| accepted
How to use reconstructScene with a disparity map from file, without calling rectifyStereoImages ?
You can use the reprojectionMatrix output from rectifyStereoImages to do the reconstruction. Otherwise, you need to save the ste...
more than 2 years ago | 0
| accepted
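A minimal sketch of that workflow, assuming I1, I2, and stereoParams already exist; the reprojection matrix is saved so a disparity map loaded later from a file can be reconstructed directly:
[J1, J2, reprojectionMatrix] = rectifyStereoImages(I1, I2, stereoParams);
save('reprojectionMatrix.mat', 'reprojectionMatrix');
% ... later, after loading disparityMap from file:
xyzPoints = reconstructScene(disparityMap, reprojectionMatrix);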
Match the coordinate systems of "triangulate" and "reconstructScene" with "disparitySGM"
The point cloud generated from reconstructScene is in the rectified camera 1 coordinate system. Starting in R2022a, you can use the ad...
more than 2 years ago | 0
| accepted
MATLAB Simulate 3D Camera: why is there no focal length (world units) attribute in the sensor model?
Please take a look at this page: https://www.mathworks.com/help/vision/ug/camera-calibration.html#bu0ni74 If you know the size...
more than 2 years ago | 0
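The linked page relates the focal length in world units to the focal length in pixels through the pixel size; a small sketch with placeholder numbers:
focalLengthMM = 4;                            % focal length in millimeters (placeholder)
sensorWidthMM = 6.4;                          % sensor width in millimeters (placeholder)
imageWidthPx  = 1280;                         % image width in pixels (placeholder)
pixelSizeMM   = sensorWidthMM / imageWidthPx;
focalLengthPx = focalLengthMM / pixelSizeMM   % focal length expressed in pixels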
How to port SLAM algorithm to embedded platform?
Unfortunately, as of R2022a the visual SLAM pipeline doesn't support code generation yet. We're actively working on this support...
more than 2 years ago | 1
| accepted
how to get the relative camera pose to another camera pose?
Note that the geometric transformation convention used in the Computer Vision Toolbox (CVT) is different from the one used in th...
more than 2 years ago | 2
| accepted
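As a hedged sketch in the Computer Vision Toolbox premultiply convention, assuming pose1 and pose2 are rigidtform3d camera-to-world poses:
invPose1 = invert(pose1);
relPose = rigidtform3d(invPose1.A * pose2.A);   % pose of camera 2 in camera 1's frame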
How to get 3D world coordinates from 2D image coordinates?
You should use the rectified stereo images. The disparityMap computed from disparitySGM should have the same size as your stereo...
more than 2 years ago | 0
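A minimal sketch of that lookup, assuming I1, I2, and stereoParams exist and (row, col) is the pixel of interest in the rectified left image:
[J1, J2, reprojectionMatrix] = rectifyStereoImages(I1, I2, stereoParams);
disparityMap = disparitySGM(im2gray(J1), im2gray(J2));
xyzPoints = reconstructScene(disparityMap, reprojectionMatrix);
point3D = squeeze(xyzPoints(row, col, :))   % [X Y Z] in world units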
Creating a depth map from the disparity map function
You can use reconstructScene for your workflow.
almost 3 years ago | 0
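A minimal sketch, assuming disparityMap and reprojectionMatrix come from the usual stereo rectification workflow: the Z channel of the reconstructed scene is the depth map.
xyzPoints = reconstructScene(disparityMap, reprojectionMatrix);
depthMap = xyzPoints(:, :, 3);    % per-pixel depth in world units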
Unable to use functions from the Computer Vision Toolbox in Simulink MATLAB function block
A workaround is to declare the function as an extrinsic function so that it will be essentially executed in MATLAB: https://www...
almost 3 years ago | 0
| accepted
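A rough sketch of a MATLAB Function block body using coder.extrinsic, assuming fixed-size grayscale inputs; the function and variable names are placeholders:
function dmap = fcn(I1, I2)
%#codegen
coder.extrinsic('disparitySGM');      % run the toolbox function in MATLAB
dmap = zeros(size(I1), 'single');     % preassign the output class and size
dmap = disparitySGM(I1, I2);          % executed extrinsically during simulation
end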
how to get texture extraction using LBP features in MATLAB?
You can use the extractLBPFeatures function.
about 3 years ago | 0
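A minimal sketch (the image is a placeholder; convert to grayscale first):
I = im2gray(imread('peppers.png'));
features = extractLBPFeatures(I, 'Upright', false);   % rotation-invariant LBP descriptor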
About error of helperVisualizeMotionAndStructureStereo
In helperVisualizeMotionAndStructureStereo.m, please note the following code in retrievePlottedData which discards xyzPoints out...
about 3 years ago | 2
About SLAM initial Pose data
The initial pose data is provided by the dataset. It's used to convert the 3-D reconstruction into the world coordinate system. ...
about 3 years ago | 0
About "slam" on my camera device
The example shows how to run stereo visual SLAM using recorded data. It doesn't support "online" visual SLAM yet, meaning that y...
about 3 years ago | 0
Is Unreal Engine of the Automated Driving Toolbox available on Ubuntu?
As of R2021a, only Windows is supported. See Unreal Engine Simulation Environment Requirements and Limitations.
about 3 years ago | 1
why we use Unreal engine when there is a 3D visualization available in Automated driving toolbox?
It's not just used for visualization. With Unreal, you can configure prebuilt scenes, place and move vehicles within the scene, ...
more than 3 years ago | 0
| accepted
About running a stereo camera calibrator
In general, you can use any type of stereo camera and calibrate its intrinsic parameters using the Stereo Camera Calibrator. You...
more than 3 years ago | 0
How to obtain optimal path between start and goal pose using pathPlannerRRT() and plan()?
Please set the random seed at the beginning to get consistent results across different runs: https://www.mathworks.com/help/mat...
more than 3 years ago | 0
| accepted
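A minimal sketch of that suggestion, assuming costmap, startPose, and goalPose already exist:
rng(0);                                   % fix the random seed for repeatable planning
planner = pathPlannerRRT(costmap);
refPath = plan(planner, startPose, goalPose);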
Does vehicleCostmap this type of map only support pathPlannerRRT object to plan a path? Can I use another algorithm to plan a path?
You can create an occupancyMap object from a vehicleCostmap object using the following syntax: map = occupancyMap(p,resolution)...
more than 3 years ago | 0
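A hedged sketch of that conversion, assuming costmap is an existing vehicleCostmap object (occupancyMap requires Navigation Toolbox):
p = getCosts(costmap);               % matrix of cost values in [0, 1]
resolution = 1 / costmap.CellSize;   % cells per meter
map = occupancyMap(p, resolution);   % usable with other planners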