Multiple cameras on the same coordinate system
Hello folks,
First time poster here. I have a small project that I was hoping to use MATLAB for, and perhaps someone here can tell me whether the Computer Vision Toolbox is all I need, or whether I should expect to write some custom code. I have 8 video feeds of a sporting event in my backyard. I have the same model of camera installed all around a small stage area in a circular arrangement. I placed a rectangular checkerboard in the center (the day before), and all cameras could see it, so I have some calibration feeds. I want to register all of these cameras in a global coordinate system. That is to say, I need a way to know where every camera is relative to every other camera.
So far I've played around with the detectCheckerboardPoints function, and it seems to work fine. But I have no real idea of where to start on what I actually want to do. From some reading, it seems the Computer Vision Toolbox is good at calibrating a stereo pair, but that is not my task. Can the toolbox solve my problem? I guess I need to know the rotation and translation of every camera in a global coordinate system. Any guidance will be much appreciated.
Thanks,
-Scott
Answers (1)
Dima Lisin
29 Oct 2015
Yes, you can use the Computer Vision System Toolbox for this. First, you have to calibrate each of your cameras individually, using the Camera Calibrator app. Then, since all the cameras can see the same checkerboard, you can use the extrinsics function to compute their respective rotations and translations. Finally, you can use the cameraMatrix function to get the projection matrix relating each camera to the common world coordinate system.
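A minimal per-camera sketch of that workflow, assuming each camera's intrinsics (camParams below) have already been estimated, and that sharedImage is that camera's view of the shared checkerboard. The variable names and the squareSize value are placeholders:

```matlab
squareSize = 25;  % checkerboard square size in world units (e.g. mm)

% Detect the shared board in this camera's image
[imagePoints, boardSize] = detectCheckerboardPoints(sharedImage);
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

% Remove lens distortion from the detected points
imagePoints = undistortPoints(imagePoints, camParams);

% R, t map world (checkerboard) coordinates into this camera's coordinates
[R, t] = extrinsics(imagePoints, worldPoints, camParams);

% Projection matrix from the common world frame into this camera's image
P = cameraMatrix(camParams, R, t);
```

Repeating this for each of the 8 cameras gives every R and t relative to the same checkerboard-defined world frame.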
11 comments
Scott Binks
11 Nov 2015
Dima Lisin
11 Nov 2015
Hi Scott,
The thing to keep in mind here is that the extrinsics are the transformation from the world coordinates into the camera's coordinates. So if R and t are your extrinsics, then the location of the camera in the world coordinate system is -t*R', and its orientation is R'.
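In code, assuming R and t come straight from the extrinsics function, that works out to:

```matlab
% MATLAB's row-vector convention: cameraPoint = worldPoint * R + t.
% Solving worldPoint * R + t = [0 0 0] for the camera center gives:
location    = -t * R';   % 1-by-3 camera center in world coordinates
orientation = R';        % rows are the camera axes in world coordinates

% Sanity check: the camera center maps to the camera's own origin, i.e.
% location * R + t is [0 0 0] (up to floating-point error).
```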
Scott Binks
11 Nov 2015
Edited: Scott Binks
12 Nov 2015
Dima Lisin
12 Nov 2015
Hi Scott,
I don't know what up and down vectors mean. R' is the 3D rotation matrix that defines the orientation of the camera's image plane in the world coordinates.
I am a bit confused about how you are using the Camera Calibrator. If you have two cameras, then you can use the Stereo Camera Calibrator app to do stereo calibration, and get the rotation and translation between the cameras. If you have more than two cameras, then you should use Camera Calibrator to calibrate each camera individually, and then use the extrinsics function to compute their locations and orientations in a common coordinate system.
Scott Binks
12 Nov 2015
Edited: Scott Binks
13 Nov 2015
Dima Lisin
13 Nov 2015
Yes, the cameraParameters object has the rotation matrices and translation vectors. So if you calibrate all your cameras at the same time by taking pictures of the same checkerboard simultaneously, then you can use those.
But you can also calibrate each of your cameras separately, using unrelated sets of calibration images. Then you can set up the cameras, place a single checkerboard so that it is visible to all of them, and take a picture of it with each camera. Then you would use the extrinsics function.
The rotation matrices are with respect to the world coordinates, which are defined by the checkerboard. To be precise, the extrinsics (R, and t) are a transformation from the checkerboard's coordinate system into the camera's coordinate system. See the documentation for more details.
Scott Binks
13 Nov 2015
Dima Lisin
13 Nov 2015
You seem to be using the Stereo Camera Calibrator app, which calibrates stereo pairs of cameras. There is also a Camera Calibrator app, which calibrates a single camera. You can invoke it using the cameraCalibrator command.
What you should do is take about 20 images of the checkerboard with each camera, and calibrate the cameras individually. Then arrange your cameras the way you need, place a checkerboard so that it is visible to all of them, and take a picture with each camera. The checkerboard will then define a common coordinate system for all your cameras.
The rotation matrix does not contain x, y, z vectors; it rotates a 3-D vector: rotatedVector = origVector * R, where rotatedVector and origVector are 1-by-3 vectors of [x,y,z] coordinates. So you don't need to take the rotation matrix apart.
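A quick numeric check of that row-vector convention, using a 90-degree rotation about the Z-axis (no toolbox needed; the angle and vector are just illustrative):

```matlab
theta = pi/2;                     % 90 degrees
Rz = [ cos(theta) sin(theta) 0;   % rotation about the Z-axis, laid out
      -sin(theta) cos(theta) 0;   % for the row-vector convention
            0          0     1];  % used by the toolbox
origVector = [1 0 0];
rotatedVector = origVector * Rz;  % approximately [0 1 0]
```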
Scott Binks
13 Nov 2015
Scott Binks
15 Nov 2015
Dima Lisin
16 Nov 2015
It is relative to the checkerboard that you took a picture of with all your cameras. The checkerboard defines the world coordinate system. More specifically, the checkerboard defines the Z=0 plane. The X-axis goes to the right along the longer side of the checkerboard, the Y-axis goes down along the shorter side, and the Z-axis points into the board. -t*R' is the camera's location in these world coordinates, and R' is the camera's orientation in those coordinates.
To verify this, you can plot checkerboard points that you get from the generateCheckerboardPoints function, and then plot the cameras using the plotCamera function.
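A sketch of that visualization, assuming boardSize and squareSize from calibration, and per-camera extrinsics stored in cell arrays Rs and ts (these names are placeholders; also double-check plotCamera's 'Orientation' convention against the documentation for your release):

```matlab
worldPoints = generateCheckerboardPoints(boardSize, squareSize);
figure; hold on;
% Checkerboard corners lie in the Z = 0 plane of the world frame
plot3(worldPoints(:,1), worldPoints(:,2), ...
      zeros(size(worldPoints,1), 1), 'k.');
for i = 1:numel(Rs)
    plotCamera('Location', -ts{i} * Rs{i}', ...  % camera center in world
               'Orientation', Rs{i}, ...         % world-to-camera rotation
               'Size', 20);
end
axis equal; grid on; xlabel('X'); ylabel('Y'); zlabel('Z');
```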
By the way, it would have been easier to use the cameraCalibrator app than the estimateCameraParameters function for calibration.