Multi-Camera Calibration for Motion Tracking
Alex
on 8 Jul 2014
Answered: Dima Lisin on 15 Jul 2014
I am trying to calibrate two cameras using MATLAB R2014a and the example found here: http://www.mathworks.com/help/vision/ref/extrinsics.html. This is for use in a low-cost motion tracking system. From the calibration, I ultimately want to obtain each camera's absolute position in the world coordinate system.
To get the position, I did the following: convert transposeMatrix from a 1x3 to a 3x1, then
position = -transpose(rotationMatrix) * transposeMatrix
Is this correct?
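For context, here is roughly the pipeline I am using to obtain rotationMatrix and transposeMatrix, adapted from the linked example (the variable names and the square size are mine):
% Detect checkerboard corners in the calibration images for one camera
[imagePoints, boardSize] = detectCheckerboardPoints(calibrationFileNames);
squareSize = 25; % square size of my board, in millimeters
worldPoints = generateCheckerboardPoints(boardSize, squareSize);
% Estimate the camera intrinsics
cameraParams = estimateCameraParameters(imagePoints, worldPoints);
% Compute extrinsics from one image of the board (imOrig)
imUndistorted = undistortImage(imOrig, cameraParams);
imagePointsOrig = detectCheckerboardPoints(imUndistorted);
[rotationMatrix, transposeMatrix] = extrinsics(imagePointsOrig, worldPoints, cameraParams);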
Should the calculated positions of the cameras be relative to the coordinate system created in the image I use for "imOrig"?
Thank you for your help.
Accepted Answer
Dima Lisin
on 15 Jul 2014
I think you meant "translationVector" instead of "transposeMatrix". Is that right?
The rotationMatrix and translationVector give you the transformation from the checkerboard's coordinate system into the camera's coordinate system. So to find the camera's location in the checkerboard's coordinate system, you have to apply the inverse transformation:
position = -translationVector * rotationMatrix';
This will give you the position as a 1-by-3 vector. Also, please keep in mind that all the matrices use the post-multiply convention, i.e. a row vector times a matrix.
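For example, applied to both of your cameras (a rough sketch; imagePoints1/imagePoints2, cameraParams1/cameraParams2, and worldPoints are assumed to come from calibrating each camera against the same checkerboard, as in the linked example):
% Camera 1: extrinsics relative to the checkerboard in its imOrig
[R1, t1] = extrinsics(imagePoints1, worldPoints, cameraParams1);
camPosition1 = -t1 * R1'; % 1-by-3 location in the checkerboard's coordinate system
% Camera 2: extrinsics relative to the same checkerboard
[R2, t2] = extrinsics(imagePoints2, worldPoints, cameraParams2);
camPosition2 = -t2 * R2';
Both positions are expressed in the checkerboard's coordinate system, i.e. relative to the board in the image you used for imOrig.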