# How do I translate pixel coordinates to real world coordinates using a calibration target?

_NB_ on 18 Nov 2021
Commented: _NB_ on 19 Nov 2021
I have a 2xN array A of pixel coordinates corresponding to the dots on a dotted calibration target (a sheet with dots a known distance apart).
I have a 2xN array B of real-world-unit coordinates containing the positions of those dots on the target.
How can I smoothly map any coordinate that falls within the area covered by A onto B, so that I can find the real-space position of any point on the target, not just the points where the dots are?
A linear transformation is not possible because the image contains distortions. I have tried functions like estimateCameraParameters and estimateCameraMatrix, but these apparently require either multiple images or non-coplanar points, neither of which I have.

Image Analyst on 18 Nov 2021
I think you can use scatteredInterpolant. Given a list of (xp, yp) pixel coordinates and a corresponding list of real-world (xr, yr) coordinates, you can essentially build a surface with scatteredInterpolant. Once you have that, you can simply input pixel coordinates and out pop the real-world coordinates. Attached is a demo of scatteredInterpolant. I'm sure you can easily modify it; the key is that you need to know the real-world coordinates for a certain set of image/pixel coordinates, which you said you do.
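A minimal sketch of that idea, assuming A and B are the 2xN arrays from the question (variable names are illustrative). Note that one interpolant is needed per output axis, since scatteredInterpolant returns a scalar surface:

```matlab
% A: 2xN pixel coordinates, B: 2xN real-world coordinates (same dot order).
% 'natural' (natural-neighbor) gives a smooth interpolant between the dots.
Fx = scatteredInterpolant(A(1,:)', A(2,:)', B(1,:)', 'natural');
Fy = scatteredInterpolant(A(1,:)', A(2,:)', B(2,:)', 'natural');

% Map an arbitrary pixel location (xp, yp) into world units:
xw = Fx(xp, yp);
yw = Fy(xp, yp);
```

Any query inside the convex hull of the dot locations is interpolated; queries outside it are extrapolated, which is less reliable.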
_NB_ on 19 Nov 2021
I cannot do a linear interpolation, because other images may contain more distortion, which could be along the X-axis, the Y-axis, or random in nature.
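To illustrate why this approach is not limited to linear distortion: scatteredInterpolant is a piecewise fit through the calibration dots, not a single global linear map, so it follows whatever warping the dots themselves exhibit. A synthetic sketch with a made-up radial distortion (all values here are illustrative):

```matlab
% World-unit dot grid and a synthetic radially distorted pixel image of it.
[xw, yw] = meshgrid(0:10, 0:10);
r2 = (xw - 5).^2 + (yw - 5).^2;
xp = 50*xw .* (1 + 1e-3*r2);   % pixel coordinates, non-linearly warped
yp = 50*yw .* (1 + 1e-3*r2);

% Per-axis interpolants built from the dot correspondences alone:
Fx = scatteredInterpolant(xp(:), yp(:), xw(:), 'natural');
Fy = scatteredInterpolant(xp(:), yp(:), yw(:), 'natural');
% Queries between the dots now follow the curved mapping, with accuracy
% set by how densely the dots sample the distortion.
```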
