Image undistortion with fixed camera position and single calibration image

Dear all,
I want to undistort an image. The camera is fixed on a tripod. The camera captures the reflected light of particles within a thin lightsheet. Usually, the camera is positioned perpendicular to this light sheet, and a high quality non-distortion lens is used. In this case, there is no need for undistortion. But it may also happen, that a lower quality lens is used or that the camera cannot be positioned perrpendicular to the light sheet. In this case, I would take a picture of a calibration target, positioned at the location of the light sheet. I would like to track the pattern of the calibration target, and use the information to undistort images of the light sheet.
What approach should I use in Matlab? The camera calibrator expects at least two calibration images, taken from different perspectives. But my camera is fixed, and the position of the light sheet and the calibration target is fixed too. The image of the calibration target might e.g. look like this (I am free to use any pattern I want):
Thanks for your input!
  2 Comments
Matt J
Matt J on 26 Feb 2023
Edited: Matt J on 26 Feb 2023
From your posted image, it is hard to understand why you say the light sheet is unmovable. It looks like you could change its position by hand easily.
William Thielicke
William Thielicke on 27 Feb 2023
Edited: Matt J on 27 Feb 2023
Yes, I could change it, but I don't want to. Maybe this picture explains the experimental setup a bit better. I want to be able to compensate for the distortion when the camera is not perpendicular to the light sheet. What is the best approach? I think this is called orthorectification, and I think it is different from lens distortion. Probably both corrections have to be applied one after the other. How would you do this in MATLAB?


Accepted Answer

Matt J
Matt J on 27 Feb 2023
Edited: Matt J on 27 Feb 2023
My recommendation is that you first calibrate your camera for intrinsic and lens distortion parameters. These parameters do not depend on camera position, so this step can be done independently of the apparatus you have posted. In other words, you should be able to disconnect the camera from your apparatus, take it into a different room, calibrate, then put it back, and the lens distortion and intrinsic parameters will still be valid. Moreover, the calibration fixture that you use for this step need not be part of the apparatus in your post. If you can obtain a checkerboard like the examples in the documentation, you can use the Camera Calibrator app just like in those examples.
Once the camera has been calibrated for intrinsics and lens distortion, you can reconnect it to your apparatus and image a fixture of points as you were originally planning. You can then use undistortPoints to obtain their coordinates with distortion corrected. Then you can use fitgeotrans(___,'projective') to find the transform that rectifies your image.
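In code, that second step might look roughly like this (a minimal sketch; cameraParams is assumed to come from the Camera Calibrator app, and I, fixturePoints and worldPoints are placeholder names for the raw image, the detected fixture points and their known planar world coordinates):
undistortedPts = undistortPoints(fixturePoints, cameraParams.Intrinsics); % remove lens distortion from the point coordinates
tform = fitgeotrans(undistortedPts, worldPoints, 'projective'); % projective transform from undistorted image points to world coordinates
I_undistorted = undistortImage(I, cameraParams.Intrinsics); % remove lens distortion from the image itself
I_rectified = imwarp(I_undistorted, tform); % rectify the undistorted image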
  3 Comments
William Thielicke
William Thielicke on 22 Mar 2024
Dear @Matt J, I finally found some time to pick this up again (and hopefully finish implementing something soon). I applied your hints with code copy-pasted from the MATLAB help, like this:
clc
clearvars
images = imageDatastore(fullfile(toolboxdir("vision"),"visiondata","calibration","gopro")); %get some worst case data
imageFileNames = images.Files;
% Detect calibration pattern in images
[imagePoints,boardSize] = detectCheckerboardPoints(imageFileNames,'HighDistortion',true);
% Read one of the images to obtain the image size
originalImage = imread(imageFileNames{6}); %this image will be undistorted and rectified
[mrows, ncols, ~] = size(originalImage);
% Generate world coordinates for the planar pattern keypoints
squareSize = 250; %why is this needed....? This somehow controls output image size...
worldPoints = generateCheckerboardPoints(boardSize,squareSize);
% Calibrate the camera
[cameraParams, imagesUsed, estimationErrors] = estimateCameraParameters(imagePoints, worldPoints, ...
    'EstimateSkew', false, 'EstimateTangentialDistortion', false, ...
    'NumRadialDistortionCoefficients', 2, 'WorldUnits', 'millimeters', ...
    'InitialIntrinsicMatrix', [], 'InitialRadialDistortion', [], ...
    'ImageSize', [mrows, ncols]);
points = detectCheckerboardPoints(originalImage,'PartialDetections',false); % must be false, otherwise NaNs appear and the next step doesn't work
undistortedPoints = undistortPoints(points,cameraParams.Intrinsics);
%% Here, the distorted image is undistorted by interpolation; the result is saved in the variable "undistortedImage"
[undistortedImage, newIntrinsics] = undistortImage(originalImage,cameraParams.Intrinsics,'cubic');
tform = fitgeotform2d(undistortedPoints,worldPoints,'Projective');
%% Here, an already-interpolated image is interpolated a second time
undistorted_rectified = imwarp(undistortedImage,tform);
imshow(undistorted_rectified)
This is just a first attempt to get an image that is undistorted and properly aligned. It works, but I wonder if there are ways to achieve this with less interpolation. Currently the pixel image is interpolated twice, which will introduce a lot of artifacts in the data I am planning to work with (images with particles that are 3-10 pixels in diameter). Is there a way to combine the two steps (undistortImage and imwarp) into a single interpolation step? How would I determine the output resolution of my undistorted and rectified image? Thanks!
Matt J
Matt J on 23 Mar 2024
Edited: Matt J on 23 Mar 2024
The only thing I can think of is that you forego imwarp and implement your own warping using griddedInterpolant or interp2. In that case, you would have control over how the deformed image pixel locations get computed. You can compute them as the succession of two point transforms: the undistortion and the rectification. Once the final deformed pixel locations (incorporating both transforms) have been computed, you can do a single interpolation to get the final image.
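Building on the code above, the composed mapping might look roughly like this (a minimal sketch, assuming zero skew and purely radial distortion with the two coefficients estimated above, and reusing cameraParams, worldPoints, tform and originalImage; stepWorld is an arbitrary choice that sets the output resolution):
intr = cameraParams.Intrinsics;
k = intr.RadialDistortion; % [k1 k2]
% Output grid in world units, covering the detected checkerboard
stepWorld = 1; % world units per output pixel (assumption, controls output size)
xw = min(worldPoints(:,1)):stepWorld:max(worldPoints(:,1));
yw = min(worldPoints(:,2)):stepWorld:max(worldPoints(:,2));
[Xw,Yw] = meshgrid(xw,yw);
% World (rectified) coordinates -> undistorted image coordinates
uv = transformPointsInverse(tform,[Xw(:) Yw(:)]);
% Undistorted pixel coordinates -> normalized coordinates, apply the radial
% distortion model, then back to pixel coordinates in the ORIGINAL image
xy = (uv - intr.PrincipalPoint)./intr.FocalLength;
r2 = sum(xy.^2,2);
xyd = xy.*(1 + k(1)*r2 + k(2)*r2.^2);
pd = xyd.*intr.FocalLength + intr.PrincipalPoint;
% Single interpolation of the original (distorted) image
F = griddedInterpolant(double(im2gray(originalImage)),'cubic','none');
outImg = reshape(F(pd(:,2),pd(:,1)),size(Xw));
outImg(isnan(outImg)) = 0; % pixels that map outside the original image
imshow(outImg,[])
The single interpolation samples the original image directly, so the particle images are only smoothed once; whether the simplified distortion model above matches your calibration closely enough is something you would have to verify.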


More Answers (1)

Matt J
Matt J on 27 Feb 2023
Edited: Matt J on 27 Feb 2023
You could also browse the File Exchange for submissions that do single image calibration. Here is one example, though I have not used it myself.
An important thing to keep in mind, however, is that if you are going to pursue calibration from a single image, it is necessary that the landmark points in your calibration fixture not be coplanar in 3D. A single plane of points is not enough to determine the camera parameters. This is an old theoretical result.
