I'm trying to estimate the distance of an object with stereo vision.
I calibrated the cameras using MATLAB's toolbox, and then took a photo of a sheet on which I printed an A3-format template.
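For context, the calibration can be sanity-checked roughly like this (just a sketch; stereoParams1309 is the variable exported from my calibration session):

% Quick checks on the exported stereo calibration
showReprojectionErrors(stereoParams1309);   % mean reprojection error per image pair
figure; showExtrinsics(stereoParams1309);   % relative pose of the cameras and pattern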
Then I tried to estimate the depth with this code:
left = imread("406mm/left.png");
right = imread("406mm/right.png");
imshowpair(left, right, "montage")

% Rectify with the calibration parameters from the calibrator
[leftRect, rightRect, reprojectionMatrix] = rectifyStereoImages(left, right, stereoParams1309);
A = stereoAnaglyph(leftRect, rightRect);

% Block matching needs grayscale inputs (im2gray also accepts grayscale images)
leftGray = im2gray(leftRect);
rightGray = im2gray(rightRect);
disparityRange = [0 20*16];   % range width must be divisible by 16
disparityMap = disparityBM(leftGray, rightGray, 'DisparityRange', disparityRange, ...
    'UniquenessThreshold', 10, 'BlockSize', 9, 'ContrastThreshold', 0.5);
imshow(disparityMap, disparityRange)

filtered = medfilt2(disparityMap, [5 5]);   % median filter to clean the disparity map
figure; imshow(filtered, disparityRange); title('Disparity Map filtered')

% Reconstruct 3-D points (same units as the calibration pattern, mm here)
xyzPoints = reconstructScene(filtered, reprojectionMatrix);
ptcloud = pointCloud(xyzPoints);
player = pcplayer([-500 500], [-500 500], [0 2000], 'VerticalAxis', 'y');
view(player, ptcloud)

% Keep only points with 0 < Z < 500 mm
Z = xyzPoints(:, :, 3);
mask = repmat(Z > 0 & Z < 500, [1 1 3]);
xyzPoints(~mask) = NaN;
figure; imshow(xyzPoints(:, :, 3), []); colormap jet

% Average the remaining depths
distance = xyzPoints(:, :, 3);
distance(distance > 420) = [];   % drop anything beyond 420 mm
avgFinite = mean(distance(isfinite(distance)))   % renamed so it doesn't shadow isfinite
average = mean(distance, 'all', 'omitnan')
These are the left and right photos:
And this is what the point cloud looks like:
The object was 406 mm from the left camera, but as you can see in the point cloud there are some correct points at around 400 mm and other wrong points far away from them (at roughly 1000 mm), and I don't understand why. It looks as if the algorithm couldn't tell that it was a flat 2D sheet.
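To make the two clusters easier to see, I think a quick histogram of the depth values would help (just a diagnostic sketch; Z here is xyzPoints(:,:,3) before the mask, and the bin edges are a guess):

% Diagnostic sketch: distribution of the reconstructed depths
figure; histogram(Z(isfinite(Z)), 0:25:2000)
xlabel('Z (mm)'); ylabel('pixel count')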
That's why I tried to remove the points beyond 500 mm with a mask before calculating the average, and this was the result:
Do you know what's wrong? Why are the points split into two regions at two different distances?
And if you don't know how to improve the point cloud, do you maybe know how I can get rid of all the points I don't want, to make the average more accurate (using only the points around 400 mm)?
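What I have in mind is something like the sketch below, keeping only the points whose Z is close to 400 mm (the ROI limits are guesses, and I'm not sure findPointsInROI is the right tool for this):

% Sketch: keep only points near the expected 400 mm plane and average their depth
roi = [-500 500 -500 500 350 450];         % [xmin xmax ymin ymax zmin zmax] in mm
indices = findPointsInROI(ptcloud, roi);
ptcloudNear = select(ptcloud, indices);
avgDepth = mean(ptcloudNear.Location(:, 3), 'omitnan')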
Thank you!