
How can I add a Kalman filter to the Lidar Odometry and Mapping (LOAM) SLAM example from the MathWorks documentation?

8 views (last 30 days)
This is the code that performs LOAM for a self-driving car moving around the parking lot. How can I add a Kalman filter to the code given below?
% Load the reference path data; the lot contains 60 parked vehicles
data = load("parkingLotReferenceData.mat");
% Set reference trajectory of the ego vehicle
refPosesX = data.refPosesX;
refPosesY = data.refPosesY;
refPosesT = data.refPosesT;
% Set poses of the parked vehicles
parkedPoses = data.parkedPoses;
% Display the reference trajectory and the parked vehicle locations
sceneName = "LargeParkingLot";
hScene = figure(Name="Large Parking Lot",NumberTitle="off");
helperShowSceneImage(sceneName);
hold on
plot(refPosesX(:,2),refPosesY(:,2),LineWidth=2,DisplayName="Reference Path");
scatter(parkedPoses(:,1),parkedPoses(:,2),[],"filled",DisplayName="Parked Vehicles");
xlim([-60 40])
ylim([10 60])
hScene.Position = [100 100 1000 500]; % Resize figure
title("Large Parking Lot")
legend
modelName = 'GenerateLidarDataOfParkingLot';
open_system(modelName)
snapnow
helperAddParkedVehicles(modelName,parkedPoses)
close(hScene)
if ~ispc
error("Unreal Engine Simulation is supported only on Microsoft" + char(174) + " Windows" + char(174) + ".");
end
% Run simulation
simOut = sim(modelName);
close_system(modelName,0)
ptCloudArr = helperGetPointClouds(simOut);
groundTruthPosesLidar = helperGetLidarGroundTruth(simOut);
ptCloud = ptCloudArr(1);
nextPtCloud = ptCloudArr(2);
gridStep = 0.5;
tform = pcregisterloam(ptCloud,nextPtCloud,gridStep);
disp(tform)
egoRadius = 3.5;
cylinderRadius = 50;
selectedIdx = findPointsInCylinder(ptCloud,[egoRadius cylinderRadius]);
ptCloud = select(ptCloud,selectedIdx,OutputSize="full");
selectedIdx = findPointsInCylinder(nextPtCloud,[egoRadius cylinderRadius]);
nextPtCloud = select(nextPtCloud,selectedIdx,OutputSize="full");
figure
hold on
title("Cylindrical Neighborhood")
pcshow(ptCloud)
view(2)
maxPlanarSurfacePoints = 8;
points = detectLOAMFeatures(ptCloud,MaxPlanarSurfacePoints=maxPlanarSurfacePoints);
nextPoints = detectLOAMFeatures(nextPtCloud,MaxPlanarSurfacePoints=maxPlanarSurfacePoints);
figure
hold on
title("LOAM Points")
show(points,MarkerSize=12)
[~,rmseValue] = pcregisterloam(points,nextPoints);
disp(['RMSE: ' num2str(rmseValue)])
points = downsampleLessPlanar(points,gridStep);
figure
hold on
title('LOAM Points After Downsampling the Less Planar Surface Points')
show(points,'MarkerSize',12)
absPose = groundTruthPosesLidar(1);
relPose = rigidtform3d;
vSetOdometry = pcviewset;
viewId = 1;
vSetOdometry = addView(vSetOdometry,viewId,absPose);
% Display the parking lot scene with the reference trajectory
hScene = figure(Name="Large Parking Lot",NumberTitle="off");
helperShowSceneImage(sceneName);
hold on
plot(refPosesX(:,2),refPosesY(:,2),LineWidth=2)
xlim([-60 40])
ylim([10 60])
hScene.Position = [100 100 1000 500];
numSkip = 5;
for k = (numSkip+1):numSkip:numel(ptCloudArr)
prevPoints = points;
viewId = viewId + 1;
ptCloud = ptCloudArr(k);
% Select a cylindrical neighborhood of interest.
selectedIdx = findPointsInCylinder(ptCloud,[egoRadius cylinderRadius]);
ptCloud = select(ptCloud,selectedIdx,OutputSize="full");
% Detect LOAM points and downsample the less planar surface points
points = detectLOAMFeatures(ptCloud,MaxPlanarSurfacePoints=maxPlanarSurfacePoints);
points = downsampleLessPlanar(points,gridStep);
% Register the points using the previous relative pose as an initial
% transformation
relPose = pcregisterloam(points,prevPoints,InitialTransform=relPose);
% Update the absolute pose and store it in the view set
absPose = rigidtform3d(absPose.A * relPose.A);
vSetOdometry = addView(vSetOdometry,viewId,absPose);
% Visualize the absolute pose in the parking lot scene
plot(absPose.Translation(1),absPose.Translation(2),Color="r",Marker=".",MarkerSize=8);
xlim([-60 40])
ylim([10 60])
title("Build a Map Using Lidar Odometry")
legend(["Ground Truth","Estimated Position"])
pause(0.001)
view(2)
end
absPose = groundTruthPosesLidar(1);
relPose = rigidtform3d;
vSetMapping = pcviewset;
ptCloud = ptCloudArr(1);
selectedIdx = findPointsInCylinder(ptCloud,[egoRadius cylinderRadius]);
ptCloud = select(ptCloud,selectedIdx,OutputSize="full");
points = detectLOAMFeatures(ptCloud,'MaxPlanarSurfacePoints',maxPlanarSurfacePoints);
points = downsampleLessPlanar(points,gridStep);
viewId = 1;
vSetMapping = addView(vSetMapping,viewId,absPose);
voxelSize = 0.1;
loamMap = pcmaploam(voxelSize);
addPoints(loamMap,points,absPose);
% Display the parking lot scene with the reference trajectory
hScene = figure(Name="Large Parking Lot",NumberTitle="off");
helperShowSceneImage(sceneName);
hold on
plot(refPosesX(:,2),refPosesY(:,2),LineWidth=2)
xlim([-60 40])
ylim([10 60])
hScene.Position = [100 100 1000 500];
for k = (numSkip+1):numSkip:numel(ptCloudArr)
prevPtCloud = ptCloud;
prevPoints = points;
viewId = viewId + 1;
ptCloud = ptCloudArr(k);
% Select a cylindrical neighborhood of interest.
selectedIdx = findPointsInCylinder(ptCloud,[egoRadius cylinderRadius]);
ptCloud = select(ptCloud,selectedIdx,OutputSize="full");
% Detect LOAM points and downsample the less planar surface points
points = detectLOAMFeatures(ptCloud,MaxPlanarSurfacePoints=maxPlanarSurfacePoints);
points = downsampleLessPlanar(points,gridStep);
% Register the points using the previous relative pose as an initial
% transformation
relPose = pcregisterloam(points,prevPoints,MatchingMethod="one-to-one",InitialTransform=relPose);
% Find the refined absolute pose that aligns the points to the map
absPose = findPose(loamMap,points,relPose);
% Store the refined absolute pose in the view set
vSetMapping = addView(vSetMapping,viewId,absPose);
% Add the new points to the map
addPoints(loamMap,points,absPose);
% Visualize the absolute pose in the parking lot scene
plot(absPose.Translation(1),absPose.Translation(2),Color="r",Marker=".",MarkerSize=8)
xlim([-60 40])
ylim([10 60])
title("Build a Map Using Lidar Mapping")
legend(["Ground Truth","Estimated Position"])
pause(0.001)
view(2)
end
% Get the ground truth trajectory
groundTruthTrajectory = vertcat(groundTruthPosesLidar.Translation);
% Get the estimated trajectories
odometryPositions = vertcat(vSetOdometry.Views.AbsolutePose.Translation);
mappingPositions = vertcat(vSetMapping.Views.AbsolutePose.Translation);
% Plot the trajectories
figure
plot3(groundTruthTrajectory(:,1),groundTruthTrajectory(:,2),groundTruthTrajectory(:,3),LineWidth=2,DisplayName="Ground Truth")
hold on
plot3(odometryPositions(:,1),odometryPositions(:,2),odometryPositions(:,3),LineWidth=2,DisplayName="Odometry")
plot3(mappingPositions(:,1),mappingPositions(:,2),mappingPositions(:,3),LineWidth=2,DisplayName="Odometry and Mapping")
view(2)
legend
title("Compare Trajectories")
% Select the poses to compare
selectedGroundTruth = groundTruthTrajectory(1:numSkip:end,:);
% Compute the RMSE
rmseOdometry = rmse(selectedGroundTruth,odometryPositions,"all");
rmseWithMapping = rmse(selectedGroundTruth,mappingPositions,"all");
disp(['RMSE of the trajectory estimated with Odometry: ' num2str(rmseOdometry)])
disp(['RMSE of the trajectory estimated with Odometry and Mapping: ' num2str(rmseWithMapping)])
function ptCloudArr = helperGetPointClouds(simOut)
% Extract signal
ptCloudData = simOut.ptCloudData.signals.values;
% Create a pointCloud array
ptCloudArr = pointCloud(ptCloudData(:,:,:,2)); % Ignore first frame
for n = 3:size(ptCloudData,4) % Continue from the frame after the one already added
ptCloudArr(end+1) = pointCloud(ptCloudData(:,:,:,n)); %#ok<AGROW>
end
end
function gTruth = helperGetLidarGroundTruth(simOut)
numFrames = size(simOut.lidarLocation.time,1);
gTruth = repmat(rigidtform3d,numFrames-1,1);
for i = 2:numFrames % Skip the first frame to match the point cloud array
gTruth(i-1).Translation = squeeze(simOut.lidarLocation.signals.values(:,:,i));
% Ignore the roll and the pitch rotations since the ground is flat
yaw = double(simOut.lidarOrientation.signals.values(:,3,i));
gTruth(i-1).R = [cos(yaw) -sin(yaw) 0;
sin(yaw) cos(yaw) 0;
0 0 1];
end
end

Answers (1)

UDAYA PEDDIRAJU on 25 Jan 2024
Hi SAYANDIP,
To integrate an Extended Kalman Filter (EKF) with your Lidar Odometry and Mapping (LOAM) SLAM system, follow these steps:
  1. Define the State Vector: Include the position and orientation of the vehicle in the state vector.
  2. Define the Measurement Vector: Use the Lidar data as your measurement vector.
  3. Define the State Transition Model: This model should predict how the vehicle's state evolves over time.
  4. Define the Measurement Model: Establish how the Lidar measurements relate to the vehicle's state.
  5. Initialize the State Vector and Covariance Matrix: Start with initial estimates for the state and its uncertainty.
  6. Predict the State and Covariance: Use the state transition model to forecast the next state and update the covariance matrix with the process noise.
  7. Update the State and Covariance: Adjust the state estimate with the new Lidar measurements and refine the covariance.
  8. Iterate: Repeat the prediction and update steps for each new Lidar measurement.
Here's a MATLAB code template for the filter. The example below uses a linear constant-velocity model; for a true EKF, replace F and H with the Jacobians of your nonlinear state-transition and measurement models:
% Simulation parameters
dt = 0.1; % Time step (seconds)
num_time_steps = 50; % Number of simulation steps
% Initial state [x_position; y_position; x_velocity; y_velocity]
x = [0; 0; 1; 1]; % The robot starts at the origin moving at 1 m/s
% Initial covariance matrix
P = eye(4) * 0.1; % Small initial uncertainty
% Process noise covariance matrix (tuned for this example)
Q = diag([0.01, 0.01, 0.01, 0.01]);
% Measurement noise covariance matrix (tuned for this example)
R = diag([0.05, 0.05]); % Only position is measured
% State transition model (constant velocity model)
F = [1 0 dt 0; 0 1 0 dt; 0 0 1 0; 0 0 0 1];
% Measurement model (we directly measure position)
H = [1 0 0 0; 0 1 0 0];
% Simulate some measurements (noisy observations of the position)
true_position = [0; 0]; % True initial position
measured_positions = zeros(2, num_time_steps);
for i = 1:num_time_steps
% Assume the robot moves linearly with constant velocity
true_position = true_position + [dt; dt];
% Simulate the measurement with added noise
measured_positions(:, i) = true_position + randn(2, 1) * sqrt(R(1, 1));
end
% Run the EKF
for i = 1:num_time_steps
% Predict the state and covariance
x = F * x; % State prediction
P = F * P * F' + Q; % Covariance prediction
% Update the state and covariance with new Lidar measurements
z = measured_positions(:, i); % Current Lidar measurements
K = P * H' / (H * P * H' + R); % Kalman gain
x = x + K * (z - H * x); % Updated state estimate
P = (eye(4) - K * H) * P; % Updated covariance estimate
% Display the state estimate
disp(['Time step ' num2str(i) ':']);
disp(['Estimated State: ' num2str(x')]);
end
Replace the placeholders with your actual system models and noise characteristics. You can refer to the following File Exchange submission for more details on an EKF implementation: https://www.mathworks.com/matlabcentral/fileexchange/24855-extended-kalman-filter-ekf.
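For a more concrete starting point, here is a minimal sketch (not part of the official example) that applies a linear constant-velocity Kalman filter to the positions produced by the LOAM odometry loop in your script. It reuses ptCloudArr, groundTruthPosesLidar, egoRadius, cylinderRadius, gridStep, maxPlanarSurfacePoints, and numSkip from your code; dt, Q, and R are assumed values that you must tune for your sensor and sample rate.
dt = numSkip * 0.1; % Assumed time between processed scans (tune to your sample rate)
F = [1 0 dt 0; 0 1 0 dt; 0 0 1 0; 0 0 0 1]; % Constant-velocity motion model
H = [1 0 0 0; 0 1 0 0]; % The LOAM pose provides a position measurement
Q = diag([0.01 0.01 0.05 0.05]); % Process noise (illustrative, tune)
R = diag([0.25 0.25]); % Measurement noise of the LOAM poses (illustrative, tune)
x = [groundTruthPosesLidar(1).Translation(1:2)'; 0; 0]; % State [x; y; vx; vy]
P = eye(4) * 0.1;
absPose = groundTruthPosesLidar(1);
relPose = rigidtform3d;
ptCloud = ptCloudArr(1);
selectedIdx = findPointsInCylinder(ptCloud,[egoRadius cylinderRadius]);
ptCloud = select(ptCloud,selectedIdx,OutputSize="full");
points = detectLOAMFeatures(ptCloud,MaxPlanarSurfacePoints=maxPlanarSurfacePoints);
points = downsampleLessPlanar(points,gridStep);
filteredPositions = x(1:2)';
for k = (numSkip+1):numSkip:numel(ptCloudArr)
prevPoints = points;
ptCloud = ptCloudArr(k);
selectedIdx = findPointsInCylinder(ptCloud,[egoRadius cylinderRadius]);
ptCloud = select(ptCloud,selectedIdx,OutputSize="full");
points = detectLOAMFeatures(ptCloud,MaxPlanarSurfacePoints=maxPlanarSurfacePoints);
points = downsampleLessPlanar(points,gridStep);
relPose = pcregisterloam(points,prevPoints,InitialTransform=relPose);
absPose = rigidtform3d(absPose.A * relPose.A);
% Kalman filter: predict with the motion model, correct with the LOAM position
x = F * x;
P = F * P * F' + Q;
z = absPose.Translation(1:2)'; % LOAM-estimated position as the measurement
K = P * H' / (H * P * H' + R);
x = x + K * (z - H * x);
P = (eye(4) - K * H) * P;
filteredPositions(end+1,:) = x(1:2)'; %#ok<AGROW>
end
% Compare the raw LOAM positions with the filtered trajectory
figure
plot(filteredPositions(:,1),filteredPositions(:,2),LineWidth=2,DisplayName="Filtered (LOAM + KF)")
legend
In a tighter integration you would also feed the filter prediction back as the InitialTransform for pcregisterloam, but the sketch above shows where the predict and update steps fit into the loop.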
I hope this helps with your implementation.
2 Comments
saccharin189 on 9 Jun 2024
data = load("parkingLotReferenceData.mat");
What is the loaded file "parkingLotReferenceData.mat", and in which directory can I find it?
