Fuse Multiple Lidar Sensors Using Map Layers

Load Driving Scenario

First, create a drivingScenario object and populate the scene with several buildings using an example helper function. The function also visualizes the scene.

scene = drivingScenario;

groundTruthVehicle = vehicle(scene,'PlotColor',[0 0.4470 0.7410]);

% Add a road and buildings to scene and visualize.

exampleHelperPopulateScene(scene,groundTruthVehicle);


Generate a trajectory that follows the main road in the scene using a waypointTrajectory object.

sampleRate = 100;

speed = 10;

t = [0 20 25 44 46 50 54 56 59 63 90].';

wayPoints = [  0   0 0;
             200   0 0;
             200  50 0;
             200 230 0;
             215 245 0;
             260 245 0;
             290 240 0;
             310 258 0;
             290 275 0;
             260 260 0;
             -15 260 0];

velocities = [ speed 0 0;
               speed 0 0;
               0 speed 0;
               0 speed 0;
               speed 0 0;
               speed 0 0;
               speed 0 0;
               0 speed 0;
              -speed 0 0;
              -speed 0 0;
              -speed 0 0];

traj = waypointTrajectory(wayPoints,'TimeOfArrival',t,...
    'Velocities',velocities,'SampleRate',sampleRate);

Create Simulated Lidar Sensors

To gather lidar readings from the driving scenario, create three lidarPointCloudGenerator objects using an example helper function. The vehicle is configured with two front-facing, narrow field-of-view (FOV) lidars and a single wide-FOV, rear-facing lidar. The overlapping region of the two front-facing sensors helps to quickly register and confirm free space ahead of the vehicle, while the rear-facing sensor range helps map the region already traversed. A rough sketch of how one such sensor might be configured follows the helper output below.

lidarSensors = exampleHelperCreateVehicleSensors(scene, groundTruthVehicle);

disp(lidarSensors)

Columns 1 through 2

{1x1 lidarPointCloudGenerator} {1x1 lidarPointCloudGenerator}

Column 3

{1x1 lidarPointCloudGenerator}
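The helper encapsulates the sensor setup. As a rough idea of what that configuration might involve, the sketch below creates a single narrow-FOV, front-left lidar. The mounting position, yaw, range, and FOV values are illustrative assumptions, not the settings used by exampleHelperCreateVehicleSensors.

% Hypothetical sketch: configure one narrow-FOV, front-left lidar sensor.
% All numeric values are assumptions for illustration only.
frontLeftLidar = lidarPointCloudGenerator(...
    'SensorIndex',1,...
    'SensorLocation',[3.7 0.9],...  % [x y] mounting location on the vehicle, in meters (assumed)
    'Yaw',15,...                    % angled slightly outward (assumed)
    'MaxRange',50,...               % long-range front sensor (assumed)
    'AzimuthLimits',[-60 60],...    % narrow horizontal FOV (assumed)
    'ActorProfiles',actorProfiles(scene));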

Initialize Egocentric Map

Create a multiLayerMap object composed of three occupancyMap objects and one generic mapLayer object. Each local occupancyMap is updated by the corresponding lidar sensor. To combine data from all maps into the mapLayer object, set the GetTransformFcn name-value argument to the exampleHelperFuseOnGet function stored as the handle fGet. The exampleHelperFuseOnGet function fuses the data values of all three maps by calling the getMapData function on each map and combining the values with a weighted log-odds summation. A sketch of what such a callback can look like appears after the following code.

% Define map and parameters.

res = 2;

width = 100*2;

height = 100*2;

% Define equal weights for all sensor readings.

weights = [1 1 1];

% Create mapLayers for each sensor.

fLeftLayer = occupancyMap(width,height,res,'LayerName','FrontLeft');

fRightLayer = occupancyMap(width,height,res,'LayerName','FrontRight');

rearLayer = occupancyMap(width,height,res,'LayerName','Rear');

% Create a get callback used to fuse data in the three layers.

fGet = @(obj,values,varargin)...
    exampleHelperFuseOnGet(fLeftLayer,fRightLayer,rearLayer,...
    weights,obj,values,varargin{:});

% Create a generic mapLayer object whose getMapData function fuses data from
% all three layers.
fusedLayer = mapLayer(width,height,'Resolution',res,'LayerName','FuseLayer',...
    'GetTransformFcn',fGet,'DefaultValue',0.5);

% Combine layers into a multiLayerMap.

egoMap = multiLayerMap({fLeftLayer, fRightLayer, rearLayer, fusedLayer});

% Set map grid origin so that the robot is located at the center.

egoMap.GridOriginInLocal = -[diff(egoMap.XLocalLimits) diff(egoMap.YLocalLimits)]/2;
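The fusion callback itself lives in the example helper. The following is a minimal sketch of what such a GetTransformFcn callback can look like, assuming it receives the three sensor layers, the weights, the fused layer object, the requested values, and any location arguments, and that it blends the layers with a weighted log-odds sum. The shipped exampleHelperFuseOnGet may handle its arguments differently.

function fusedValues = exampleFuseOnGetSketch(layer1,layer2,layer3,weights,obj,values,varargin)
% Illustrative sketch of a log-odds fusion callback (not the shipped helper).
% obj and values (the fused layer and the data read from it) are unused here.
% With no extra arguments, getMapData returns the whole layer; otherwise the
% location arguments are forwarded so the same region is read from each layer.
v1 = getMapData(layer1,varargin{:});
v2 = getMapData(layer2,varargin{:});
v3 = getMapData(layer3,varargin{:});

% Convert occupancy probabilities to log-odds, apply the per-sensor weights,
% and sum. Unknown cells (probability 0.5) contribute zero log-odds.
logOdds = weights(1)*log(v1./(1-v1)) + ...
          weights(2)*log(v2./(1-v2)) + ...
          weights(3)*log(v3./(1-v3));

% Convert the summed log-odds back to an occupancy probability.
fusedValues = 1./(1+exp(-logOdds));
end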

Create Reconstruction Map

Create an empty world map. This map is periodically updated using data from the fusion layer. Use this map to indicate how well the lidar fusion method is working.

% Create an empty reconstruction layer covering the same area as world map.

reconLayer = occupancyMap(400,400,res,... % width, height, resolution
    'LayerName','FuseLayer','LocalOriginInWorld',[-25 -50]);

Set Up Visualization

Plot the egocentric layers next to the reconstructed map. Use the exampleHelperShowEgoMap function to display each local map.

% Set up the display window.

axList = exampleHelperSetupDisplay(groundTruthVehicle,lidarSensors);

% Display the reconstruction layer and submap region.

show(reconLayer,'Parent', axList{1});

hG = findobj(axList{1},'Type','hggroup');

egoOrientation = hG.Children;

egoCenter = hgtransform('Parent',hG);

egoOrientation.Parent = egoCenter;

gridLoc = egoMap.GridLocationInWorld;

xLimits = egoMap.XLocalLimits;

yLimits = egoMap.YLocalLimits;

rectangle('Parent',egoCenter,...
    'Position',[gridLoc diff(xLimits) diff(yLimits)],...
    'EdgeColor','r');

% Display the local maps built by each sensor alongside the reconstruction map.

exampleHelperShowEgoMap(axList,egoMap,[0 0 0],{'FrontLeft Lidar','FrontRight Lidar','Rear Lidar','Fused'});


Simulate Sensor Readings and Build Map

Move the robot along the trajectory while updating the map with the simulated lidar readings.

Before running the driving scenario, call the exampleHelperResetSimulation helper function. This function resets the simulation and trajectory, clears the map, and moves the egocentric maps back to the first point of the trajectory.

exampleHelperResetSimulation(scene,traj,lidarSensors,egoMap,reconLayer)


Call the exampleHelperRunSimulation function to execute the simulation.

The primary operations of the simulation loop, sketched after this list, are:

Get the next pose in the trajectory from traj and extract the z-axis orientation (theta) from the quaternion.

Move the egoMap to the new [x y theta] pose.

Retrieve sensor data from the lidarPointCloudGenerators.

Update the local maps with sensor data using insertRay.

Update the global map using the mapLayer fused result.

Refresh the visualization.
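The helper functions encapsulate this loop. As a rough outline of one iteration, the sketch below looks up the pose, extracts the yaw angle, and recenters the egocentric map; the sensor read and map updates are only indicated in comments because the helper interfaces are not reproduced here, and the move and getMapData calls are assumptions about how the helpers use the multiLayerMap.

% Illustrative sketch of one simulation step (not the shipped helpers).
simTime = scene.SimulationTime;
[pos,orient] = lookupPose(traj,simTime);   % position and quaternion orientation
eulZYX = euler(orient,'ZYX','frame');
theta = eulZYX(1);                         % yaw about the z-axis

% Recenter the egocentric multilayer map on the new vehicle position.
move(egoMap,pos(1:2));

% For each sensor, convert its point cloud to ranges and angles in the sensor
% frame and insert the rays into the matching occupancy layer, for example:
% insertRay(fLeftLayer,[pos(1:2) theta],ranges,angles,maxRange);

% Read the fused layer and use it to update the world reconstruction map,
% then refresh the plots (handled by the example display helpers).
fusedValues = getMapData(egoMap,'FuseLayer');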

exampleHelperRunSimulation(scene,traj,groundTruthVehicle,egoMap,lidarSensors,reconLayer,axList)


The displayed results indicate that the front-right sensor is introducing large amounts of noise into the fused map. Notice that the right-hand wall shows more variability throughout the trajectory. You do not want to discard readings from this sensor entirely because it still detects free space in front of the vehicle. Instead, reduce the weight of its readings during fusion and recreate the full multilayer map. Then, reset and rerun the simulation. A small numeric illustration of the effect of the reduced weight follows the code.

% Construct a new multiLayerMap with a different set of fusion weights

updatedWeights = [1 0.25 1];

egoMap = exampleHelperConstructMultiLayerEgoMap(res,width,height,updatedWeights);

% Rerun the simulation

exampleHelperResetSimulation(scene,traj,lidarSensors,egoMap,reconLayer)

exampleHelperRunSimulation(scene,traj,groundTruthVehicle,egoMap,lidarSensors,reconLayer,axList)
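To see how the lower weight suppresses the noisy sensor, consider a single cell that the front-right sensor reports as likely occupied while the other two sensors report as likely free. The computation below applies the same weighted log-odds fusion as before; the cell probabilities are made-up values for illustration.

% Illustrative only: one cell reported occupied (0.9) by the noisy front-right
% sensor and free (0.3) by the two reliable sensors (assumed values).
p = [0.3 0.9 0.3];                 % [FrontLeft FrontRight Rear] cell probabilities
w = [1 0.25 1];                    % updated fusion weights

logOdds = sum(w .* log(p./(1-p))); % weighted log-odds sum
pFused  = 1/(1+exp(-logOdds))      % about 0.24, so the cell remains free

With equal weights of [1 1 1], the same readings fuse to about 0.62, which would mark the cell as occupied.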


After simulating again, notice a few things about the map:

Regions covered only by the noisy sensor still detect free space with little noise.

While noise is still present, the readings from the other sensors outweigh those from the noisy sensor. The map shows distinct obstacle boundaries (black squares) in regions of sensor overlap.

Noise beyond the distinct boundaries remains because the noisy lidar is the only sensor reporting readings in those areas, and those readings do not connect to other free space.


Next Steps

This example shows one simple method for fusing readings from multiple sensors. You can further customize this fusion with the following suggestions (a short sketch of the first two appears after this list):

To adjust weights based on sensor confidence prior to layer-to-layer fusion, specify a custom inverse sensor model when using the insertRay object function in the exampleHelperUpdateEgoMap function.

To assign occupancy values based on a more complex confidence distribution, such as a Gaussian inverse model, use the raycast object function to retrieve the cells traced by each emanating ray. Once a set of cells has been retrieved, you can assign values to them explicitly with more complex methods.

To reduce the confidence of aging cells, use additional map layers that keep track of a timestamp for each cell. These timestamps can be used to place greater importance on recently updated cells and to gradually discount older readings.
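As a rough illustration of the first two suggestions, the sketch below inserts a single ray with a custom inverse sensor model and then uses raycast to retrieve the traversed cells for explicit updates. The pose, range, angle, probabilities, and model values are placeholders chosen for illustration, not values from this example.

% Illustrative only: a single ray inserted near the center of the front-left layer.
cx = mean(fLeftLayer.XWorldLimits);
cy = mean(fLeftLayer.YWorldLimits);
sensorPose = [cx cy 0];           % [x y theta] placeholder pose
ranges = 8;  angles = 0;  maxRange = 20;

% Custom inverse sensor model: [probability for a miss, probability for a hit].
% A lower hit probability reflects lower confidence in this sensor (assumed values).
invModel = [0.45 0.65];
insertRay(fLeftLayer,sensorPose,ranges,angles,maxRange,invModel);

% Alternatively, retrieve the traversed cells and assign values explicitly,
% for example from a more complex confidence distribution.
[endPts,midPts] = raycast(fLeftLayer,sensorPose,ranges,angles);
setOccupancy(fLeftLayer,midPts,0.4*ones(size(midPts,1),1),'grid');   % traversed (free-ish) cells
setOccupancy(fLeftLayer,endPts,0.8*ones(size(endPts,1),1),'grid');   % end-point (occupied-ish) cells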

Fuse IMU and Lidar Positions with a Kalman Filter

Below is a simple MATLAB program demonstrating how to fuse IMU and lidar position data with a Kalman filter.

First, define some variables and matrices. Assume the IMU and the lidar are sampled at the same interval, so a single time-step variable dt suffices. Also define the measurement noise covariance matrices R_imu and R_lidar, and the state transition matrices A_imu and A_lidar.

```
% Define variables and matrices
dt = 0.01;                % Sampling time interval
R_imu = [0.1 0; 0 0.1];   % Measurement noise covariance matrix for IMU
R_lidar = [0.5 0; 0 0.5]; % Measurement noise covariance matrix for Lidar
A_imu = [1 dt; 0 1];      % State transition matrix for IMU
A_lidar = [1 0; 0 1];     % State transition matrix for Lidar
```

Next, generate some random IMU and lidar position measurements to use as input.

```
% Generate random IMU and Lidar position measurements
t = 0:dt:10;          % Time vector
N = length(t);        % Number of time steps
p_imu = zeros(2,N);   % IMU position measurements
p_lidar = zeros(2,N); % Lidar position measurements
for i = 1:N
    if i == 1
        p_imu(:,i) = [0; 0];   % Initial IMU position
        p_lidar(:,i) = [0; 0]; % Initial Lidar position
    else
        p_imu(:,i) = A_imu * p_imu(:,i-1) + sqrt(R_imu) * randn(2,1);         % Update IMU position
        p_lidar(:,i) = A_lidar * p_lidar(:,i-1) + sqrt(R_lidar) * randn(2,1); % Update Lidar position
    end
end
```

Now use a Kalman filter to fuse the IMU and lidar position data. Define a state vector x that contains the IMU and lidar position and velocity information, a state transition matrix A, a measurement matrix H that maps measurements into the state space, and a covariance matrix P that represents the uncertainty of the state estimate.

```
% Initialize state vector, state transition matrix, and covariance matrix
x = zeros(4,1);                             % State vector containing IMU and Lidar position/velocity
A = [A_imu zeros(2,2); zeros(2,2) A_lidar]; % Overall state transition matrix
H = [1 0 0 0; 0 0 1 0];                     % Measurement matrix to map measurements to state space
P = eye(4);                                 % Covariance matrix representing uncertainty in state estimate
x_hist = zeros(4,N);                        % Store the state estimate at each step for plotting
```

At each time step, use the IMU and lidar position measurements to update the state vector x and the covariance matrix P with the standard Kalman filter prediction and update steps.

```
% Perform Kalman filtering to fuse IMU and Lidar position measurements
for i = 1:N
    % Prediction step
    x = A * x;               % State prediction
    P = A * P * A' + eye(4); % Covariance prediction
    % Update step
    if mod(i,10) == 0        % Update every 10 time steps (for example)
        z = [p_imu(1,i); p_lidar(1,i)];      % Measurement vector (x positions only)
        R = diag([R_imu(1,1) R_lidar(1,1)]); % Measurement noise for the two scalar measurements
        S = H * P * H' + R;                  % Innovation covariance
        K = P * H' / S;                      % Kalman gain
        x = x + K * (z - H * x);             % State update
        P = (eye(4) - K * H) * P;            % Covariance update
    end
    x_hist(:,i) = x;         % Record the estimate for plotting
end
```

Finally, plot the raw IMU and lidar position measurements along with the fused position estimate.

```
% Plot results
figure;
plot(p_imu(1,:), p_imu(2,:), 'r'); hold on;
plot(p_lidar(1,:), p_lidar(2,:), 'b');
plot(x_hist(1,:), x_hist(3,:), 'g'); hold off;
legend('IMU', 'Lidar', 'Fused');
xlabel('X position');
ylabel('Y position');
```

This simple program demonstrates how to fuse IMU and lidar position data with a Kalman filter. In a real application, you will likely need more complex state and measurement vectors, more complex state transition and measurement matrices, and measurement noise covariance matrices tuned to the needs of the application.