Simulation Code for Detection Fusion of Millimeter-Wave Radar and Vision Sensors

1. Build the Driving Scenario

The driving scenario is a two-lane highway, 500 m long, containing four vehicles. One vehicle drives directly ahead of the ego vehicle and one directly behind it, and a third vehicle passes the ego vehicle on the left. All vehicles follow predefined paths at set speeds; the passing vehicle starts in the right lane, moves into the left lane to overtake, and then returns to the right lane.

scenario=drivingScenario; % create the driving scenario
scenario.SampleTime=0.01;
roadCenters=[0 0;50 0;100 0;250 20;500 40]; % road centers
roadWidth=7.2; % road width
road(scenario,roadCenters,roadWidth); % add the road to the scenario
egoCar=vehicle(scenario,"ClassID",1); % add the ego vehicle
path(egoCar,roadCenters(2:end,:)-[0 1.8],25); % ego vehicle path and speed
leadCar=vehicle(scenario,"ClassID",1); % add the lead vehicle
path(leadCar,[70 0;roadCenters(3:end,:)]-[0 1.8],25); % lead vehicle path and speed
passingCar=vehicle(scenario,"ClassID",1); % add the passing vehicle
waypoints=[0 -1.8;50 1.8;100 1.8;250 21.8;400 32.2;500 38.2]; % waypoints
path(passingCar,waypoints,35); % passing vehicle path and speed
chaseCar=vehicle(scenario,"ClassID",1); % add the chase (following) vehicle
path(chaseCar,[25 0;roadCenters(2:end,:)]-[0 1.8],25); % chase vehicle path and speed
plot(scenario)

The resulting scenario plot is shown below.

2. Define the Radar and Vision Sensors

In this example the ego vehicle carries six millimeter-wave radars and two vision sensors, providing 360° coverage around the vehicle. The sensor coverage areas partially overlap and also leave some gaps. A long-range radar and a vision sensor are mounted at both the front and the rear of the vehicle. Each side of the vehicle carries two short-range radars, each covering a 90° field of view: one covers from mid-vehicle rearward, the other from mid-vehicle forward.

sensors=cell(8,1);
% Forward-facing long-range radar at the center of the front bumper
sensors{1}=radarDetectionGenerator("SensorIndex",1,"Height",0.2,"MaxRange",174,...
    "SensorLocation",[egoCar.Wheelbase+ egoCar.FrontOverhang,0],"FieldOfView",[20,5]);
% Rear-facing long-range radar at the center of the rear bumper
sensors{2}=radarDetectionGenerator("SensorIndex",2,"Height",0.2,"Yaw",180,...
    "SensorLocation",[-egoCar.RearOverhang,0],"MaxRange",174,"FieldOfView",[20,5]);
% Rear-left short-range radar at the left rear wheel well
sensors{3}=radarDetectionGenerator("SensorIndex",3,"Height",0.2,"Yaw",120,...
    "SensorLocation",[0,egoCar.Width/2],"MaxRange",30,"ReferenceRange",50,...
    "FieldOfView",[90,5],"AzimuthResolution",10,"RangeResolution",1.25);
% Rear-right short-range radar at the right rear wheel well
sensors{4}=radarDetectionGenerator("SensorIndex",4,"Height",0.2,"Yaw",-120,...
    "SensorLocation",[0,-egoCar.Width/2],"MaxRange",30,"ReferenceRange",50,...
    "FieldOfView",[90,5],"AzimuthResolution",10,"RangeResolution",1.25);
% Front-left short-range radar at the left front wheel well
sensors{5}=radarDetectionGenerator("SensorIndex",5,"Height",0.2,"Yaw",60,...
    "SensorLocation",[egoCar.Wheelbase,egoCar.Width/2],"MaxRange",30,...
    "ReferenceRange",50,"FieldOfView",[90,5],"AzimuthResolution",10,...
    "RangeResolution",1.25);
% Front-right short-range radar at the right front wheel well
sensors{6}=radarDetectionGenerator("SensorIndex",6,"Height",0.2,"Yaw",-60,...
    "SensorLocation",[egoCar.Wheelbase,-egoCar.Width/2],"MaxRange",30,...
    "ReferenceRange",50,"FieldOfView",[90,5],"AzimuthResolution",10,...
    "RangeResolution",1.25);
% Front-facing vision sensor at the front windshield
sensors{7} = visionDetectionGenerator('SensorIndex', 7, 'FalsePositivesPerImage', 0.1, ...
    'SensorLocation', [0.75*egoCar.Wheelbase 0], 'Height', 1.1);
% Rear-facing vision sensor at the rear windshield
sensors{8} = visionDetectionGenerator('SensorIndex', 8, 'FalsePositivesPerImage', 0.1, ...
    'SensorLocation', [0.2*egoCar.Wheelbase 0], 'Height', 1.1, 'Yaw', 180);

3. Create the Tracker

tracker = multiObjectTracker('FilterInitializationFcn', @initSimDemoFilter, ...
    'AssignmentThreshold', 30, 'ConfirmationThreshold', [4 5]); % multi-object tracker
positionSelector = [1 0 0 0; 0 0 1 0]; % selects position [x;y] from the state vector
velocitySelector = [0 1 0 0; 0 0 0 1]; % selects velocity [vx;vy] from the state vector
BEP = createDemoDisplay(egoCar, sensors); % create the display and return a handle to the bird's-eye plot
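To see what the two selector matrices do, here is an illustrative NumPy sketch (Python, not part of the MATLAB example): applied to the state vector ordered [x; vx; y; vy], positionSelector picks out [x; y] and velocitySelector picks out [vx; vy].

```python
import numpy as np

# State vector ordering used by the 2-D constant-velocity filter: [x; vx; y; vy]
state = np.array([10.0, 1.0, -3.0, 0.5])

# Same selector matrices as in the MATLAB code above
position_selector = np.array([[1, 0, 0, 0],
                              [0, 0, 1, 0]], dtype=float)
velocity_selector = np.array([[0, 1, 0, 0],
                              [0, 0, 0, 1]], dtype=float)

print(position_selector @ state)  # -> [x, y]  = [10, -3]
print(velocity_selector @ state)  # -> [vx, vy] = [1, 0.5]
```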

4. Helper Functions

initSimDemoFilter
This function initializes a constant-velocity Kalman filter from a single detection.

function filter = initSimDemoFilter(detection)
% Use a 2-D constant velocity model to initialize a trackingKF filter.
% The state vector is [x;vx;y;vy]
% The detection measurement vector is [x;y;vx;vy]
% As a result, the measurement model is H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1]
H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1];
filter = trackingKF('MotionModel', '2D Constant Velocity', ...
    'State', H' * detection.Measurement, ...
    'MeasurementModel', H, ...
    'StateCovariance', H' * detection.MeasurementNoise * H, ...
    'MeasurementNoise', detection.MeasurementNoise);
end
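Because H above is a permutation matrix (the measurement merely reorders the state components), H' maps a measurement [x;y;vx;vy] back into state order [x;vx;y;vy], and H' * R * H permutes the measurement noise covariance R into state order. A small NumPy sketch (illustrative only; the diagonal noise values are made up) confirms this:

```python
import numpy as np

# Same measurement model as in initSimDemoFilter
H = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1]], dtype=float)

z = np.array([2.0, 5.0, 0.3, -0.1])  # measurement [x; y; vx; vy]
state = H.T @ z                      # initial state [x; vx; y; vy]
print(state)                         # [2.0, 0.3, 5.0, -0.1]

# H is a permutation matrix, so H' is its inverse
assert np.allclose(H.T @ H, np.eye(4))

# H' R H carries the measurement covariance into state ordering
R = np.diag([1.0, 2.0, 3.0, 4.0])    # hypothetical variances on [x, y, vx, vy]
P0 = H.T @ R @ H
print(np.diag(P0))                   # variances reordered to [x, vx, y, vy]
```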

createDemoDisplay
This function creates a display with three panels:

1. Top left: a top-down view that follows the ego vehicle;

2. Bottom left: a chase-camera view that follows the ego vehicle;

3. Right half: a bird's-eye plot.

function BEP = createDemoDisplay(egoCar, sensors)
    % Make a figure
    hFigure = figure('Position', [0, 0, 1200, 640], 'Name', 'Sensor Fusion with Synthetic Data Example');
    movegui(hFigure, [0 -1]); % Moves the figure to the left and a little down from the top

    % Add a car plot that follows the ego vehicle from behind
    hCarViewPanel = uipanel(hFigure, 'Position', [0 0 0.5 0.5], 'Title', 'Chase Camera View');
    hCarPlot = axes(hCarViewPanel);
    chasePlot(egoCar, 'Parent', hCarPlot);

    % Add a car plot that follows the ego vehicle from a top view
    hTopViewPanel = uipanel(hFigure, 'Position', [0 0.5 0.5 0.5], 'Title', 'Top View');
    hCarPlot = axes(hTopViewPanel);
    chasePlot(egoCar, 'Parent', hCarPlot, 'ViewHeight', 130, 'ViewLocation', [0 0], 'ViewPitch', 90);

    % Add a panel for a bird's-eye plot
    hBEVPanel = uipanel(hFigure, 'Position', [0.5 0 0.5 1], 'Title', 'Bird''s-Eye Plot');

    % Create bird's-eye plot for the ego vehicle and sensor coverage
    hBEVPlot = axes(hBEVPanel);
    frontBackLim = 60;
    BEP = birdsEyePlot('Parent', hBEVPlot, 'Xlimits', [-frontBackLim frontBackLim], 'Ylimits', [-35 35]);

    % Plot the coverage areas for radars
    for i = 1:6
        cap = coverageAreaPlotter(BEP,'FaceColor','red','EdgeColor','red');
        if isa(sensors{i},'drivingRadarDataGenerator')
            plotCoverageArea(cap, sensors{i}.MountingLocation(1:2),...
                sensors{i}.RangeLimits(2), sensors{i}.MountingAngles(1), sensors{i}.FieldOfView(1));
        else
            plotCoverageArea(cap, sensors{i}.SensorLocation,...
                sensors{i}.MaxRange, sensors{i}.Yaw, sensors{i}.FieldOfView(1));
        end
    end

    % Plot the coverage areas for vision sensors
    for i = 7:8
        cap = coverageAreaPlotter(BEP,'FaceColor','blue','EdgeColor','blue');
        if isa(sensors{i},'drivingRadarDataGenerator')
            plotCoverageArea(cap, sensors{i}.MountingLocation(1:2),...
                sensors{i}.RangeLimits(2), sensors{i}.MountingAngles(1), 45);
        else
            plotCoverageArea(cap, sensors{i}.SensorLocation,...
                sensors{i}.MaxRange, sensors{i}.Yaw, 45);
        end
    end

    % Create a vision detection plotter and store it for later update
    detectionPlotter(BEP, 'DisplayName','vision', 'MarkerEdgeColor','blue', 'Marker','^');

    % Combine all radar detections into one entry and store it for later update
    detectionPlotter(BEP, 'DisplayName','radar', 'MarkerEdgeColor','red');

    % Add road borders to plot
    laneMarkingPlotter(BEP, 'DisplayName','lane markings');

    % Add the tracks to the bird's-eye plot. Show last 10 track updates.
    trackPlotter(BEP, 'DisplayName','track', 'HistoryDepth',10);

    axis(BEP.Parent, 'equal');
    xlim(BEP.Parent, [-frontBackLim frontBackLim]);
    ylim(BEP.Parent, [-40 40]);

    % Add an outline plotter for ground truth
    outlinePlotter(BEP, 'Tag', 'Ground truth');
end

updateBEP
This function updates the bird's-eye plot with lane markings, detections, and tracks.

function updateBEP(BEP, egoCar, detections, confirmedTracks, psel, vsel)
    % Update road boundaries and their display
    [lmv, lmf] = laneMarkingVertices(egoCar);
    plotLaneMarking(findPlotter(BEP,'DisplayName','lane markings'),lmv,lmf);

    % update ground truth data
    [position, yaw, length, width, originOffset, color] = targetOutlines(egoCar);
    plotOutline(findPlotter(BEP,'Tag','Ground truth'), position, yaw, length, width, 'OriginOffset', originOffset, 'Color', color);

    % update barrier data
    [bPosition,bYaw,bLength,bWidth,bOriginOffset,bColor,numBarrierSegments] = targetOutlines(egoCar, 'Barriers');
    plotBarrierOutline(findPlotter(BEP,'Tag','Ground truth'),numBarrierSegments,bPosition,bYaw,bLength,bWidth,...
                       'OriginOffset',bOriginOffset,'Color',bColor);

    % Prepare and update detections display
    N = numel(detections);
    detPos = zeros(N,2);
    isRadar = true(N,1);
    for i = 1:N
        detPos(i,:) = detections{i}.Measurement(1:2)';
        if detections{i}.SensorIndex > 6 % Vision detections
            isRadar(i) = false;
        end
    end
    plotDetection(findPlotter(BEP,'DisplayName','vision'), detPos(~isRadar,:));
    plotDetection(findPlotter(BEP,'DisplayName','radar'), detPos(isRadar,:));

    % Remove all object tracks that are unidentified by the vision detection
    % generators before updating the tracks display. These have the ObjectClassID
    % parameter value as 0 and include objects such as barriers.
    isNotBarrier = arrayfun(@(t)t.ObjectClassID,confirmedTracks)>0;
    confirmedTracks = confirmedTracks(isNotBarrier);

    % Prepare and update tracks display
    trackIDs = {confirmedTracks.TrackID};
    labels = cellfun(@num2str, trackIDs, 'UniformOutput', false);
    [tracksPos, tracksCov] = getTrackPositions(confirmedTracks, psel);
    tracksVel = getTrackVelocities(confirmedTracks, vsel);
    plotTrack(findPlotter(BEP,'DisplayName','track'), tracksPos, tracksVel, tracksCov, labels);
end

5. Simulate the Driving Scenario

The following loop advances the vehicles, runs the sensor simulation, and performs tracking.

Note that scenario generation and sensor simulation can run at different rates. Specifying separate time steps for the two decouples scenario simulation from sensor simulation, which is useful for modeling vehicle motion with high fidelity independently of the sensors' measurement rates.

Different per-sensor update rates are another use case. Suppose one sensor updates every 20 ms while another updates every 50 ms; choosing a scenario update interval of 10 ms lets each sensor produce data at the appropriate time.

In this example, the scenario is generated every 0.01 s while the sensors detect every 0.1 s. Each sensor returns an isValidTime flag that is true whenever the sensor generated detections; this flag is used to call the tracker only when there is detection data.

Also note that a sensor can report multiple detection points per target, especially when the target is very close. Because multiObjectTracker tracks one detection per object, radar reflections must be clustered before the tracker processes them; this is achieved by leaving TargetReportFormat at its default, Clustered detections. The sensor model can also output raw detection data, or track data from an internal tracker.
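The clustering idea can be illustrated with a small Python sketch (hypothetical, not part of the MATLAB example; the function name and merge distance are made up for illustration): detections closer together than roughly one vehicle length are merged into a single averaged detection.

```python
import numpy as np

def cluster_detections(points, merge_dist=4.7):
    """Greedy clustering: merge 2-D detection points closer than merge_dist
    (here, roughly one vehicle length in meters) and replace each cluster
    by its centroid."""
    points = np.asarray(points, dtype=float)
    unassigned = list(range(len(points)))
    centroids = []
    while unassigned:
        seed = unassigned.pop(0)
        cluster = [seed]
        # Collect every remaining point within merge_dist of the seed
        for idx in unassigned[:]:
            if np.linalg.norm(points[idx] - points[seed]) < merge_dist:
                cluster.append(idx)
                unassigned.remove(idx)
        centroids.append(points[cluster].mean(axis=0))
    return np.array(centroids)

# Two reflections from one nearby car plus one distant car -> 2 clusters
dets = [[10.0, 0.0], [11.0, 0.5], [40.0, 3.0]]
print(cluster_detections(dets))
```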

toSnap = true;
while advance(scenario) && ishghandle(BEP.Parent)
    % Get the scenario time
    time = scenario.SimulationTime;

    % Get the poses of the other vehicles in ego vehicle coordinates
    ta = targetPoses(egoCar);

    % Simulate the sensors
    detectionClusters = {};
    isValidTime = false(1,8);
    for i = 1:8
        [sensorDets,numValidDets,isValidTime(i)] = sensors{i}(ta, time);
        if numValidDets
            for j = 1:numValidDets
                % Vision detections do not report SNR. The tracker requires
                % that they have the same object attributes as the radar
                % detections. This adds the SNR object attribute to vision
                % detections and sets it to a NaN.
                if ~isfield(sensorDets{j}.ObjectAttributes{1}, 'SNR')
                    sensorDets{j}.ObjectAttributes{1}.SNR = NaN;
                end

                % Remove the Z-component of measured position and velocity
                % from the Measurement and MeasurementNoise fields
                sensorDets{j}.Measurement = sensorDets{j}.Measurement([1 2 4 5]);
                sensorDets{j}.MeasurementNoise = sensorDets{j}.MeasurementNoise([1 2 4 5],[1 2 4 5]);
            end
            detectionClusters = [detectionClusters; sensorDets]; %#ok<AGROW>
        end
    end

    % Update the tracker if there are new detections
    if any(isValidTime)
        if isa(sensors{1},'drivingRadarDataGenerator')
            vehicleLength = sensors{1}.Profiles.Length;
        else
            vehicleLength = sensors{1}.ActorProfiles.Length;
        end
        confirmedTracks = updateTracks(tracker, detectionClusters, time);

        % Update bird's-eye plot
        updateBEP(BEP, egoCar, detectionClusters, confirmedTracks, positionSelector, velocitySelector);
    end

    % Snap a figure for the document when the car passes the ego vehicle
    if ta(1).Position(1) > 0 && toSnap
        toSnap = false;
        snapnow
    end
end

Click Run to start the driving scenario simulation and view the output.
