JMU 4th EXPERIMENT HOMEWORK
Exercise 4 Line-based Extended Kalman Filter for robot localization
1 Introduction
As pointed out in the previous exercise, knowledge of the location of a platform is essential for many robotics applications; we motivated this with an autonomous vehicle hauling goods from one place to another inside a warehouse. Within this scenario, Exercise 3 demonstrated how to acquire a more abstract representation of linear structures perceived with a scanning range finder. This exercise will show that, given a map of the linear features in this representation, the robot can localize itself based on the linear structures it perceives. This exercise closely follows the example given in [1, pp. 331-342]. We strongly advise you to read the respective pages prior to attending the exercise session and to consult this document for advice throughout the exercise.
2 Kalman Filter Localization
The Extended Kalman Filter is divided into two steps:
- Prediction step
- Update step
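For reference, these two steps implement the standard EKF equations from [1, pp. 335-337], which the functions below realize piece by piece. With pose estimate \hat{x}, covariance P, input noise covariance Q, and Jacobians F_x, F_u, H:

Prediction:
\[ \hat{x}_t^- = f(\hat{x}_{t-1}, u_t), \qquad P_t^- = F_x P_{t-1} F_x^\top + F_u Q_t F_u^\top \]

Update, with innovation v_t = z_t - h(\hat{x}_t^-, m):
\[ S_t = H P_t^- H^\top + R_t, \qquad K_t = P_t^- H^\top S_t^{-1} \]
\[ \hat{x}_t = \hat{x}_t^- + K_t v_t, \qquad P_t = (I - K_t H) P_t^- \]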
2.1 State Prediction
TASK:
- Modify the transitionFunction function so that it propagates the state with the odometry input (see the motion model below).
Validation:
- Run the function validateTransitionFunction(). If the implementation is correct, it outputs the ground-truth path.
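The motion model implemented below is the differential-drive odometry model from [1, p. 337]: with wheel displacements \Delta s_l, \Delta s_r and wheelbase b,

\[ f(x, u) = \begin{bmatrix} x \\ y \\ \theta \end{bmatrix} + \begin{bmatrix} \frac{\Delta s_l + \Delta s_r}{2} \cos\!\left(\theta + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\ \frac{\Delta s_l + \Delta s_r}{2} \sin\!\left(\theta + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\ \frac{\Delta s_r - \Delta s_l}{b} \end{bmatrix} \]

and the Jacobians are F_x = \partial f / \partial x and F_u = \partial f / \partial u.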
function [f, F_x, F_u] = transitionFunction(x, u, b)
% Propagates the pose x = [x; y; theta] with the odometry input
% u = [Delta s_l; Delta s_r] of a differential-drive robot with
% wheelbase b, and returns the Jacobians F_x = df/dx and F_u = df/du.
tt1 = x(3); % theta_{t-1}
sl = u(1);  % Delta s_l
sr = u(2);  % Delta s_r
f = x + [((sl+sr)/2)*cos(tt1+((sr-sl)/(2*b)));
         ((sl+sr)/2)*sin(tt1+((sr-sl)/(2*b)));
         (sr-sl)/b];
F_x = [1, 0, -sin(tt1 - (sl - sr)/(2*b))*(sl/2 + sr/2);
       0, 1,  cos(tt1 - (sl - sr)/(2*b))*(sl/2 + sr/2);
       0, 0,  1];
F_u = [cos(tt1 - (sl - sr)/(2*b))/2 + (sin(tt1 - (sl - sr)/(2*b))*(sl/2 + sr/2))/(2*b), cos(tt1 - (sl - sr)/(2*b))/2 - (sin(tt1 - (sl - sr)/(2*b))*(sl/2 + sr/2))/(2*b);
       sin(tt1 - (sl - sr)/(2*b))/2 - (cos(tt1 - (sl - sr)/(2*b))*(sl/2 + sr/2))/(2*b), sin(tt1 - (sl - sr)/(2*b))/2 + (cos(tt1 - (sl - sr)/(2*b))*(sl/2 + sr/2))/(2*b);
       -1/b, 1/b];
end
% Symbolic derivation used to obtain the Jacobians above (kept for reference):
% syms sl sr tt1 xt yt a;
% Symf=[xt;yt;tt1] + [((sl+sr)/2)*cos(tt1+((sr-sl)/(2*a)));((sl+sr)/2)*sin(tt1+((sr-sl)/(2*a)));(sr-sl)/a];
% SymF_x=jacobian(Symf,[xt;yt;tt1]);
% SymF_u=jacobian(Symf,[sl;sr]);
% Evf=eval(subs(Symf,{xt,yt,tt1,sl,sr,a},{x(1),x(2),x(3),u(1),u(2),b}));
% EvF_x=eval(subs(SymF_x,{xt,yt,tt1,sl,sr,a},{x(1),x(2),x(3),u(1),u(2),b}));
% EvF_u=eval(subs(SymF_u,{xt,yt,tt1,sl,sr,a},{x(1),x(2),x(3),u(1),u(2),b}));
% f=Evf;
% F_x=EvF_x;
% F_u=EvF_u;
Result: validateTransitionFunction() plots the ground-truth path (figure not shown here).
2.2 State Update
2.2.1 Measurement Function
TASK:
- Modify the measurementFunction function so that, given a state x and a map entry m, it returns the predicted measurement z and the Jacobian of the measurement with respect to the state (see the model below).
Validation:
- Run the function validateMeasurementFunction(); it should report that the implementation is correct.
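The measurement model implemented below follows [1, p. 339]: a map line m = (\alpha_m, r_m) in world coordinates, observed from the pose x = (x, y, \theta), is predicted in the robot frame as

\[ h(x, m) = \begin{bmatrix} \alpha_m - \theta \\ r_m - (x \cos\alpha_m + y \sin\alpha_m) \end{bmatrix}, \qquad H_x = \begin{bmatrix} 0 & 0 & -1 \\ -\cos\alpha_m & -\sin\alpha_m & 0 \end{bmatrix} \]

followed by a normalization of the line parameters so that r \geq 0.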
function [h, H_x] = measurementFunction(x, m)
% Predicts the measurement of the map line m = [alpha; r] as seen
% from the pose x = [x; y; theta], and returns the Jacobian H_x = dh/dx.
h = [m(1) - x(3);
     m(2) - (x(1)*cos(m(1)) + x(2)*sin(m(1)))];
H_x = [0, 0, -1;
       -cos(m(1)), -sin(m(1)), 0];
% normalize to r >= 0; if r is negated, the corresponding row of the
% Jacobian flips sign as well
[h(1), h(2), isRNegated] = normalizeLineParameters(h(1), h(2));
if isRNegated
    H_x(2, :) = -H_x(2, :);
end
end
Output:
>> validateMeasurementFunction
measurement function appears to be correct!
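As a quick sanity check (a minimal sketch; the values are made up for illustration), a robot at the origin with zero heading should see every map line unchanged:

% hypothetical check: at the pose [0; 0; 0] the prediction equals the map line
x = [0; 0; 0];
m = [pi/4; 2.0];   % map line [alpha; r], values invented for this sketch
[h, H_x] = measurementFunction(x, m);   % expect h == m here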
2.2.2 Measurement Association
TASK:
- Modify the associateMeasurements.m function so that the perceived lines are correctly associated with the map entries (using the validation gate described below).
Validation:
- Run the function validateAssociations(); it should report that the implementation is correct.
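The association implemented below uses the Mahalanobis distance, as in the line feature matching of [1, p. 341]: for each measurement z_i and each map line m_j, the innovation v_{ij} = z_i - h(x, m_j) is weighted by its covariance, and a pairing is accepted only if it passes the validation gate

\[ d_{ij} = v_{ij}^\top \left( H_j P H_j^\top + R_i \right)^{-1} v_{ij} < g^2. \]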
function [v, H, R] = associateMeasurements(x, P, Z, R, M, g)
% Associates the observed lines Z with the map lines M via the
% Mahalanobis distance between each measurement and each predicted map
% line; returns the accepted innovations v with their measurement
% Jacobians H and measurement covariances R.
nMeasurements = size(Z, 2);
nMapEntries = size(M, 2);
d = zeros(nMeasurements, nMapEntries);
v = zeros(2, nMeasurements * nMapEntries);
H = zeros(2, 3, nMeasurements * nMapEntries);
for i = 1:nMeasurements
    for j = 1:nMapEntries
        idx = j + (i-1) * nMapEntries;
        [z_priori, H(:, :, idx)] = measurementFunction(x, M(:, j));
        v(:, idx) = Z(:, i) - z_priori;
        W = H(:, :, idx) * P * H(:, :, idx)' + R(:, :, i);
        d(i, j) = v(:, idx)' / W * v(:, idx);
    end
end
% line feature matching (p. 341): associate each measurement with the
% map entry of minimal Mahalanobis distance, subject to the gate g^2
[minima, map_index] = min(d');
measurement_index = find(minima < g^2);
map_index = map_index(measurement_index);
v = v(:, map_index + (measurement_index - 1) * nMapEntries);
H = H(:, :, map_index + (measurement_index - 1) * nMapEntries);
R = R(:, :, measurement_index);
end
Output:
>> validateAssociations
measurement function appears to be correct!
association function appears to be correct!
2.3 Updating the Estimate
TASK:
- Complete the filterStep.m function.
Validation:
- Run the function validateFilter(). If correct, it plots the base_line, the filter output, and the forward integration of the control inputs.
function [x_posteriori, P_posteriori] = filterStep(x, P, u, Z, R, M, k, g, b)
% [x_posteriori, P_posteriori] = filterStep(x, P, u, Z, R, M, k, g, b)
% returns an a posteriori estimate of the state and its covariance.
% If no line was detected, the a priori estimate is returned unchanged.
% propagate the state (p. 337)
Q = k * diag(abs(u));  % input noise grows with the traveled distance
[x_priori, F_x, F_u] = transitionFunction(x, u, b);
P_priori = F_x * P * F_x' + F_u * Q * F_u';
if size(Z, 2) == 0
    x_posteriori = x_priori;
    P_posteriori = P_priori;
    return;
end
[v, H, R] = associateMeasurements(x_priori, P_priori, Z, R, M, g);
% stack all accepted associations into a single measurement update
y = reshape(v, [], 1);
H = reshape(permute(H, [1, 3, 2]), [], 3);
R = blockDiagonal(R);
% update the state estimate (pp. 335)
S = H * P_priori * H' + R;   % innovation covariance
K = P_priori * (H' / S);     % Kalman gain
P_posteriori = (eye(size(P_priori)) - K * H) * P_priori;
x_posteriori = x_priori + K * y;
end
Output:
>> validateFilter
State transition function appears to be correct!
measurement function appears to be correct!
association function appears to be correct!
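For illustration only, a single filter step could be called as in the following sketch; all numbers are invented for this example and are not taken from the exercise framework:

% hypothetical example values, not part of the provided framework
x = [0; 0; 0];              % initial pose [x; y; theta]
P = 0.01 * eye(3);          % initial pose covariance
u = [0.10; 0.12];           % wheel displacements [Delta s_l; Delta s_r]
Z = [0.5; 2.0];             % one observed line [alpha; r]
R(:, :, 1) = 0.01 * eye(2); % its covariance (2x2xN for N observed lines)
M = [0.5; 2.1];             % map with a single line [alpha; r]
k = 0.05; g = 3; b = 0.35;  % odometry noise factor, gate, wheelbase
[x_post, P_post] = filterStep(x, P, u, Z, R, M, k, g, b);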
3 V-REP Experiment
Task:
- Complete the incrementalLocalization() function: given the previous pose estimate, the control inputs, and the laser scan data, it returns the next pose estimate and its covariance.
Validation:
- Open V-REP, load the scene scene/mooc_exercises.ttt, and start the simulation; you should see the robot, the walls, and an overlay of the laser scan.
function [x_posteriori, P_posteriori] = incrementalLocalization(x, P, u, S, M, params, k, g, b)
% [x_posteriori, P_posteriori] = incrementalLocalization(x, P, u, S, M,
% params, k, g, b) returns the a posteriori estimate of the state and its
% covariance, given the previous state estimate, control inputs, laser
% measurements and the map
C_TR = diag([repmat(0.1^2, 1, size(S, 2)), repmat(0.1^2, 1, size(S, 2))]);
[z, R, ~] = extractLinesPolar(S(1, :), S(2, :), C_TR, params);
% visualize the extracted lines against the predicted map lines
figure(2), cla, hold on;
z_prior = zeros(size(M));
for j = 1:size(M, 2)  % do not reuse k as loop variable: it is the noise factor
    z_prior(:, j) = measurementFunction(x, M(:, j));
end
plot(z(1, :), z(2, :), 'bo');
plot(z_prior(1, :), z_prior(2, :), 'rx');
xlabel('angle [rad]'); ylabel('distance [m]');
legend('measurement', 'prior');
drawnow;
% estimate robot pose
[x_posteriori, P_posteriori] = filterStep(x, P, u, z, R, M, k, g, b);
end
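For orientation, a control loop would invoke this function once per timestep, roughly as in the sketch below; getOdometry and getScan are hypothetical stand-ins for the V-REP interface of the exercise framework:

% hypothetical main loop; the data sources are placeholders
x = x0; P = P0;            % initial pose estimate and covariance
while simulationRunning
    u = getOdometry();     % [Delta s_l; Delta s_r] since the last step
    S = getScan();         % 2xN polar scan [angles; ranges]
    [x, P] = incrementalLocalization(x, P, u, S, M, params, k, g, b);
end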