RealSense T265 camera and IMU calibration, and running VINS

This article gives a detailed calibration workflow for the RealSense T265 IMU and cameras: calibrating the IMU with the imu_utils package, calibrating the cameras, jointly calibrating the camera and IMU with kalibr, and finally running VINS-Fusion. By printing (or displaying on screen) a calibration target, recording data, and running the kalibr tools, accurate intrinsic and extrinsic parameters are obtained.

Reposted from https://www.jianshu.com/p/194d6c9ef9a4

Note: it is unclear whether the original article synchronizes the IMU and camera timestamps!

1. Calibrate the IMU with the imu_utils package

There is a pitfall here: imu_utils depends on code_utils, but do not put both into src and build them at the same time. Build code_utils first, then imu_utils (a build-order sketch is given below). In addition, building code_utils may fail with:

code_utils-master/src/sumpixel_test.cpp:2:24: fatal error: backward.hpp: No such file or directory

In that case, find sumpixel_test.cpp under code_utils and change #include "backward.hpp" to #include "code_utils/backward.hpp", then build again.
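
A possible build sequence, assuming a standard ~/catkin_ws workspace and the commonly used gaowenliang repositories (adjust the paths and sources to your own setup):

cd ~/catkin_ws/src
git clone https://github.com/gaowenliang/code_utils.git
cd ~/catkin_ws && catkin_make        # build code_utils first
cd ~/catkin_ws/src
git clone https://github.com/gaowenliang/imu_utils.git
cd ~/catkin_ws && catkin_make        # then build imu_utils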

1.1 Start the T265 and publish the IMU topic

Modify rs_t265.launch as follows:

<arg name="unite_imu_method" default="linear_interpolation"/>

Only after this change will the IMU topic be published. Then start the camera:

roslaunch realsense2_camera rs_t265.launch
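
If the unite_imu_method change took effect, the unified IMU topic should now be visible; a quick check (topic name as used later in this guide):

rostopic hz /camera/imu
rostopic echo -n 1 /camera/imu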

1.2 Create t265_imu.launch

Create t265_imu.launch in ~/catkin_ws/src/imu_utils/launch as follows (remember to delete the text after the # signs; it is only explanatory and is not valid inside a launch file):

<launch>
    <node pkg="imu_utils" type="imu_an" name="imu_an" output="screen">
        <param name="imu_topic" type="string" value= "/camera/imu"/> #IMU在ROS中发布的话题
        <param name="imu_name" type="string" value= "BMI055"/> #IMU型号
        <param name="data_save_path" type="string" value= "$(find imu_utils)/data/"/>
        <param name="max_time_min" type="int" value= "60"/> #运行的时长(min),60-120min
        <param name="max_cluster" type="int" value= "200"/> 
    </node>
</launch>

1.3 Run the IMU calibration

roslaunch imu_utils t265_imu.launch

After 60 minutes the result is written to imu_utils/data as BMI055_imu_param.yaml. This file gives the noise_density (suffix _n) and random_walk (suffix _w) of the three accelerometer and gyroscope axes, together with their averages; those average values are needed later for the joint camera-IMU calibration.
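
If you would rather not keep the sensor and the calibration node running live for an hour or more, a common alternative (the bag name below is illustrative) is to record the stationary IMU stream first and replay it at high speed while the imu_utils launch file is running:

rosbag record -O imu_calibration /camera/imu      # keep the camera completely still while recording
roslaunch imu_utils t265_imu.launch               # terminal 1
rosbag play -r 200 imu_calibration.bag            # terminal 2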

2. Camera calibration

Download the official april_6x6_80x80cm_A0.pdf (or another calibration target). Print it or display it on a screen, measure the actual dimensions, and then create or modify apriltags.yaml accordingly. Mine is as follows:

target_type: 'aprilgrid' #gridtype
tagCols: 6               #number of apriltags
tagRows: 6               #number of apriltags
tagSize: 0.03           #size of apriltag, edge to edge [m]
tagSpacing: 0.333          #ratio of space between tags to tagSize
                         #example: tagSize=2m, spacing=0.5m --> tagSpacing=0.25[-]
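
As a worked example for the values above: if the printed tags measure 0.03 m edge to edge and the gap between neighboring tags is about 0.01 m (an assumption implied by the ratio used here), then tagSpacing = 0.01 / 0.03 ≈ 0.333. Always measure your own printout rather than trusting the nominal PDF size.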

Start the T265:

roslaunch realsense2_camera rs_t265.launch

Throttle the image topics to a lower rate:

rosrun topic_tools throttle messages /camera/fisheye1/image_raw 4.0 /fisheye1
rosrun topic_tools throttle messages /camera/fisheye2/image_raw 4.0 /fisheye2
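
Before recording, you can confirm that the throttled topics are actually coming out at roughly 4 Hz:

rostopic hz /fisheye1
rostopic hz /fisheye2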

Then record a bag. Move the camera slowly, and make sure the AprilTag grid is fully visible in the camera views:

rosbag record -O cameras_calibration /fisheye1 /fisheye2
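
It is worth checking that both image topics really made it into the bag before running kalibr:

rosbag info cameras_calibration.bag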

Call kalibr to compute the intrinsics and extrinsics of each camera. Running with the default settings may fail with:

Cameras are not connected through mutual observations, please check the dataset. Maybe adjust the approx. sync. tolerance.

This is caused by the two camera streams not being synchronized with each other.
The fix is to append --approx-sync 0.04 to the command:
kalibr_calibrate_cameras --target ../apriltags.yaml --bag ./bag/cameras_calibration.bag --bag-from-to 5 80 --models omni-radtan omni-radtan --topics /fisheye1 /fisheye2 --approx-sync 0.04

omni-radtan is the camera model; see [3] for the available models.
The calibration finally outputs a PDF report and a txt file containing the intrinsic and extrinsic parameters, plus a camchain-*.yaml that is used as input to the joint calibration in the next section.
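
For orientation, the camchain file looks roughly like the sketch below. The field names follow kalibr's camchain convention as far as I know, but treat the exact layout as an assumption and read the values from your own generated file; the symbolic entries are placeholders showing which number maps to which field in the VINS configs of section 4.

cam0:
  camera_model: omni
  intrinsics: [xi, gamma1, gamma2, u0, v0]   # placeholders, replaced by calibrated numbers
  distortion_model: radtan
  distortion_coeffs: [k1, k2, p1, p2]
  resolution: [848, 800]
  rostopic: /fisheye1
cam1:
  camera_model: omni
  intrinsics: [xi, gamma1, gamma2, u0, v0]
  distortion_model: radtan
  distortion_coeffs: [k1, k2, p1, p2]
  resolution: [848, 800]
  rostopic: /fisheye2
  T_cn_cnm1:        # pose of cam1 relative to cam0 (4x4 homogeneous matrix), identity shown as placeholder
  - [1.0, 0.0, 0.0, 0.0]
  - [0.0, 1.0, 0.0, 0.0]
  - [0.0, 0.0, 1.0, 0.0]
  - [0.0, 0.0, 0.0, 1.0]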

3. Camera-IMU joint calibration

Create imu.yaml, copy over the values from the yaml generated by the IMU calibration, and convert it into the format kalibr expects (make sure rostopic matches the IMU topic actually recorded in your calibration bag). Mine is as follows:

#Accelerometers
accelerometer_noise_density: 1.85e-03   #Noise density (continuous-time)
accelerometer_random_walk:   2.548e-05   #Bias random walk

#Gyroscopes
gyroscope_noise_density:     1.094e-02   #Noise density (continuous-time)
gyroscope_random_walk:       5.897e-04   #Bias random walk

rostopic:                    /imu      #the IMU ROS topic
update_rate:                 200.0      #Hz (for discretization of the values above)

Modify rs_t265.launch again: one change keeps the IMU and image data synchronized, the other ensures IMU data is still published.

<arg name="enable_sync"         default="true"/>,  
<arg name="unite_imu_method"    default="copy"/>

kalibr recommends roughly 200 Hz for the IMU and 20 Hz for the images. Throttle the image topics with topic_tools throttle as above and record a bag; one possible recording setup is sketched below. Then call kalibr to compute the IMU-camera extrinsics with the command that follows.
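
A recording setup consistent with the topic names used earlier (the IMU is recorded at its native rate; if you record a different IMU topic, the rostopic entry in imu.yaml must be changed to match):

rosrun topic_tools throttle messages /camera/fisheye1/image_raw 20.0 /fisheye1
rosrun topic_tools throttle messages /camera/fisheye2/image_raw 20.0 /fisheye2
rosbag record -O imu_cameras_calibration /fisheye1 /fisheye2 /camera/imu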

kalibr_calibrate_imu_camera --target ../apriltags.yaml --cam ../camchain-stereo_calibration.yaml --imu ../imu.yaml --bag ../imu_cameras_calibration.bag --bag-from-to 10 100 --max-iter 30 --show-extraction

The final outputs are results-imucam-stereo_calibration.txt and camchain-imucam-stereo_calibration.yaml.

4. Run VINS-Fusion

Create fisheye1.yaml, fisheye2.yaml, and stereo_imu.yaml in the config directory, and fill in the joint calibration results above following the format of the other camera configs shipped with VINS-Fusion (fisheye2.yaml has the same layout as fisheye1.yaml, filled with cam1's values). Mine are as follows:

fisheye1.yaml:
%YAML:1.0
---
model_type: MEI
camera_name: camera
image_width: 848
image_height: 800
mirror_parameters:
   xi: 1.6943561
distortion_parameters:
   k1: -0.1075293
   k2: 0.6081762
   p1: 0.0029581
   p2: 0.0020715
projection_parameters:
   gamma1: 774.927
   gamma2: 773.762
   u0: 420.086
   v0: 402.516

stereo_imu.yaml:
%YAML:1.0

#common parameters
#support: 1 IMU + 1 camera; 1 IMU + 2 cameras; 2 cameras
imu: 1         
num_of_cam: 2  

#imu_topic: "/imu0"
imu_topic: "/camera/imu"
#image0_topic: "/cam0/image_raw"
#image1_topic: "/cam1/image_raw"
image0_topic: "/camera/fisheye1/image_raw"
image1_topic: "/camera/fisheye2/image_raw"
output_path: "/home/zhang/Downloads/output/"

cam0_calib: "fisheye1.yaml"
cam1_calib: "fisheye2.yaml"
image_width: 848
image_height: 800

# Extrinsic parameter between IMU and Camera.
estimate_extrinsic: 1   # 0  The extrinsic parameters are accurate; trust the following imu^R_cam, imu^T_cam and do not change them.
                        # 1  Only an initial guess of the extrinsic parameters is available; the estimator will optimize around it.

body_T_cam0: !!opencv-matrix #cam0 coordinate under body coordinate
   rows: 4
   cols: 4
   dt: d
   data: [-0.9996934517722752, -0.017627605360341365, 0.017385914209325233, 0.005754793009546214,
                0.01763901821691402, -0.999844293739213, 0.0005033025707699151, 0.004283178521816982,
                0.017374335094538958, 0.0008098187417148282, 0.999848726895038, -0.005638553131443425,
                0., 0., 0., 1.]

body_T_cam1: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [-0.999768258980969, -0.01371448358867344, 0.016593410561375914, -0.06100400373236911,
                0.013686981433494675, -0.9999047625489491, -0.001769850606391636, 0.004859511023863885,
                0.016616102834345566, -0.0015423267571366712, 0.9998607534825902, -0.0022157241622004077,
                0., 0., 0., 1. ]

#Multiple thread support
multiple_thread: 1

#feature tracker parameters
max_cnt: 150            # max feature number in feature tracking
min_dist: 30            # min distance between two features 
freq: 10                # frequency (Hz) at which the tracking result is published. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
F_threshold: 1.0        # ransac threshold (pixel)
show_track: 0           # publish tracking image as topic
flow_back: 1            # perform forward and backward optical flow to improve feature tracking accuracy

#optimization parameters
max_solver_time: 0.04   # max solver iteration time (s), to guarantee real time
max_num_iterations: 8   # max solver iterations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#imu parameters: the more accurate these parameters are, the better the performance
acc_n: 1.09387e-02          # accelerometer measurement noise standard deviation. #0.2   0.04
gyr_n: 1.8491e-03         # gyroscope measurement noise standard deviation.     #0.05  0.004
acc_w: 5.8973e-04         # accelerometer bias random walk noise standard deviation.  #0.002
gyr_w: 2.5482e-05       # gyroscope bias random walk noise standard deviation.     #4.0e-5
g_norm: 9.805         # gravity magnitude

#unsynchronization parameters
estimate_td: 1                      # online estimate time offset between camera and imu
td: 0.004                            # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)

#loop closure parameters
load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
pose_graph_save_path: "/home/zhang/Downloads/output/pose_graph/" # save and load path
save_image: 0                   # save images in the pose graph for visualization purposes; set to 0 to disable

Then follow the official run instructions [4]; a sketch of a typical command sequence is given below. The resulting trajectory is shown in the figure after it (red is the loop-closure trajectory).
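
A minimal sketch of the usual VINS-Fusion launch sequence, assuming the config files above are stored under a hypothetical ~/catkin_ws/src/VINS-Fusion/config/t265/ directory (adjust the path to wherever you put stereo_imu.yaml):

roslaunch vins vins_rviz.launch
rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/t265/stereo_imu.yaml
rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/t265/stereo_imu.yaml   # optional loop closure
roslaunch realsense2_camera rs_t265.launch
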
[Figure: VINS-Fusion trajectory output in RViz; the red curve is the loop-closure trajectory]

Thanks to the following articles and their authors:
[1] RealSense D435i - IMU + camera calibration
[2] Calibrating the IMU with imu_utils, then using the result in kalibr for joint camera-IMU calibration
[3] kalibr camera models
[4] VINS-Fusion
[5] Cheng Wei’s Blog

Author: 名字都被取完了妈个鸡
Link: https://www.jianshu.com/p/194d6c9ef9a4
Source: Jianshu (简书)
Copyright belongs to the author. For commercial reuse, obtain the author's authorization; for non-commercial reuse, credit the source.
