Intel RealSense D435i Calibration

0. Introduction

  The RealSense D435i consists of four modules: two infrared cameras, an infrared emitter, an RGB camera, and an IMU. These four sensors obviously sit at different physical locations, so when processing image and IMU data we need to express everything in a single common coordinate frame. For example, when running VINS with the D435i, the images and the IMU measurements must live in the same frame, so the pose of the IMU relative to the RGB camera (rotation and translation) has to be calibrated.
  In addition, the camera's intrinsic parameters such as focal length and distortion coefficients, as well as the IMU's biases and scale factors, need to be known in advance. The former (the relative pose between sensors) are called the extrinsics and the latter the intrinsics. Both must be calibrated before running the program, regardless of whether the program can self-calibrate; a good initial calibration helps self-calibration anyway.

1. Calibration Tool Installation

  1. Install the RealSense SDK 2.0 (librealsense), including the D435i driver, until realsense-viewer runs and shows the color image and the depth map.
  2. Install realsense-ros (this requires ros-kinetic). This package reads the D435i data streams directly and publishes the individual topics; the calibration steps below simply subscribe to those topics.
  3. Install imu_utils (code_utils must be installed first). This is used to calibrate the IMU noise density and random-walk coefficients.
  4. Install Kalibr. This toolbox can calibrate the intrinsics and extrinsics of several cameras at once (different camera models are available) and can also calibrate the camera-to-IMU extrinsics.

1.1.imu_utils Install

Just follow the installation instructions in the official repositories.
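A minimal sketch of the usual catkin workflow, assuming the commonly used gaowenliang/code_utils and gaowenliang/imu_utils repositories and a standard catkin workspace (adapt paths and dependencies such as Ceres to your setup); note that code_utils must be built before imu_utils:

cd ~/catkin_ws/src
git clone https://github.com/gaowenliang/code_utils.git
cd .. && catkin_make                  # build code_utils first
cd src
git clone https://github.com/gaowenliang/imu_utils.git
cd .. && catkin_make                  # then build imu_utils, which depends on code_utils
source devel/setup.bash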

1.2.kalibr Install

Kalibr itself is installed by following its official wiki. Calibration order: IMU calibration -> camera calibration -> joint IMU + camera calibration. This order is used because the final joint camera-IMU calibration needs both the IMU and the camera intrinsics. The steps are described in detail below.

2. IMU Calibration

  • step1: Go into the realsense-ros package, start the D435i, and publish the IMU data. You can also make the changes directly on top of rs_camera.launch.
roslaunch realsense2_camera rs_imu_calibration.launch

rs_imu_calibration.launch is modified from rs_camera.launch for IMU calibration: it aligns the acc and gyro data so they are published on a single topic.
Modified:

<!-- original version: <arg name="unite_imu_method"          default=""/> -->
<arg name="unite_imu_method"          default="linear_interpolation"/>
<!-- or set the parameter to copy -->
<arg name="unite_imu_method"          default="copy"/>
  • step2: Record an IMU data bag
 rosbag record -O imu_calibration /camera/imu
  • step3: Run the calibration program

Based on A3.launch in the imu_utils folder, write a D435i calibration launch file, d435i_imu_calibration.launch. Note: remember to adjust the max_time_min parameter. The default is 120, i.e. two hours; if the IMU data in the bag is shorter than that, the program never proceeds and stays stuck at "wait for imu data". For some reason my recording stopped by itself after roughly one hour, so the actual data-duration parameter was set to 50.

d435i_imu_calibration.launch:

<launch>

    <node pkg="imu_utils" type="imu_an" name="imu_an" output="screen">
        <!-- imu_topic: must match the topic published above -->
        <param name="imu_topic" type="string" value= "/camera/imu"/>
        <!-- imu_name: can be anything -->
        <param name="imu_name" type="string" value= "d435i"/>
        <!-- path where the calibration results are saved -->
        <param name="data_save_path" type="string" value= "$(find imu_utils)/data/"/>
        <!-- length of the recorded data, in minutes -->
        <param name="max_time_min" type="int" value= "120"/>
        <!-- number of cluster points for the Allan-variance computation (the terminal output below prints "num of Cluster 200"); the actual IMU rate can be checked with rostopic hz /camera/imu -->
        <param name="max_cluster" type="int" value= "200"/>
    </node>
    
</launch>

Run the calibration launch file:

roslaunch imu_utils d435i_imu_calibration.launch
  • step4: Replay the data bag
rosbag play -r 200 imu_calibration.bag  # play back at 200x speed; the IMU recording is long, and playing it at normal speed would take far too long
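You can also confirm beforehand that the bag really covers the max_time_min set in the launch file, and check the IMU rate, for example:

rosbag info imu_calibration.bag   # shows the bag duration and the message count on /camera/imu
rostopic hz /camera/imu           # (with the camera running) shows the actual IMU publishing rate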

Calibration result, imu_utils/data/d435i_imu_param.yaml:

%YAML:1.0
---
type: IMU
name: d435i
Gyr:
   unit: " rad/s"
   avg-axis: # average over the three axes, used as the calibration result
      gyr_n: 2.1732068912927271e-03 # white noise (noise density)
      gyr_w: 1.7797900203083191e-05 # bias random walk
   x-axis:
      gyr_n: 2.1391701666780305e-03
      gyr_w: 1.9895954849984289e-05
   y-axis:
      gyr_n: 2.5929821451625827e-03
      gyr_w: 1.9897115334756576e-05
   z-axis:
      gyr_n: 1.7874683620375685e-03
      gyr_w: 1.3600630424508712e-05
Acc:
   unit: " m/s^2"
   avg-axis:  # average over the three axes, used as the calibration result
      acc_n: 2.3786845794688424e-02  # white noise (noise density)
      acc_w: 5.9166889270489845e-04  # bias random walk
   x-axis:
      acc_n: 2.1534587157119929e-02
      acc_w: 6.2088900106164286e-04
   y-axis:
      acc_n: 2.2044805094743814e-02
      acc_w: 6.2033852597901415e-04
   z-axis:
      acc_n: 2.7781145132201524e-02
      acc_w: 5.3377915107403855e-04

Terminal output:

[ INFO] [1583551991.407318058]: Loaded imu_topic: /camera/imu
[ INFO] [1583551991.409118658]: Loaded imu_name: d435i
[ INFO] [1583551991.410842690]: Loaded data_save_path: /home/ipsg/D435I/imu_utils/src/imu_utils/data/
[ INFO] [1583551991.412641903]: Loaded max_time_min: 60
[ INFO] [1583551991.414503358]: Loaded max_cluster: 200
gyr x  num of Cluster 200
gyr y  num of Cluster 200
gyr z  num of Cluster 200
acc x  num of Cluster 200
acc y  num of Cluster 200
acc z  num of Cluster 200
wait for imu data.
gyr x  numData 1379952
gyr x  start_t 1.58355e+09
gyr x  end_t 1.58355e+09
gyr x dt 
-------------3600.83 s
-------------60.0138 min
-------------1.00023 h
gyr x  freq 383.231
gyr x  period 0.00260939
gyr y  numData 1379952
gyr y  start_t 1.58355e+09
gyr y  end_t 1.58355e+09
gyr y dt 
-------------3600.83 s
-------------60.0138 min
-------------1.00023 h
gyr y  freq 383.231
gyr y  period 0.00260939
gyr z  numData 1379952
gyr z  start_t 1.58355e+09
gyr z  end_t 1.58355e+09
gyr z dt 
-------------3600.83 s
-------------60.0138 min
-------------1.00023 h
gyr z  freq 383.231
gyr z  period 0.00260939
Gyro X 
C   -1.06724    30.4679   -8.79353    1.24469 -0.0272342
 Bias Instability 2.04796e-05 rad/s
 Bias Instability 1.9896e-05 rad/s, at 110.646 s
 White Noise 7.64079 rad/s
 White Noise 0.00213917 rad/s
  bias -0.147923 degree/s
-------------------
Gyro y 
C   -1.50544    38.3445   -8.85277    1.13221 -0.0212569
 Bias Instability 1.41036e-05 rad/s
 Bias Instability 1.98971e-05 rad/s, at 110.646 s
 White Noise 9.31086 rad/s
 White Noise 0.00259298 rad/s
  bias 0.01896 degree/s
-------------------
Gyro z 
C  -0.866715     24.625   -5.52922    0.72493 -0.0121455
 Bias Instability 6.32623e-06 rad/s
 Bias Instability 1.36006e-05 rad/s, at 96.9284 s
 White Noise 6.42775 rad/s
 White Noise 0.00178747 rad/s
  bias -0.0351278 degree/s
-------------------
==============================================
==============================================
acc x  numData 1379952
acc x  start_t 1.58355e+09
acc x  end_t 1.58355e+09
acc x dt 
-------------3600.83 s
-------------60.0138 min
-------------1.00023 h
acc x  freq 383.231
acc x  period 0.00260939
acc y  numData 1379952
acc y  start_t 1.58355e+09
acc y  end_t 1.58355e+09
acc y dt 
-------------3600.83 s
-------------60.0138 min
-------------1.00023 h
acc y  freq 383.231
acc y  period 0.00260939
acc z  numData 1379952
acc z  start_t 1.58355e+09
acc z  end_t 1.58355e+09
acc z dt 
-------------3600.83 s
-------------60.0138 min
-------------1.00023 h
acc z  freq 383.231
acc z  period 0.00260939
acc X 
C -3.05601e-05   0.00142723 -0.000646156  0.000264777 -1.39214e-06
 Bias Instability 0.000620889 m/s^2
 White Noise 0.0215346 m/s^2
-------------------
acc y 
C -3.20282e-05   0.00142617 -0.000709606   0.00030218 -3.36171e-06
 Bias Instability 0.000620339 m/s^2
 White Noise 0.0220448 m/s^2
-------------------
acc z 
C  -1.2485e-05   0.00135554  0.000129763 -1.62121e-05   3.7871e-06
 Bias Instability 0.000533779 m/s^2
 White Noise 0.0277811 m/s^2
-------------------
[imu_an-2] process has finished cleanly

You can of course also calibrate online without recording and replaying, i.e. only steps 1 and 3 are needed. The calibration produces one yaml file and many txt files. The yaml file is the important one: it gives the noise_density and random_walk of the accelerometer and gyroscope for each of the three axes, together with their averages. Those averages are needed later for the joint IMU + camera calibration.

The default IMU and camera parameters can be inspected through the official topics (acc and gyro are published separately):

rostopic echo /camera/gyro/imu_info
rostopic echo /camera/accel/imu_info
rostopic echo /camera/color/camera_info # camera intrinsics

[Screenshot: output of the three commands above]

From this output it appears that the IMU is not actually calibrated at the factory. The factory camera intrinsics:

height: 480
width: 640
distortion_model: "plumb_bob"
D: [0.0, 0.0, 0.0, 0.0, 0.0]
K: [616.1290893554688, 0.0, 319.9371032714844, 0.0, 616.3303833007812, 240.64352416992188, 0.0, 0.0, 1.0]
R: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
P: [616.1290893554688, 0.0, 319.9371032714844, 0.0, 0.0, 616.3303833007812, 240.64352416992188, 0.0, 0.0, 0.0, 1.0, 0.0]

Allan deviation curves:
[Figures: Allan deviation curves for the gyroscope and accelerometer]

3. Camera Calibration

Kalibr is used to calibrate the camera intrinsics and the relative poses between the cameras (the extrinsics). You need a calibration target supported by Kalibr: see the Calibration Targets page in the Kalibr wiki and download the Aprilgrid target there; it provides the target pattern and the corresponding parameters. You can also have targets of different sizes made. The concrete steps:

  • step 0: Install Kalibr. Use Kalibr's target-generation script to create an Aprilgrid of the desired size, then display the resulting pdf on a computer screen at its true size. This avoids the hassle of printing, and because the target then lies on a perfectly flat plane it does not introduce extra calibration error.
kalibr_create_target_pdf --type apriltag --nx 6 --ny 6 --tsize 0.02 --tspace 0.28571429
#kalibr_create_target_pdf --type apriltag --nx [NUM_COLS] --ny [NUM_ROWS] --tsize [TAG_WIDTH_M] --tspace [TAG_SPACING_PERCENT]

The target yaml file has to be edited to match the dimensions you chose. Error encountered during this step: ImportError: No module named pyx. Fix: the pyx Python module used by the pdf-generation script is missing and has to be installed.
Example of the modified yaml file:

target_type: 'aprilgrid' #gridtype
tagCols: 6               #number of apriltags
tagRows: 6               #number of apriltags
tagSize: 0.02           #size of apriltag, edge to edge [m]
tagSpacing: 0.28571429          #ratio of space between tags to tagSize
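For reference, tagSpacing is the ratio between the gap separating neighbouring tags and the tag edge length, so with tagSize 0.02 m and tagSpacing 0.28571429 the physical gap is 0.02 × 0.28571429 ≈ 0.0057 m. The ratio is scale-invariant, so if you display or print the target at a different scale you only need to remeasure tagSize.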
  • step 1: Go into the realsense-ros package, start the D435i, and publish the camera image topics
roslaunch realsense2_camera rs_camera.launch                 

Note: the D435i has an infrared emitter that projects many small IR dots. With the emitter on you will see lots of bright speckles in rviz, which can interfere with calibration, so turn the emitter off while calibrating. To turn it off: change the setting in realsense-viewer; the setting still takes effect when the camera is later opened through the ROS wrapper. This is also very useful later on, for example when configuring filters.
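Depending on the realsense-ros version, the emitter can usually also be toggled from ROS through dynamic_reconfigure while the node is running; treat the parameter path below as a sketch and check the names actually exposed by your driver:

rosrun dynamic_reconfigure dynparam set /camera/stereo_module emitter_enabled 0   # 0 = off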

  • step2: Place the AprilGrid calibration target at a reasonable distance in front of the camera, then move it slowly so that every camera can see the target

Do not place it too far away, otherwise the target features cannot be detected; the calibration needs enough images with detected target features or it cannot run at all. Do not move the target so fast that motion blur appears; all we need are images from different positions and angles in which the target is sharp and fully visible. Cover as many angles and positions as possible (up, down, left, right), even out to the edges of the image; moving the target for about one minute is enough.

  • step3: Throttle the image topics and record an image bag

When processing calibration data, Kalibr does not want an overly high image rate, typically around 4 Hz (a later computation step reported errors, so 20 Hz was used instead). Use the following commands to limit the image rate:

rosrun topic_tools throttle messages /camera/color/image_raw 20 /color
rosrun topic_tools throttle messages /camera/infra2/image_rect_raw 20 /infra_left
rosrun topic_tools throttle messages /camera/infra1/image_rect_raw 20 /infra_right

Then record (about two minutes were recorded):

rosbag record -O multicameras_calibration /infra_left /infra_right /color
  • step4: Run Kalibr to compute the intrinsics and extrinsics of every camera
kalibr_calibrate_cameras --target ../yaml/april_6x6_A4.yaml --bag ./bag/0_multicameras_calibration.bag --models pinhole-equi pinhole-equi pinhole-equi --topics /infra_left /infra_right /color --bag-from-to 10 100

Or, with a checkerboard target:

kalibr_calibrate_cameras --target checkerboard_8x11_30x30cm.yaml --bag ./bag/0_multicameras_calibration.bag --models pinhole-equi pinhole-equi omni-radtan omni-radtan --topics /cam0/image_raw /cam1/image_raw /cam2/image_raw /cam3/image_raw

Command notes:
april_6x6_A4.yaml: the target parameter file, i.e. the dimensions of the calibration target; since I printed the target scaled down onto A4 paper, these parameters had to be adjusted. pinhole-equi: the chosen camera model; Kalibr offers several camera models to pick from. --bag-from-to selects a time interval, since not all of the recording is necessarily usable. The computation takes a long time, a really long time; at the end it outputs a pdf and a txt file containing the intrinsic and extrinsic results.

Calibrating only the main camera:

kalibr_calibrate_cameras --target ../yaml/april_6x6_A4.yaml --bag ./bag/0_multicameras_calibration.bag --model pinhole-equi  --topic  /color  --show-extraction --approx-sync 0.04

What was finally used was the multi-camera calibration:

kalibr_calibrate_cameras --target april_6x6_A4.yaml --bag multicameras_calibration.bag --models pinhole-equi pinhole-equi pinhole-equi --topics /infra_left /infra_right /color  --show-extraction --approx-sync 0.04 --bag-from-to 10 100
# --bag-from-to 10 100 uses only the data between 10 s and 100 s

Errors encountered along the way:

(1). err1: ImportError: cannot import name NavigationToolbar2Wx. This comes from Kalibr's Python plotting code and a matplotlib version mismatch; it can be fixed in the Python sources (a commonly reported fix is switching the import to NavigationToolbar2WxAgg), and no recompilation is needed.
(2). err2:

Cameras are not connected through mutual observations, please check the dataset. Maybe adjust the approx. sync. tolerance.
Traceback (most recent call last):
  File "/home/ipsg/D435I/kalibr_ws/devel/bin/kalibr_calibrate_cameras", line 15, in <module>
    exec(compile(fh.read(), python_script, 'exec'), context)
  File "/home/ipsg/D435I/kalibr_ws/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 447, in <module>
    main()
  File "/home/ipsg/D435I/kalibr_ws/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 204, in main
    graph.plotGraph()
  File "/home/ipsg/D435I/kalibr_ws/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_camera_calibration/MulticamGraph.py", line 311, in plotGraph
    edge_label=self.G.es["weight"],
KeyError: 'Attribute does not exist'

Fix: make sure all cameras actually observe the target at the same time and, as the error message suggests, adjust the --approx-sync tolerance.
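For example, re-running the multi-camera calibration from above with a larger synchronization tolerance (0.1 is only an illustrative value; the other arguments are unchanged):

kalibr_calibrate_cameras --target april_6x6_A4.yaml --bag multicameras_calibration.bag --models pinhole-equi pinhole-equi pinhole-equi --topics /infra_left /infra_right /color --approx-sync 0.1 --bag-from-to 10 100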

Generated reports:

(1).PDF
[Figure: page from the camera calibration report PDF]

(2).TXT

Calibration results 
====================
Camera-system parameters:
	cam0 (/infra_left):
	 type: <class 'aslam_cv.libaslam_cv_python.EquidistantDistortedPinholeCameraGeometry'>
	 distortion: [ 0.40059551 -0.73828444  4.12512287 -6.18517722] +- [ 0.0096611   0.06314745  0.09801433  0.08414741]
	 projection: [ 386.71250573  387.18240561  320.55905537  239.8204248 ] +- [ 0.14888357  0.16540833  0.34725177  0.27395728]
	 reprojection error: [-0.000012, -0.000007] +- [0.148506, 0.106890]

	cam1 (/infra_right):
	 type: <class 'aslam_cv.libaslam_cv_python.EquidistantDistortedPinholeCameraGeometry'>
	 distortion: [ 0.36572062 -0.38257775  3.28883181 -6.72070632] +- [ 0.01450564  0.12847927  0.29582794  0.15893508]
	 projection: [ 386.94988374  387.35006645  321.07250734  240.44316048] +- [ 0.13223764  0.16242541  0.3057414   0.26940346]
	 reprojection error: [0.000004, -0.000005] +- [0.164897, 0.108821]

	cam2 (/color):
	 type: <class 'aslam_cv.libaslam_cv_python.EquidistantDistortedPinholeCameraGeometry'>
	 distortion: [  0.24904226   2.8043341  -15.57145308  26.7679881 ] +- [ 0.0099252   0.09318402  0.2472954   0.11722548]
	 projection: [ 611.67156667  611.71826379  323.28627384  240.57596379] +- [ 0.10541019  0.13846652  0.29617104  0.23250236]
	 reprojection error: [0.000003, -0.000001] +- [0.207269, 0.167030]

	baseline T_1_0:
	 q: [-0.00053684  0.00066745  0.00003144  0.99999963] +- [ 0.00122298  0.00096443  0.00015755]
	 t: [ 0.04996444  0.00001392  0.0000511 ] +- [ 0.00013475  0.00012981  0.00032963]

	baseline T_2_1:
	 q: [ 0.00164081 -0.00396511  0.00180494  0.99998916] +- [ 0.00097517  0.00060299  0.00012675]
	 t: [ 0.01598901 -0.00016613  0.00066788] +- [ 0.00010335  0.00010625  0.00027807]



Target configuration
====================

  Type: aprilgrid
  Tags: 
    Rows: 6
    Cols: 6
    Size: 0.021 [m]
    Spacing 0.00614285721 [m]

4. IMU + Camera Joint Calibration

For the camera-IMU calibration you can calibrate several cameras together with the IMU, or just a single camera. Since the cameras were already calibrated above, you can, if you prefer, calibrate only the relative pose between the main camera (the one used as the reference frame) and the IMU.

  • step0: Prepare the IMU and camera configuration files, filling the previously calibrated values into the yaml files Kalibr expects.

Copy the averaged Acc and Gyr values into the corresponding Kalibr imu yaml file:

rostopic: /camera/imu  # the topic will be renamed later, so change this to rostopic: /imu
update_rate: 200.0 #Hz
 
accelerometer_noise_density: 2.3786845794688424e-02  # white noise
accelerometer_random_walk: 5.9166889270489845e-04  # bias random walk
gyroscope_noise_density: 2.1732068912927271e-03 # continuous-time
gyroscope_random_walk: 1.7797900203083191e-05

For monocular + IMU calibration, copy the block for the single camera out of camchain-multicameras_calibration.yaml into its own file, e.g. /color + /imu: color_d435i.yaml; for multi-camera + IMU calibration use camchain-multicameras_calibration.yaml directly.

cam0:
  T_cn_cnm1:
  - [0.9999620402124648, 0.0035968244723977626, 0.007936056188530766, 0.015989010745942417]
  - [-0.0036228484455317336, 0.9999880998809918, 0.003267271879129718, -0.0001661290746026598]
  - [-0.007924209945064884, -0.003295898983009658, 0.9999631712951373, 0.0006678796643165944]
  - [0.0, 0.0, 0.0, 1.0]
  cam_overlaps: [0, 1]
  camera_model: pinhole
  distortion_coeffs: [0.24904225693789128, 2.804334100694488, -15.571453079619436,
    26.767988102446466]
  distortion_model: equidistant
  intrinsics: [611.6715666677603, 611.7182637869023, 323.2862738374678, 240.57596378988336]
  resolution: [640, 480]
  rostopic: /color
  • step1: Go into the realsense-ros package, start the D435i, and publish the camera image and IMU topics with rs_camera.launch; rs_camera.launch needs to be modified.
    Modified:
...
<arg name="enable_sync"               default="true"/>,  
...
 <arg name="unite_imu_method"          default="copy"/>

Purpose: one change keeps the IMU and image messages synchronized, the other outputs a single merged IMU stream.

  • step2: Keep the calibration target fixed and, while making sure the cameras can still extract its features, move the D435i through a wide range of orientations and positions; record a data bag.

See Kalibr's example video for the motion: first face the target, then, while keeping it in view, rotate through pitch, yaw and roll in turn, then translate forward/backward, left/right and up/down, so the IMU is well excited. Because Kalibr recommends 200 Hz IMU data and 20 Hz images, the rates are limited with topic_tools throttle.

Note: after limiting a topic with topic_tools throttle, always check the actual output rate with rostopic hz /topic. You will find that the actual rate does not match the requested one, so you may have to request a different value. For example, with rosrun topic_tools throttle messages /topic_1 25 /topic_2, if /topic_1 runs at 40 Hz, /topic_2 may end up at 20 Hz rather than 25 Hz. I do not know the exact reason; if you do, please leave a comment, thanks!

Make absolutely sure the target remains clearly visible in the images, and keep the whole recording as short as possible; otherwise the subsequent optimization takes a very long time.

Kalibr does not want an overly high image rate when processing the calibration data, so throttle the images to 20 Hz and the IMU to 200 Hz. Use the following commands:

rosrun topic_tools throttle messages /camera/imu  200 /imu
rosrun topic_tools throttle messages /camera/color/image_raw 20 /color
rosrun topic_tools throttle messages /camera/infra2/image_rect_raw 20 /infra_left
rosrun topic_tools throttle messages /camera/infra1/image_rect_raw 20 /infra_right
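As noted above, verify the actual rates after throttling before you start recording, for example:

rostopic hz /imu      # should be close to 200 Hz
rostopic hz /color    # should be close to 20 Hz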

Then record (about two minutes were recorded):

rosbag record -O camera_imu_calibration   /color /imu  /infra_left  /infra_right  # multi-camera
rosbag record -O camera_imu_calibration   /color /imu # monocular
  • step3: Run Kalibr to compute the IMU-camera extrinsics

Monocular + IMU:

kalibr_calibrate_imu_camera --target april_6x6_A4.yaml --cam  color_d435i.yaml --imu imu_d435i.yaml --bag imu_cameras_calibration.bag --bag-from-to 10 100 --max-iter 15  --show-extraction

Monocular + IMU, checkerboard target:

 kalibr_calibrate_imu_camera --target checkerboard_8x11_30x30cm.yaml --cam ./20200528calibr/color_d435i.yaml --imu ./20200528calibr/imu_d435i.yaml --bag  ./0529calibr/camera_imu_calibration.bag  --imu-models scale-misalignment --max-iter 15  --show-extraction 

Multi-camera + IMU:

kalibr_calibrate_imu_camera --target april_6x6_A4.yaml --cam camchain-multicameras_calibration.yaml --imu imu_d435i.yaml --bag  camera_imu_calibration.bag  --bag-from-to 10 100 --max-iter 15  --show-extraction

Option notes:

--bag-from-to 10 100: use only the data between 10 s and 100 s.
--max-iter 15: limit the optimization to 15 iterations (the default is 30).
--show-extraction: visualize the feature extraction.

In the multi-camera case the optimization failed because the problem (the matrix dimensions) became too large:


Calibration report:

(1).PDF
[Figure: page from the camera-IMU calibration report PDF]

(2).TXT

Calibration results
===================
Normalized Residuals
----------------------------
Reprojection error (cam0):     mean 0.309357318376, median 0.251843164322, std: 0.230997537619
Gyroscope error (imu0):        mean 0.265159164537, median 0.228561532289, std: 0.176229173919
Accelerometer error (imu0):    mean 0.221421363455, median 0.195499299021, std: 0.128105231257

Residuals
----------------------------
Reprojection error (cam0) [px]:     mean 0.309357318376, median 0.251843164322, std: 0.230997537619
Gyroscope error (imu0) [rad/s]:     mean 0.00814934517662, median 0.00702456135722, std: 0.00541618982304
Accelerometer error (imu0) [m/s^2]: mean 0.0744854379606, median 0.065765338454, std: 0.0430941897671

Transformation (cam0):
-----------------------
T_ci:  (imu0 to cam0): 
[[ 0.99996271 -0.00181009  0.00844446  0.02651246]
 [ 0.00171691  0.9999377   0.01102943  0.00830124]
 [-0.0084639  -0.01101452  0.99990352 -0.01286511]
 [ 0.          0.          0.          1.        ]]

T_ic:  (cam0 to imu0): 
[[ 0.99996271  0.00171691 -0.0084639  -0.02663461]
 [-0.00181009  0.9999377  -0.01101452 -0.00839444]
 [ 0.00844446  0.01102943  0.99990352  0.01254843]
 [ 0.          0.          0.          1.        ]]

timeshift cam0 to imu0: [s] (t_imu = t_cam + shift)
-0.0253737394755


Gravity vector in target coords: [m/s^2]
[ 0.25486723 -9.78147004 -0.65292373]


Calibration configuration
=========================

cam0
-----
  Camera model: pinhole
  Focal length: [611.6715666677603, 611.7182637869023]
  Principal point: [323.2862738374678, 240.57596378988336]
  Distortion model: equidistant
  Distortion coefficients: [0.24904225693789128, 2.804334100694488, -15.571453079619436, 26.767988102446466]
  Type: aprilgrid
  Tags: 
    Rows: 6
    Cols: 6
    Size: 0.021 [m]
    Spacing 0.00614285721 [m]



IMU configuration
=================

IMU0:
----------------------------
  Model: calibrated
  Update rate: 200.0
  Accelerometer:
    Noise density: 0.0237868457947 
    Noise density (discrete): 0.336396799289 
    Random walk: 0.000591668892705
  Gyroscope:
    Noise density: 0.00217320689129
    Noise density (discrete): 0.0307337865951 
    Random walk: 1.77979002031e-05
  T_i_b
    [[ 1.  0.  0.  0.]
     [ 0.  1.  0.  0.]
     [ 0.  0.  1.  0.]
     [ 0.  0.  0.  1.]]
  time offset with respect to IMU0: 0.0 [s]

After the multi-camera calibration and the joint calibration, Kalibr produces a calibration report together with the estimated extrinsics. The report includes a diagram of the camera poses in space; comparing it with the actual sensor layout gives a quick, intuitive sanity check of the result. Also look at the reprojection error: the smaller the better. My errors are somewhat large, presumably because the target parameters were not precise. The intrinsics are worth a look as well: on my D435i the two infrared cameras come out with essentially identical intrinsics after calibration, which differ somewhat from the values in /camera_info, though not by much.

5. VINS Yaml File Configuration

realsense config update in 202005292.

%YAML:1.0

#common parameters
imu_topic: "/camera/imu"
image_topic: "/camera/color/image_raw"
output_path: "/home/fb/output/"

#camera calibration
model_type: PINHOLE
camera_name: camera
image_width: 640
image_height: 480
  #TODO modify distortion

distortion_parameters:
   k1: 0.10364469
   k2: -0.1823355
   p1: 0.002330617
   p2: 0.0037446
projection_parameters:
   fx: 601.226091
   fy: 601.3432164
   cx: 332.171979
   cy: 240.5101526

# Extrinsic parameter between IMU and Camera.
estimate_extrinsic: 0   # 0  Have accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam; don't change them.
                        # 1  Have an initial guess about the extrinsic parameters. We will optimize around your initial guess.
                        # 2  Don't know anything about the extrinsic parameters. You don't need to give R,T. We will try to calibrate them. Do some rotation movement at the beginning.
#If you choose 0 or 1, you should write down the following matrix.
#Rotation from camera frame to imu frame, imu^R_cam
extrinsicRotation: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   # data: [  0.99977841,  0.00412757,  0.02064201,
   #         -0.00374241,  0.99981883, -0.01866291,
   #          -0.0207153,   0.01858152,  0.99961273 ] 
   data: [ 0.99998318, -0.00302186, -0.00495021,
           0.00296347,  0.99992645, -0.01176032,
           0.00498539,  0.01174545,  0.99991859 ]   
#Translation from camera frame to imu frame, imu^T_cam
extrinsicTranslation: !!opencv-matrix
   rows: 3
   cols: 1
   dt: d
   # data: [-0.02154582, 0.00681016, 0.02740755]
   data: [-0.0132516, -0.00082214, 0.01535377]



#feature tracker parameters
max_cnt: 200           # max feature number in feature tracking
min_dist: 15            # min distance between two features
freq: 10                # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
F_threshold: 1.0        # ransac threshold (pixel)
show_track: 1           # publish tracking image as topic
equalize: 0             # if the image is too dark or bright, turn on equalize to find enough features
fisheye: 0              # if using a fisheye lens, turn this on. A circle mask will be loaded to remove noisy points near the edge

#optimization parameters
max_solver_time: 0.04  # max solver iteration time (ms), to guarantee real time
max_num_iterations: 8   # max solver iterations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#imu parameters       The more accurate parameters you provide, the better performance
#for handheld, wheeld
acc_n: 0.021793          # accelerometer measurement noise standard deviation. #0.2
gyr_n: 0.00215568        # gyroscope measurement noise standard deviation.     #0.05
acc_w: 0.00050207        # accelerometer bias random walk noise standard deviation.  #0.02
gyr_w: 1.71656e-05      # gyroscope bias random walk noise standard deviation.     #4.0e-5

#for tracked applications
#acc_n: 0.5          # accelerometer measurement noise standard deviation. #0.2
#gyr_n: 0.01         # gyroscope measurement noise standard deviation.     #0.05
#acc_w: 0.001         # accelerometer bias random walk noise standard deviation.  #0.02
#gyr_w: 2.0e-5       # gyroscope bias random walk noise standard deviation.     #4.0e-5


g_norm: 9.805       # gravity magnitude

#loop closure parameters
loop_closure: 1                    # start loop closure
fast_relocalization: 1             # useful in real-time and large project
load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
#pose_graph_save_path: "/home/fb/output/pose_graph/" # save and load path
pose_graph_save_path: "*****"
#unsynchronization parameters
estimate_td: 1                      # online estimate time offset between camera and imu
td: -0.0237                          # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)

#rolling shutter parameters
rolling_shutter: 1                      # 0: global shutter camera, 1: rolling shutter camera
rolling_shutter_tr: 0.033               # unit: s. rolling shutter read out time per frame (from data sheet).

#visualization parameters
save_image: 1                   # save image in pose graph for visualization purpose; you can close this function by setting 0
visualize_imu_forward: 0        # output imu forward propagation to achieve low latency and high frequency results
visualize_camera_size: 0.4      # size of camera marker in RVIZ

Backup of the VINS_ws configuration:

%YAML:1.0

#common parameters
imu_topic: "/camera/imu"
image_topic: "/camera/color/image_raw"
output_path: "/home/ipsg/output/"

#camera calibration 
model_type: PINHOLE
camera_name: camera
image_width: 640
image_height: 480
distortion_parameters:
   k1: 9.2615504465028850e-02
   k2: -1.8082438825995681e-01
   p1: -6.5484100374765971e-04
   p2: -3.5829351558557421e-04
projection_parameters:
   fx: 616.1290893554688
   fy: 616.3303833007812
   cx: 319.9371032714844
   cy: 240.64352416992188


# Extrinsic parameter between IMU and Camera.
estimate_extrinsic: 2   # 0  Have accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam; don't change them.
                        # 1  Have an initial guess about the extrinsic parameters. We will optimize around your initial guess.
                        # 2  Don't know anything about the extrinsic parameters. You don't need to give R,T. We will try to calibrate them. Do some rotation movement at the beginning.
#If you choose 0 or 1, you should write down the following matrix.
#Rotation from camera frame to imu frame, imu^R_cam
extrinsicRotation: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [ 0.99964621,  0.01105994,  0.02418954,
           -0.01088975,  0.9999151,  -0.00715601, 
           -0.02426663,  0.00689006,  0.99968178]
#Translation from camera frame to imu frame, imu^T_cam
extrinsicTranslation: !!opencv-matrix
   rows: 3
   cols: 1
   dt: d
   data: [0.07494282, -0.01077138, -0.00641822]

#feature tracker parameters
max_cnt: 150            # max feature number in feature tracking
min_dist: 25            # min distance between two features
freq: 10                # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
F_threshold: 1.0        # ransac threshold (pixel)
show_track: 1           # publish tracking image as topic
equalize: 0             # if the image is too dark or bright, turn on equalize to find enough features
fisheye: 0              # if using a fisheye lens, turn this on. A circle mask will be loaded to remove noisy points near the edge

#optimization parameters
max_solver_time: 0.04  # max solver iteration time (ms), to guarantee real time
max_num_iterations: 8   # max solver iterations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#imu parameters       The more accurate parameters you provide, the better performance
#acc_n: 0.1          # accelerometer measurement noise standard deviation. #0.2
#gyr_n: 0.01         # gyroscope measurement noise standard deviation.     #0.05
#acc_w: 0.0002         # accelerometer bias random walk noise standard deviation.  #0.02
#gyr_w: 2.0e-5       # gyroscope bias random walk noise standard deviation.     #4.0e-5
#g_norm: 9.805       # gravity magnitude
acc_n: 0.2          # accelerometer measurement noise standard deviation. #0.2
gyr_n: 0.05         # gyroscope measurement noise standard deviation.     #0.05
acc_w: 0.02         # accelerometer bias random walk noise standard deviation.  #0.02
gyr_w: 4.0e-5       # gyroscope bias random walk noise standard deviation.     #4.0e-5
g_norm: 9.81       # gravity magnitude



#loop closure parameters
loop_closure: 1                    # start loop closure
fast_relocalization: 1             # useful in real-time and large project
load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
pose_graph_save_path: "/home/tony-ws1/output/pose_graph/" # save and load path

#unsynchronization parameters
estimate_td: 1                      # online estimate time offset between camera and imu
td: 0.000                           # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)

#rolling shutter parameters
rolling_shutter: 0                      # 0: global shutter camera, 1: rolling shutter camera
rolling_shutter_tr: 0             # unit: s. rolling shutter read out time per frame (from data sheet). 

#visualization parameters
save_image: 1                   # save image in pose graph for visualization purpose; you can close this function by setting 0
visualize_imu_forward: 0        # output imu forward propagation to achieve low latency and high frequency results
visualize_camera_size: 0.4      # size of camera marker in RVIZ