UR5 + usb_cam monocular extrinsic calibration under ROS (using easy_handeye, eye-in-hand)

I recently completed an extrinsic calibration of a USB camera mounted on a UR5 arm with the easy_handeye calibration package, using the eye-in-hand setup. This post summarizes the whole procedure for reference and discussion.

1. Environment setup

Hardware: PC, UR5, USB camera
Software: Ubuntu 16.04, ROS Kinetic

Installing the dependencies (make sure the package names match your ROS distro)

  1. usb_cam driver
sudo apt-get install ros-kinetic-usb-cam
  2. aruco_ros
cd ~/catkin_ws/src/
git clone https://github.com/pal-robotics/aruco_ros.git
cd ..
catkin_make install

  3. easy_handeye

Install vision_visp (needed for visp_hand2eye_calibration):

cd ~/catkin_ws/src
git clone -b kinetic-devel https://github.com/lagadic/vision_visp.git
cd ..
catkin_make --pkg visp_hand2eye_calibration

Install easy_handeye:

cd ~/catkin_ws/src
git clone https://github.com/IFL-CAMP/easy_handeye
cd ..
catkin_make
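
After both builds finish it can help to source the workspace and run a quick sanity check before moving on. This is only a sketch; it assumes the default ~/catkin_ws path used above and that the camera works with usb_cam's stock usb_cam-test.launch.

# Make the freshly built packages visible to ROS
source ~/catkin_ws/devel/setup.bash

# Both packages should resolve to a path
rospack find aruco_ros
rospack find easy_handeye

# Optional: verify the camera driver brings up an image stream
roslaunch usb_cam usb_cam-test.launch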

2. Modifying the calibration launch file

Start from the example launch file shipped with the package; on Ubuntu it is located at …/easy_handeye/docs/example_launch/ur5_kinect_calibration.launch.
After the modifications below, this single launch file starts the camera driver, the ArUco tracker, the UR5 driver with MoveIt!, and the easy_handeye calibration nodes:

<launch>
    <arg name="namespace_prefix" default="ur5_kinect_handeyecalibration" />

    <arg name="robot_ip" doc="The IP address of the UR5 robot" />

    <!--arg name="marker_size" doc="Size of the ArUco marker used, in meters" /-->
    <!--arg name="marker_id" doc="The ID of the ArUco marker used" /-->

    <!-- from aruco pkg -->
    <arg name="marker_id"        default="582"/>
    <arg name="marker_size"      default="0.036"/> 

    <!-- start the usb_cam-->
    <include file="$(find usb_cam)/launch/usb_cam-test.launch" />

    <!-- start ArUco -->
    <node name="aruco_tracker" pkg="aruco_ros" type="single">
        <remap from="/camera_info" to="/usb_cam/camera_info" />        //换成usb-cam包对应的话题
        <remap from="/image" to="/usb_cam/image_raw" />                //同上
        <param name="image_is_rectified" value="true"/>
        <param name="marker_size"        value="$(arg marker_size)"/>          
        <param name="marker_id"          value="$(arg marker_id)"/>            
        <param name="reference_frame"    value="usb_cam"/>
        <param name="camera_frame"       value="usb_cam"/>
        <param name="marker_frame"       value="aruco_marker_frame" />
    </node>

    <!-- start the robot -->
    <include file="$(find ur_modern_driver)/launch/ur5_bringup.launch">
        <arg name="limited" value="true" />
        <arg name="robot_ip" value="192.168.1.101" />                           
    </include>
    <include file="$(find ur5_moveit_config)/launch/ur5_moveit_planning_execution.launch">
        <arg name="limited" value="true" />
    </include>

    <!-- start easy_handeye -->
    <include file="$(find easy_handeye)/launch/calibrate.launch" >
        <arg name="namespace_prefix" value="$(arg namespace_prefix)" />
        <arg name="eye_on_hand" value="true" />
        <arg name="tracking_base_frame" value="usb_cam" />                 //同"reference_frame"
        <arg name="tracking_marker_frame" value="aruco_marker_frame" />    //二维码坐标系
        <arg name="robot_base_frame" value="base_link" />
        <arg name="robot_effector_frame" value="wrist_3_link" />

        <arg name="freehand_robot_movement" value="false" />
        <arg name="robot_velocity_scaling" value="0.5" />
        <arg name="robot_acceleration_scaling" value="0.2" />
    </include>

</launch>
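
With the modifications in place, the whole pipeline can be started from this one file. A minimal sketch of the invocation, assuming the file is still sitting in easy_handeye's docs/example_launch directory (launching it by path avoids installing it into a package's launch folder); replace the IP with your robot's address:

# Starts the camera, ArUco tracker, UR5 driver, MoveIt! and easy_handeye together
roslaunch ~/catkin_ws/src/easy_handeye/docs/example_launch/ur5_kinect_calibration.launch robot_ip:=192.168.1.101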


3. Hand-eye calibration procedure

Run ur5_kinect_calibration.launch to start the calibration. It brings up the two windows below plus an RViz window.
Figure 1: the window with the robot-movement controls (check starting pose, next pose, plan, execute) and the compute button
Figure 2: the window with the take sample button and the image view
In the window of Figure 2, open the menu bar and choose Plugins → Visualization → Image View, then select /aruco_tracker/result to get the view shown in the figure.

1. Use the teach pendant to move the arm so that the marker sits roughly in the center of the camera's field of view.
2. In the window of Figure 1, click check starting pose; if the pose is accepted, the counter shows 0/17.
3. Click next pose, then plan; if the planned motion is shown in green, click execute.
4. In the window of Figure 2, click take sample. Repeat steps 3 and 4 until all 17 samples have been taken.
5. In the window of Figure 1, click compute; the resulting transform is displayed as seven values, a translation plus a quaternion (see the sketch below for where the result can be saved).
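
The GUI also has a save button; as far as I can tell, easy_handeye writes the result to a YAML file under ~/.ros/easy_handeye/, named after the calibration namespace. A sketch for checking it, assuming the namespace_prefix from the launch file above (the exact file name may differ between versions):

# List saved hand-eye calibrations (default save location used by easy_handeye)
ls ~/.ros/easy_handeye/
# Inspect the result: translation (x, y, z) plus rotation quaternion (x, y, z, w)
cat ~/.ros/easy_handeye/ur5_kinect_handeyecalibration_eye_on_hand.yaml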

Next, this transform has to be published. publish.launch needs a small change to publish it correctly: give the "eye_on_hand" arg the value "true", which marks the transform as eye-in-hand.
Otherwise the publisher fails at startup with an error like the one below.
[Screenshot: error raised when eye_on_hand is not set]

<launch>
    <arg name="eye_on_hand" doc="eye-on-hand instead of eye-on-base" value="true"/>   //加一个value="true"即可
    <arg name="namespace_prefix" default="easy_handeye" />
    <arg if="$(arg eye_on_hand)" name="namespace" value="$(arg namespace_prefix)_eye_on_hand" />
    <arg unless="$(arg eye_on_hand)" name="namespace" value="$(arg namespace_prefix)_eye_on_base" />

    <!--it is possible to override the link names saved in the yaml file in case of name clashes, for example-->
    <arg if="$(arg eye_on_hand)" name="robot_effector_frame" default="" />
    <arg unless="$(arg eye_on_hand)" name="robot_base_frame" default="" />
    <arg name="tracking_base_frame" default="" />
    
    <arg name="inverse" default="false" />
    
    <!--publish hand-eye calibration-->
    <group ns="$(arg namespace)">
        <param name="eye_on_hand" value="$(arg eye_on_hand)" />
        <param unless="$(arg eye_on_hand)" name="robot_base_frame" value="$(arg robot_base_frame)" />
        <param if="$(arg eye_on_hand)" name="robot_effector_frame" value="$(arg robot_effector_frame)" />
        <param name="tracking_base_frame" value="$(arg tracking_base_frame)" />
        
        <param name="inverse" value="$(arg inverse)" />
        <node name="$(anon handeye_publisher)" pkg="easy_handeye" type="publish.py" output="screen"/>
    </group>
</launch>
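
One detail worth double checking (this is my understanding of how publish.py locates the saved result, not something stated above): the namespace_prefix used when publishing should match the one used during calibration, since the result file is looked up by namespace. Launching publish.launch on its own would then look roughly like:

# Sketch: pass the same namespace_prefix that was used for calibration
roslaunch easy_handeye publish.launch namespace_prefix:=ur5_kinect_handeyecalibration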

Finally, include this publish.launch in your own launch file and the calibrated transform, i.e. the transform between the end effector (wrist_3_link) and the camera frame (usb_cam), will be published. Here is my launch file, with publish.launch included at the end:

<launch>

   <!-- start the UR5-->
   <include file="$(find ur_modern_driver)/launch/ur5_bringup.launch">
        <arg name="limited" value="true" />
        <arg name="robot_ip" value="192.168.1.101" />
   </include>

   <include file="$(find ur5_moveit_config)/launch/ur5_moveit_planning_execution.launch">
      <arg name="limited" value="true" />
   </include>
 
   <include file="$(find ur5_moveit_config)/launch/moveit_rviz.launch">
      <arg name="config" value="true" />
   </include>

   <!-- start the aruco-->
   <include file="$(find aruco_ros)/launch/single.launch">
      <!--arg name="config" value="true" /-->
   </include>

   <!-- start usb_cam-->

   <include file="$(find usb_cam)/launch/usb_cam-test.launch">
   </include>

   <!-- start the calibration publish-->
   <include file="$(find easy_handeye)/launch/publish.launch">
  </include>

</launch>
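
Once this launch file is running, the calibration can be verified from the command line by querying TF for the two frames used above (wrist_3_link and usb_cam); the echoed translation and rotation should match the computed result:

# Print the published hand-eye transform (end effector -> camera)
rosrun tf tf_echo wrist_3_link usb_cam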

4. Results

Start the launch file and RViz shows the following:
[Screenshot: RViz showing the detected marker and the calibrated usb_cam frame]
The marker is detected, but because two of the sample poses could not be captured (the marker drifted out of the field of view, so no sample could be taken there), the estimated usb_cam pose deviates slightly from the true camera position.

That's the whole procedure. Comments and discussion are welcome.
