Modeling Practice 4: Simple Simulation in Gazebo
Building on the optimized physical simulation model, we have already added the robot's collision properties and inertial parameters, as well as the Transmissions and the gazebo_ros_control plugin. The next step is to write the configuration file for the robot's controllers.
1. Creating the Controllers
Step 1: create the controller configuration file control.yaml in the husky_kinova/husky_kinova_control/config folder:
joint_state_controller:
  type: "joint_state_controller/JointStateController"
  publish_rate: 50

husky_diff_drive_controller:
  type: "diff_drive_controller/DiffDriveController"
  left_wheel: ['front_left_wheel', 'rear_left_wheel']
  right_wheel: ['front_right_wheel', 'rear_right_wheel']
  publish_rate: 50
  pose_covariance_diagonal: [0.001, 0.001, 0.001, 0.001, 0.001, 0.03]
  twist_covariance_diagonal: [0.001, 0.001, 0.001, 0.001, 0.001, 0.03]
  cmd_vel_timeout: 0.25
  velocity_rolling_window_size: 2

  # Base frame_id
  base_frame_id: base_link

  # Odometry fused with IMU is published by robot_localization, so
  # no need to publish a TF based on encoders alone.
  enable_odom_tf: false

  # Husky hardware provides wheel velocities
  estimate_velocity_from_position: false

  # Wheel separation and radius multipliers
  wheel_separation_multiplier: 1.875 # default: 1.0
  wheel_radius_multiplier: 1.0       # default: 1.0

  # Velocity and acceleration limits
  # Whenever a min_* is unspecified, default to -max_*
  linear:
    x:
      has_velocity_limits: true
      max_velocity: 1.0       # m/s
      has_acceleration_limits: true
      max_acceleration: 3.0   # m/s^2
  angular:
    z:
      has_velocity_limits: true
      max_velocity: 2.0       # rad/s
      has_acceleration_limits: true
      max_acceleration: 6.0   # rad/s^2

finger_1_position_controller:
  joint: j2n6s300_joint_finger_1
  pid:
    d: 0
    i: 0
    p: 10
  type: effort_controllers/JointPositionController

finger_2_position_controller:
  joint: j2n6s300_joint_finger_2
  pid:
    d: 0
    i: 0
    p: 10
  type: effort_controllers/JointPositionController

finger_3_position_controller:
  joint: j2n6s300_joint_finger_3
  pid:
    d: 0
    i: 0
    p: 10
  type: effort_controllers/JointPositionController

finger_tip_1_position_controller:
  joint: j2n6s300_joint_finger_tip_1
  pid:
    d: 0
    i: 0
    p: 0.5
  type: effort_controllers/JointPositionController

finger_tip_2_position_controller:
  joint: j2n6s300_joint_finger_tip_2
  pid:
    d: 0
    i: 0
    p: 0.5
  type: effort_controllers/JointPositionController

finger_tip_3_position_controller:
  joint: j2n6s300_joint_finger_tip_3
  pid:
    d: 0
    i: 0
    p: 0.5
  type: effort_controllers/JointPositionController

joint_1_position_controller:
  joint: j2n6s300_joint_1
  pid:
    d: 0
    i: 0
    p: 5000
  type: effort_controllers/JointPositionController

joint_2_position_controller:
  joint: j2n6s300_joint_2
  pid:
    d: 0
    i: 0
    p: 5000
  type: effort_controllers/JointPositionController

joint_3_position_controller:
  joint: j2n6s300_joint_3
  pid:
    d: 0
    i: 0
    p: 5000
  type: effort_controllers/JointPositionController

joint_4_position_controller:
  joint: j2n6s300_joint_4
  pid:
    d: 0
    i: 0
    p: 500
  type: effort_controllers/JointPositionController

joint_5_position_controller:
  joint: j2n6s300_joint_5
  pid:
    d: 0
    i: 0
    p: 200
  type: effort_controllers/JointPositionController

joint_6_position_controller:
  joint: j2n6s300_joint_6
  pid:
    d: 0
    i: 0
    p: 500
  type: effort_controllers/JointPositionController
Here joint_state_controller publishes the robot's joint states at 50 Hz.
husky_diff_drive_controller is the Husky's differential-drive controller, which drives its four wheels.
The remaining entries (finger_1_position_controller through finger_tip_3_position_controller and joint_1_position_controller through joint_6_position_controller) are joint position controllers assigned to the individual joints of the Kinova arm. Each one defines PID gains, receives a target position for its joint, and drives that joint toward it.
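Each effort_controllers/JointPositionController turns a position error into a joint effort through its PID gains. The following is a minimal pure-Python sketch of that control law (not the actual ros_control implementation), using the p=5000 gain from joint_1_position_controller above:

```python
class PID:
    """Minimal PID controller mirroring the p/i/d gains in control.yaml."""

    def __init__(self, p, i, d):
        self.p, self.i, self.d = p, i, d
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return the commanded effort for the current position error."""
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.p * error + self.i * self.integral + self.d * deriv


# joint_1 uses p=5000, i=0, d=0, so the effort is simply 5000 * error
pid = PID(p=5000, i=0, d=0)
effort = pid.update(error=0.01, dt=0.02)  # 0.01 rad position error, 50 Hz update
print(effort)  # -> 50.0
```

With i and d at zero this degenerates to a pure proportional controller, which is why the fingertip joints get away with a p of only 0.5 while the heavy base joints need 5000.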
Step 2: create the husky_kinova_control.launch file:
<?xml version="1.0"?>
<launch>
  <rosparam command="load" file="$(find husky_kinova_control)/config/control.yaml" />

  <!-- Spawn controllers -->
  <node name="husky_controller_spawner" pkg="controller_manager" type="spawner"
        args="joint_state_controller husky_diff_drive_controller
              joint_1_position_controller joint_2_position_controller
              joint_3_position_controller joint_4_position_controller
              joint_5_position_controller joint_6_position_controller
              finger_1_position_controller finger_2_position_controller
              finger_3_position_controller finger_tip_1_position_controller
              finger_tip_2_position_controller finger_tip_3_position_controller" />
</launch>
This loads the controller configuration, spawns the joint state controller and the joint position controllers, and runs the robot state publisher (responsible for publishing joint states and tf). I included this launch file in husky_kinova_empty_world.launch so that the controllers start together with the Gazebo simulation environment:
<include file="$(find husky_kinova_control)/launch/husky_kinova_control.launch" />
Launch the simulation:
cd catkin_ws
source devel/setup.bash
roslaunch husky_kinova_gazebo husky_kinova_empty_world.launch
Send commands manually:
rostopic pub -1 /joint_1_position_controller/command std_msgs/Float64 "data: 3.0"
rostopic pub -1 /joint_2_position_controller/command std_msgs/Float64 "data: 4.0"
You can publish commands like these to test any of the joint position controllers.
Publish a command to drive the Husky in a circle:
rostopic pub -r 10 /husky_diff_drive_controller/cmd_vel geometry_msgs/Twist "linear:
x: 0.5
y: 0.0
z: 0.0
angular:
x: 0.0
y: 0.0
z: 0.5"
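The diff_drive_controller turns this Twist into left and right wheel velocities using the wheel separation and radius (scaled by the multipliers in control.yaml). Below is a rough sketch of that conversion; the 0.5708 m separation and 0.1651 m radius are assumed typical Husky values, since the real ones are read from the robot description:

```python
import math

def twist_to_wheel_speeds(v, w, wheel_separation, wheel_radius):
    """Differential-drive inverse kinematics: body twist -> wheel angular velocities (rad/s)."""
    v_left = v - w * wheel_separation / 2.0
    v_right = v + w * wheel_separation / 2.0
    return v_left / wheel_radius, v_right / wheel_radius

# Assumed geometry, scaled by the multipliers from control.yaml
# (wheel_separation_multiplier: 1.875, wheel_radius_multiplier: 1.0)
sep = 0.5708 * 1.875
rad = 0.1651 * 1.0

# The Twist published above: linear.x = 0.5 m/s, angular.z = 0.5 rad/s
left, right = twist_to_wheel_speeds(v=0.5, w=0.5, wheel_separation=sep, wheel_radius=rad)
print(left, right)  # right wheels spin faster, so the robot turns left

# The resulting circle has radius v / w
print(0.5 / 0.5)  # -> 1.0 m
```

So the command above traces a counterclockwise circle of roughly one meter radius.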
2. Creating the Physics Simulation Environment
2.1 Adding environment models directly
Open Gazebo, insert models from the Insert panel, then use File / Save World As to save the world into your own worlds directory.
2.2 Using the Building Editor
Open an empty environment:
roslaunch husky_kinova_gazebo husky_kinova_empty_world.launch
Press Ctrl+B to open the Building Editor drawing area.
When the drawing is finished, it is saved automatically to building_editor_models under the home folder; export it as a .world file and it can be opened directly next time.
Running the keyboard teleoperation node produced an error:
ERROR: cannot launch node of type [husky_kinova_control/teleop.py]: can't locate node [teleop.py] in package [husky_kinova_control]
Solution: https://blog.csdn.net/qifengle315/article/details/103470117
To add a keyboard control node, write a Python script that reads keyboard input and publishes commands to the /husky_diff_drive_controller/cmd_vel topic; running this node lets you control the Husky in Gazebo.
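The core of such a teleop script is a key-to-Twist mapping. A minimal sketch with hypothetical key bindings follows; the rospy publishing to /husky_diff_drive_controller/cmd_vel is omitted so the mapping can stand on its own:

```python
# Hypothetical key bindings in the style of teleop_twist_keyboard:
# key -> (linear x scale, angular z scale)
KEY_BINDINGS = {
    'i': (1.0, 0.0),   # forward
    ',': (-1.0, 0.0),  # backward
    'j': (0.0, 1.0),   # turn left
    'l': (0.0, -1.0),  # turn right
    'k': (0.0, 0.0),   # stop
}

def key_to_cmd_vel(key, linear_speed=0.5, angular_speed=1.0):
    """Map a pressed key to (linear.x, angular.z); unknown keys stop the robot."""
    lin, ang = KEY_BINDINGS.get(key, (0.0, 0.0))
    return lin * linear_speed, ang * angular_speed

print(key_to_cmd_vel('i'))  # -> (0.5, 0.0)
print(key_to_cmd_vel('j'))  # -> (0.0, 1.0)
```

In the real node, the returned pair fills the linear.x and angular.z fields of a geometry_msgs/Twist that is then published in a loop.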
3. Adding Sensors
3.1 Camera simulation
<gazebo reference="${prefix}_link">
  <sensor type="camera" name="camera_node">
    <update_rate>30.0</update_rate>
    <camera name="head">
      <horizontal_fov>1.3962634</horizontal_fov>
      <image>
        <width>1280</width>
        <height>720</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.02</near>
        <far>300</far>
      </clip>
      <noise>
        <type>gaussian</type>
        <mean>0.0</mean>
        <stddev>0.007</stddev>
      </noise>
    </camera>
    <plugin name="gazebo_camera" filename="libgazebo_ros_camera.so">
      <alwaysOn>true</alwaysOn>
      <updateRate>0.0</updateRate>
      <cameraName>/camera</cameraName>
      <imageTopicName>image_raw</imageTopicName>
      <cameraInfoTopicName>camera_info</cameraInfoTopicName>
      <frameName>camera_link</frameName>
      <hackBaseline>0.07</hackBaseline>
      <distortionK1>0.0</distortionK1>
      <distortionK2>0.0</distortionK2>
      <distortionK3>0.0</distortionK3>
      <distortionT1>0.0</distortionT1>
      <distortionT2>0.0</distortionT2>
    </plugin>
  </sensor>
</gazebo>
The <sensor> tag describes the sensor:
type: the sensor type, here camera;
name: the camera's name, chosen freely.
<update_rate>: the camera update rate; 30.0 means 30 frames, i.e. 30 images, per second.
The <camera> tag describes the camera parameters:
<horizontal_fov>: the horizontal field of view, in radians;
<image>: 1280x720 resolution, RGB8 image format;
<clip>: nearest visible distance 0.02 m, farthest 300 m;
<noise>: adds Gaussian noise to the simulated image.
The <plugin> tag loads the camera simulation plugin: libgazebo_ros_camera.so is a camera plugin provided by Gazebo, located under /opt/ros/kinetic/lib.
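From horizontal_fov and the image width, the reported camera intrinsics follow the pinhole model: the focal length in pixels is fx = (width/2) / tan(hfov/2), and the vertical field of view is then fixed by the aspect ratio. A quick check with the values above:

```python
import math

hfov = 1.3962634          # horizontal_fov from the SDF (80 degrees in radians)
width, height = 1280, 720

# Pinhole focal length in pixels, as it appears in /camera/camera_info
fx = (width / 2.0) / math.tan(hfov / 2.0)

# The vertical FOV follows from the same focal length and the image height
vfov = 2.0 * math.atan((height / 2.0) / fx)

print(round(fx, 1), round(math.degrees(vfov), 1))
```

So an 80-degree horizontal FOV at 1280x720 gives a focal length of roughly 763 pixels and a vertical FOV of about 50 degrees.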
Start the simulation:
roslaunch husky_kinova_gazebo husky_kinova_with_camera_empty_world.launch
rostopic list now shows these additional topics:
/camera/camera_info
/camera/image_raw
/camera/image_raw/compressed
/camera/image_raw/compressed/parameter_descriptions
/camera/image_raw/compressed/parameter_updates
/camera/image_raw/compressedDepth
/camera/image_raw/compressedDepth/parameter_descriptions
/camera/image_raw/compressedDepth/parameter_updates
/camera/image_raw/theora
/camera/image_raw/theora/parameter_descriptions
/camera/image_raw/theora/parameter_updates
/camera/parameter_descriptions
/camera/parameter_updates
Run rqt_image_view to view the scene currently captured by the simulated camera:
3.2 RGB-D camera simulation (Kinect)
In addition to RGB data, the Kinect can also obtain depth information, using infrared to sense the distance between the camera and obstacles.
<gazebo reference="${prefix}_link">
  <sensor type="depth" name="${prefix}">
    <always_on>true</always_on>
    <update_rate>20.0</update_rate>
    <camera>
      <horizontal_fov>${60.0*M_PI/180.0}</horizontal_fov>
      <image>
        <format>R8G8B8</format>
        <width>640</width>
        <height>480</height>
      </image>
      <clip>
        <near>0.05</near>
        <far>8.0</far>
      </clip>
    </camera>
    <plugin name="kinect_${prefix}_controller" filename="libgazebo_ros_openni_kinect.so">
      <cameraName>${prefix}</cameraName>
      <alwaysOn>true</alwaysOn>
      <updateRate>10</updateRate>
      <imageTopicName>rgb/image_raw</imageTopicName>
      <depthImageTopicName>depth/image_raw</depthImageTopicName>
      <pointCloudTopicName>depth/points</pointCloudTopicName>
      <cameraInfoTopicName>rgb/camera_info</cameraInfoTopicName>
      <depthImageCameraInfoTopicName>depth/camera_info</depthImageCameraInfoTopicName>
      <frameName>${prefix}_frame_optical</frameName>
      <baseline>0.1</baseline>
      <distortion_k1>0.0</distortion_k1>
      <distortion_k2>0.0</distortion_k2>
      <distortion_k3>0.0</distortion_k3>
      <distortion_t1>0.0</distortion_t1>
      <distortion_t2>0.0</distortion_t2>
      <pointCloudCutoff>0.4</pointCloudCutoff>
    </plugin>
  </sensor>
</gazebo>
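The point cloud on depth/points is produced by back-projecting each depth pixel through the pinhole camera model. The following sketch approximates that projection for a single pixel, with intrinsics derived from the 60-degree horizontal_fov and 640x480 image above:

```python
import math

# Intrinsics implied by the SDF above: 60-degree horizontal FOV, 640x480 image
hfov = 60.0 * math.pi / 180.0
width, height = 640, 480
fx = fy = (width / 2.0) / math.tan(hfov / 2.0)
cx, cy = width / 2.0, height / 2.0

def depth_pixel_to_point(u, v, depth):
    """Back-project pixel (u, v) with depth in meters to a 3D point in the optical frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth

# A pixel at the image center maps straight down the optical axis
print(depth_pixel_to_point(320, 240, 2.0))  # -> (0.0, 0.0, 2.0)
```

Points closer than pointCloudCutoff (0.4 m here) are dropped by the plugin before the cloud is published.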
Start the simulation:
roslaunch husky_kinova_gazebo husky_kinova_with_kinetic_empty_world.launch
rostopic list now shows these additional topics:
/kinect/depth/camera_info
/kinect/depth/image_raw
/kinect/depth/points
/kinect/parameter_descriptions
/kinect/parameter_updates
/kinect/rgb/camera_info
/kinect/rgb/image_raw
/kinect/rgb/image_raw/compressed
/kinect/rgb/image_raw/compressed/parameter_descriptions
/kinect/rgb/image_raw/compressed/parameter_updates
/kinect/rgb/image_raw/compressedDepth
/kinect/rgb/image_raw/compressedDepth/parameter_descriptions
/kinect/rgb/image_raw/compressedDepth/parameter_updates
/kinect/rgb/image_raw/theora
/kinect/rgb/image_raw/theora/parameter_descriptions
/kinect/rgb/image_raw/theora/parameter_updates
Display the data in rviz:
rosrun rviz rviz
Click Add -> PointCloud2.
Set Topic to /kinect/depth/points.
Set Fixed Frame to kinect_link.
Click Add -> Image and set Topic to /kinect/rgb/image_raw so the RGB image is displayed as well.
3.3 Lidar simulation
<gazebo reference="${prefix}_link">
  <sensor type="ray" name="rplidar">
    <pose>0 0 0 0 0 0</pose>
    <visualize>false</visualize>
    <update_rate>5.5</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>360</samples>
          <resolution>1</resolution>
          <min_angle>-3</min_angle>
          <max_angle>3</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>6.0</max>
        <resolution>0.01</resolution>
      </range>
      <noise>
        <type>gaussian</type>
        <mean>0.0</mean>
        <stddev>0.01</stddev>
      </noise>
    </ray>
    <plugin name="gazebo_rplidar" filename="libgazebo_ros_laser.so">
      <topicName>/scan</topicName>
      <frameName>laser_link</frameName>
    </plugin>
  </sensor>
</gazebo>
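The plugin publishes sensor_msgs/LaserScan on /scan: here, 360 ranges at evenly spaced angles between min_angle and max_angle. The sketch below converts such a scan into Cartesian points in the laser frame, using a synthetic all-2-meter scan in place of real data:

```python
import math

# Scan geometry from the SDF above
SAMPLES = 360
MIN_ANGLE, MAX_ANGLE = -3.0, 3.0  # radians

def scan_to_points(ranges, min_angle=MIN_ANGLE, max_angle=MAX_ANGLE):
    """Convert laser ranges (meters) to (x, y) points in the laser frame."""
    n = len(ranges)
    step = (max_angle - min_angle) / (n - 1)
    return [
        (r * math.cos(min_angle + i * step), r * math.sin(min_angle + i * step))
        for i, r in enumerate(ranges)
    ]

# Synthetic scan: every beam reads 2 m, as if inside a 2 m cylinder
points = scan_to_points([2.0] * SAMPLES)
print(points[0])  # the beam fired at min_angle
```

Every point lies 2 m from the origin, and the beams sweep from -3 to 3 rad, slightly short of a full 360-degree revolution given the configured angle limits.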