ROS By Example (Hydro), Volume 1: Chapters 6 and 7

6. INSTALLING THE ARBOTIX SIMULATOR
    To test our code on a simulated robot, we will use the arbotix_python simulator found in the ArbotiX stack by Michael Ferguson.

7. CONTROLLING A MOBILE BASE
    In this chapter we will learn how to control a mobile base that uses a pair of differential drive wheels and a passive caster wheel for balance. ROS can also be used to control an omni-directional base as well as flying robots or underwater vehicles, but a land-based differential drive robot is a good place to start.

7.1 Units and Coordinate Systems



    Before we can send movement commands to our robot, we need to review the measurement units and coordinate system conventions used in ROS.



    When working with reference frames, keep in mind that ROS uses a right-hand convention for orienting the coordinate axes as shown on the left. The index and middle fingers point along the positive x and y axes and the thumb points in the direction of the positive z axis. The direction of a rotation about an axis is defined by the right-hand rule shown on the right: if you point your thumb in the positive direction of any axis, your fingers curl in the direction of a positive rotation. For a mobile robot using ROS, the x-axis points forward, the y-axis points to the left and the z-axis points upward. Under the right-hand rule, a positive rotation of the robot about the z-axis is counterclockwise while a negative rotation is clockwise.



    Remember also that ROS uses the metric system so that linear velocities are always specified in meters per second (m/s) and angular velocities are given in radians per second (rad/s). A linear velocity of 0.5 m/s is actually quite fast for an indoor robot (about 1.1 mph) while an angular speed of 1.0 rad/s is equivalent to about one rotation in 6 seconds or 10 RPM. When in doubt, start slowly and gradually increase speed. For an indoor robot, I tend to keep the maximum linear speed at or below 0.2 m/s.
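    To make these conversions concrete, here is a minimal sketch in plain Python (no ROS required) that checks the figures quoted above; the conversion constants are standard and not specific to any robot:

import math

linear_speed = 0.5                         # m/s
print(linear_speed * 2.23694)              # about 1.1 mph

angular_speed = 1.0                        # rad/s
print(2 * math.pi / angular_speed)         # about 6.3 seconds per rotation
print(angular_speed * 60 / (2 * math.pi))  # about 9.5 RPM, i.e. roughly 10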

7.2 Levels of Motion Control



    Controlling a mobile robot can be done at a number of levels and ROS provides methods for most of them. These levels represent different degrees of abstraction, beginning with direct control of the motors and proceeding upward to path planning and SLAM (Simultaneous Localization and Mapping).



7.2.1 Motors, Wheels, and Encoders



    Most differential drive robots running ROS use encoders on the drive motors or wheels. An encoder registers a certain number of ticks (usually hundreds or even thousands) per revolution of the corresponding wheel. Knowing the diameter of the wheels and the distance between them, encoder ticks can be converted to the distance traveled in meters or the angle rotated in radians. To compute speed, these values are simply divided by the time interval between measurements.

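    The following is a minimal sketch of this computation; the parameter values (ticks per revolution, wheel diameter and wheel separation) are hypothetical placeholders, so substitute your robot's actual measurements:

import math

TICKS_PER_REV = 1000     # hypothetical encoder resolution
WHEEL_DIAMETER = 0.15    # meters
WHEEL_TRACK = 0.30       # distance between the two wheels, in meters

METERS_PER_TICK = math.pi * WHEEL_DIAMETER / TICKS_PER_REV

def odometry_delta(left_ticks, right_ticks, dt):
    """Convert encoder ticks accumulated over dt seconds into distance
    traveled (m), angle rotated (rad) and the corresponding velocities."""
    d_left = left_ticks * METERS_PER_TICK
    d_right = right_ticks * METERS_PER_TICK
    distance = (d_right + d_left) / 2.0        # meters along the arc
    angle = (d_right - d_left) / WHEEL_TRACK   # radians, positive = CCW
    return distance, angle, distance / dt, angle / dt

# Example: 500 ticks on the left wheel, 520 on the right, over 0.1 s.
print(odometry_delta(500, 520, 0.1))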


    This internal motion data is collectively known as odometry and ROS makes heavy use of it as we shall see. It helps if your robot has accurate and reliable encoders but wheel data can be augmented using other sources. For example, the original TurtleBot uses a single-axis gyro to provide an additional measure of the robot's rotational motion since the iRobot Create's encoders are notably inaccurate during rotations.



    It is important to keep in mind that no matter how many sources of odometry data are used, the actual position and speed of the robot in the world can (and probably will) differ from the values reported by the odometry. The degree of discrepancy will vary depending on the environmental conditions and the reliability of the odometry sources.



7.2.2 Motor Controllers and Drivers
    At the lowest level of motion control we need a driver for the robot's motor controller that can turn the drive wheels at a desired speed, usually using internal units such as encoder ticks per second or a percentage of max speed. With the exception of the Willow Garage PR2 and TurtleBot, the core ROS packages do not include drivers for specific motor controllers. However, a number of third-party ROS developers have published drivers for some of the more popular controllers and/or robots such as the Arduino, ArbotiX, Serializer, Element, LEGO® NXT and Rovio. (For a more complete list of supported platforms, see Robots Using ROS.)



7.2.3 The ROS Base Controller
    At the next level of abstraction, the desired speed of the robot is specified in real-world units such as meters and radians per second. It is also common to employ some form of PID control. PID stands for "Proportional Integral Derivative" and is so-named because the control algorithm corrects the wheel velocities based not only on the difference (proportional) error between the actual and desired velocity, but also on the derivative and integral over time. You can learn more about PID control on Wikipedia. For our purposes, we simply need to know that the controller will do its best to move the robot in the way we have requested.

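    As a rough illustration of the idea (not the algorithm used by any particular base controller), a minimal PID loop in plain Python looks like this; the gains are placeholders that would have to be tuned:

class PID(object):
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = 0.0

    def update(self, desired, actual, dt):
        error = desired - actual                      # proportional term
        self.integral += error * dt                   # integral term
        derivative = (error - self.last_error) / dt   # derivative term
        self.last_error = error
        return (self.kp * error + self.ki * self.integral +
                self.kd * derivative)

pid = PID(kp=1.0, ki=0.1, kd=0.05)
# Desired wheel speed 0.2 m/s, measured speed 0.15 m/s, 20 Hz loop:
correction = pid.update(desired=0.2, actual=0.15, dt=0.05)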


    The driver and PID controller are usually combined inside a single ROS node called the base controller. The base controller must always run on a computer attached directly to the motor controller and is typically one of the first nodes launched when bringing up the robot. A number of base controllers can also be simulated in Gazebo including the TurtleBot, PR2 and Erratic.



    The base controller node typically publishes odometry data on the /odom topic and listens for motion commands on the /cmd_vel topic. At the same time, the controller node typically (but not always) publishes a transform from the /odom frame to the base frame, either /base_link or /base_footprint. We say "not always" because some robots, like the TurtleBot, use the robot_pose_ekf package to combine wheel odometry and gyro data to get a more accurate estimate of the robot's position and orientation. In this case, it is the robot_pose_ekf node that publishes the transform from /odom to /base_footprint. (The robot_pose_ekf package implements an Extended Kalman Filter, as you can read about on the Wiki page linked to above.)

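    To see what this looks like in code, here is a bare-bones sketch of the publishing side of a base controller written with rospy. A real driver would integrate encoder data to compute the pose; here x, y, th and the velocities are simply placeholders held at zero:

#!/usr/bin/env python

import rospy
import tf
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Quaternion

rospy.init_node('fake_base_controller')
odom_pub = rospy.Publisher('/odom', Odometry, queue_size=5)
odom_broadcaster = tf.TransformBroadcaster()

x = y = th = 0.0     # pose: a real driver integrates encoder ticks
vx = vth = 0.0       # velocities reported by the wheel encoders

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    now = rospy.Time.now()
    quat = tf.transformations.quaternion_from_euler(0, 0, th)

    # Broadcast the /odom -> /base_link transform.
    odom_broadcaster.sendTransform((x, y, 0), quat, now,
                                   'base_link', 'odom')

    # Publish the same pose and velocities on the /odom topic.
    odom = Odometry()
    odom.header.stamp = now
    odom.header.frame_id = 'odom'
    odom.child_frame_id = 'base_link'
    odom.pose.pose.position.x = x
    odom.pose.pose.position.y = y
    odom.pose.pose.orientation = Quaternion(*quat)
    odom.twist.twist.linear.x = vx
    odom.twist.twist.angular.z = vth
    odom_pub.publish(odom)
    rate.sleep()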


    Once we have a base controller for our robot, ROS provides the tools we need to issue motion commands either from the command line or by using other ROS nodes to publish these commands based on a higher level plan. At this level, it does not matter what hardware we are using for our base controller: our programming can focus purely on the desired linear and angular velocities in real-world units and any code we write should work on any base controller with a ROS interface.

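    For example, a minimal rospy node that drives the base forward at 0.2 m/s while rotating counterclockwise at 0.5 rad/s looks like the sketch below. The speed values are illustrative, and most base controllers expect the command to be re-published continuously or they will stop the robot:

#!/usr/bin/env python

import rospy
from geometry_msgs.msg import Twist

rospy.init_node('cmd_vel_sketch')
cmd_vel_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=5)

twist = Twist()
twist.linear.x = 0.2    # forward, in m/s
twist.angular.z = 0.5   # counterclockwise, in rad/s

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    cmd_vel_pub.publish(twist)
    rate.sleep()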


7.2.4 Frame-Base Motion using the move_base ROS Package
    At the next level of abstraction, ROS provides the move_base package that allows us to specify a target position and orientation of the robot with respect to some frame of reference; move_base will then attempt to move the robot to the goal while avoiding obstacles. The move_base package is a very sophisticated path planner and combines odometry data with both local and global cost maps when selecting a path for the robot to follow. It also controls the linear and angular velocities and accelerations automatically based on the minimum and maximum values we set in the configuration files.

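    A short sketch shows the flavor of the interface: a goal is sent to the move_base action server as a pose in some reference frame. The frame and coordinates below are illustrative, and this assumes move_base is already up and running:

#!/usr/bin/env python

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('nav_goal_sketch')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 1.0     # one meter ahead in the map frame
goal.target_pose.pose.orientation.w = 1.0  # facing along the map x-axis

client.send_goal(goal)
client.wait_for_result()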


7.2.5 SLAM using the gmapping and amcl ROS Packages
    At an even higher level, ROS enables our robot to create a map of its environment using the SLAM gmapping package. The mapping process works best using a laser scanner but can also be done using a Kinect or Asus Xtion depth camera to provide a simulated laser scan. If you own a TurtleBot, the TurtleBot stack includes all the tools you need to do SLAM.



    Once a map of the environment is available, ROS provides the amcl package (adaptive Monte Carlo localization) for automatically localizing the robot based on its current scan and odometry data. This allows the operator to point and click on any location on a map and the robot will find its way there while avoiding obstacles. (For a superb introduction to the mathematics underlying SLAM, check out Sebastian Thrun's online Artificial Intelligence course on Udacity.)

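    One small programmatic example: amcl listens on the /initialpose topic for an initial pose estimate, which is what RViz's "2D Pose Estimate" button publishes. A minimal sketch, with illustrative coordinates and covariances, looks like this:

#!/usr/bin/env python

import rospy
from geometry_msgs.msg import PoseWithCovarianceStamped

rospy.init_node('initial_pose_sketch')
pub = rospy.Publisher('/initialpose', PoseWithCovarianceStamped, queue_size=5)
rospy.sleep(1.0)   # give subscribers time to connect

msg = PoseWithCovarianceStamped()
msg.header.frame_id = 'map'
msg.header.stamp = rospy.Time.now()
msg.pose.pose.position.x = 0.5      # estimated pose in the map frame
msg.pose.pose.position.y = 1.0
msg.pose.pose.orientation.w = 1.0
msg.pose.covariance[0] = 0.25       # variance in x
msg.pose.covariance[7] = 0.25       # variance in y
msg.pose.covariance[35] = 0.07      # variance in yaw

pub.publish(msg)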


7.2.6 Semantic Goals
    Finally, at the highest level of abstraction, motion goals are specified semantically such as "go to the kitchen and bring me a beer", or simply, "bring me a beer". In this case, the semantic goal must be parsed and translated into a series of actions. For actions requiring the robot to move to a particular location, each location can be passed to the localization and path planning levels for implementation. While beyond the scope of this volume, a number of ROS packages are available to help with this task including smach, executive_teer, worldmodel, semantic_framer, and knowrob.



7.2.7 Summary
    In summary, our motion control hierarchy looks something like this:
Goal
AMCL (localizes the robot on the map from its current scan and odometry data)
Path Planner
move_base (moves the robot toward a goal pose relative to some reference frame)
/cmd_vel + /odom
Base Controller
Motor Speeds
    In this chapter and the next, we will learn how to use these levels of motion control. But before we can understand the more powerful features provided by move_base, gmapping and amcl, we need to start with the basics.
