[ROS Book] ROS By Example Volume 2, Chapter 3: Task Execution using ROS

Chapter 3: Task Execution using ROS

As we saw in Volume 1, it is relatively straightforward to program a robot to run a particular behavior such as face tracking, navigating to a location, or following a person. But a fully autonomous robot is expected to select its own actions from a larger repertoire of behaviors, depending on the task at hand and the current conditions.


In this chapter we will learn how to use two different ROS-enabled task execution frameworks: SMACH (based on state machines and pronounced "smash") and pi_trees (behavior trees). Each approach has its advantages and disadvantages, and which one seems easier will depend on your programming background. But both methods provide a more structured approach to task management than simply writing a long list of if-then statements.


The overall task controller is often called the task executive, and most such executives are expected to include at least the following key features:


  • Task priorities: a lower priority task should yield to a higher priority task if the latter requires the same resources (e.g. the drive motors).
  • Pause and resume: when a higher priority task is given control, it is often desirable to pause the currently running task (or tasks), then resume the preempted task(s) when control is returned. For example, a Neato vacuum cleaner can return to the spot where it left off after recharging.
  • Task hierarchy: tasks can often be broken down into subtasks that handle the details. For example, a high-level task called RECHARGE might consist of three subtasks: navigate to the docking station, dock the robot, and charge the battery. Similarly, the docking task could be broken down into: align with the beacon, drive forward, and stop when docked.
  • Conditions: sensor data and internal program variables can constrain when and how a task is executed. In ROS, such variables are typically published as messages by various nodes. For example, a battery monitoring task might subscribe to a diagnostics topic that includes the current battery level. When a low battery level is detected, a check-battery condition should fire that pauses or aborts other tasks and runs a recharging task.
  • Concurrency: multiple tasks can run in parallel. For example, the navigation task (moving to the next location) and the battery monitoring task must run at the same time.


To tie everything together, we will use two example scenarios throughout the chapter: a "Patrol Bot" that must visit a series of locations in sequence while keeping an eye on its battery level, and a "House Cleaning Robot" that must visit a number of rooms and perform various cleaning-related tasks in each one.

3.1 A Fake Battery Simulator


A robot that runs for any length of time needs to monitor its own battery level and recharge when necessary. To make our examples more realistic, we will therefore use a node that simulates a battery by publishing a steadily decreasing value on a battery level ROS topic. Other nodes can subscribe to this topic and react when the battery level falls too low.


The battery simulator node battery_simulator.py can be found in the rbx2_utils/nodes directory. The script is fairly straightforward except perhaps for the parts involving dynamic reconfigure, which we will cover in detail in Chapter 7. For now, we only need to note the following.


The node takes three parameters:


  • rate: (default 1 Hz) - how often to publish the battery level
  • battery_runtime: (default 60 seconds) - how many seconds it takes the battery to run down to 0
  • initial_battery_level: (default 100) - the battery level when the node starts up
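To make the countdown behavior concrete, here is a minimal sketch of such a node. This is not the actual rbx2_utils source (which also adds a service and dynamic reconfigure support, described below); it simply publishes a steadily decreasing std_msgs/Float32 on the /battery_level topic using the three parameters above:

#!/usr/bin/env python

import rospy
from std_msgs.msg import Float32

class BatterySimulator:
    def __init__(self):
        rospy.init_node('battery_simulator')

        # The three parameters described above
        rate = rospy.get_param('~rate', 1)                          # Hz
        battery_runtime = rospy.get_param('~battery_runtime', 60)   # seconds
        initial_battery_level = rospy.get_param('~initial_battery_level', 100)

        # Amount to subtract from the level on each tick
        step = initial_battery_level / float(rate * battery_runtime)

        battery_pub = rospy.Publisher('battery_level', Float32, queue_size=5)

        level = float(initial_battery_level)
        r = rospy.Rate(rate)

        while not rospy.is_shutdown():
            battery_pub.publish(Float32(level))
            level = max(0.0, level - step)
            r.sleep()

if __name__ == '__main__':
    BatterySimulator()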


The node begins publishing the value given by initial_battery_level (a floating point number) on the /battery_level topic, then counts down to 0 over a period of time determined by the battery_runtime parameter. Both parameters can be specified in the battery_simulator.launch file found in the rbx2_utils/launch directory. You can also set the battery runtime (in seconds) as an argument on the command line. Let's test the simulator now by running the launch file:


$ roslaunch rbx2_utils battery_simulator.launch

You can verify that the simulator is working by opening another terminal and running:

$ rostopic echo /battery_level


You should see output similar to the following:

data: 91.0
---
data: 90.6666641235
---
data: 90.3333358765
---
etc.


The node also defines a ROS service called set_battery_level that takes a floating point value as an argument and sets the battery level to that value. We will use this service to simulate a recharge by setting the level to 100, or to simulate a sudden depletion of the battery by setting a low value.


The set_battery_level service can be used as follows:


$ rosservice call /battery_simulator/set_battery_level 100

where the argument ranges from 0 to 100. To simulate a recharge, use the set_battery_level service to set the level to a high number such as 100. To simulate a sudden depletion of the battery, set the value to a low number such as 30. This will allow us to test how various nodes respond to a low battery condition.
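The same service can also be called programmatically from within a node. The sketch below assumes the service type is SetBatteryLevel from the rbx2_msgs package; check the rbx2 source if your package layout differs:

#!/usr/bin/env python

import rospy
from rbx2_msgs.srv import SetBatteryLevel   # assumed service type

rospy.init_node('battery_test')

# Wait for the simulator's service to become available
rospy.wait_for_service('battery_simulator/set_battery_level')

set_battery_level = rospy.ServiceProxy('battery_simulator/set_battery_level',
                                       SetBatteryLevel)

set_battery_level(100)   # simulate a full recharge
set_battery_level(30)    # or simulate a sudden depletion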


The battery simulator can also be controlled using rqt_reconfigure. To change the battery runtime or to set the battery level manually, bring up rqt_reconfigure:


$ rosrun rqt_reconfigure rqt_reconfigure

then click on the battery_simulator node to bring up the following options:

[Image: rqt_reconfigure options for the battery_simulator node]
Use the sliders or text boxes to change the battery runtime or battery level.
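If you prefer the command line to the GUI, the standard dynparam utility from the dynamic_reconfigure package can change the same parameters; for example, to shorten the battery runtime to 30 seconds:

$ rosrun dynamic_reconfigure dynparam set /battery_simulator battery_runtime 30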


We will use the battery simulator throughout this chapter to compare how the different task frameworks enable us to handle a low battery condition while performing other tasks.


3.2 A Common Setup for Running the Examples

All of our example code will share a common setup. There will be four waypoints (target locations) placed at the corners of a square measuring 1 meter on a side, with a docking station located at the center of the square. The waypoints and the docking station will be displayed in RViz as visualization markers: the waypoints as colored squares and the docking station as a yellow disc. These markers have no depth, so the robot can drive over them freely. The basic setup is shown below using the fake TurtleBot.

[Image: the fake TurtleBot with the four waypoint markers and the docking station in RViz]


These variables are set in the file task_setup.py, located in the rbx2_tasks/src/rbx2_tasks directory. Each of our examples will import this file to establish the basic environment. Some of the more important variables set in task_setup.py are as follows:


  • square_size (default: 1.0 meter)
  • low_battery_threshold (default: 50)
  • n_patrols (default: 2)
  • move_base_timeout (default: 10 seconds)


We also define the locations of the waypoints and the docking station. Any of these can be changed to suit your needs.


Finally, we define a move_base client and a cmd_vel publisher for controlling the robot's movement.
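As a rough sketch (not the actual rbx2_tasks source), the relevant part of task_setup.py might look something like this:

import rospy
import actionlib
from geometry_msgs.msg import Twist
from move_base_msgs.msg import MoveBaseAction

class TaskSetup:
    def __init__(self):
        # Connect to the move_base action server
        self.move_base = actionlib.SimpleActionClient('move_base', MoveBaseAction)
        self.move_base.wait_for_server(rospy.Duration(60))

        # Publisher for direct motion control (e.g. stopping the robot)
        self.cmd_vel_pub = rospy.Publisher('cmd_vel', Twist, queue_size=5)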


3.3 A Brief Review of ROS Actions

Since we will be making heavy use of the move_base action in this chapter, this is a good time to review the concepts behind ROS actions. Be sure to start with the actionlib overview on the ROS Wiki, then work through the C++ and/or Python tutorials linked there.


Recall that a ROS action expects a goal to be submitted by an action client. The action server will then typically provide feedback as progress is made toward the goal, and a result when the goal is either succeeded, aborted, or preempted.


Perhaps the most familiar example of a ROS action is the MoveBaseAction used with the navigation stack. The move_base package implements an action server that accepts a goal pose for the robot (position and orientation) and attempts to reach that goal by publishing Twist messages while monitoring odometry and laser scan data to avoid obstacles. Along the way, feedback is provided in the form of a time-stamped pose representing the state of the robot, together with a goal status (e.g. ACTIVE, SUCCEEDED, ABORTED, etc.). The action result is simply a time-stamped status message indicating that the goal succeeded, was aborted, was preempted, and so on.


You can see the full definition of the MoveBaseAction with the command:

$ rosmsg show MoveBaseAction

To view the syntax of the feedback message, use:

$ rosmsg show MoveBaseActionFeedback


And to see the list of statuses that can be returned as the result, run:


$ rosmsg show MoveBaseActionResult

Recall from Volume 1 how we programmed our robot to navigate a square using a series of move_base actions. For each corner of the square, we sent the corresponding pose to the move_base action server and waited for the result before submitting the next goal pose. However, suppose that instead of simply moving between waypoints, we also want the robot to execute a number of subtasks at each location. For example, one task might be to look for a particular object and record its location, or pick it up if the robot has an arm and gripper. At the same time, we want the robot to monitor its battery level, navigate to the docking station when needed, and so on.
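For reference, the basic Volume 1 pattern of sending one goal at a time looks roughly like the following sketch, where waypoints is assumed to be a list of geometry_msgs/Pose objects defined elsewhere:

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('nav_square')

move_base = actionlib.SimpleActionClient('move_base', MoveBaseAction)
move_base.wait_for_server(rospy.Duration(60))

for pose in waypoints:   # waypoints: assumed to be defined elsewhere
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose = pose

    # Send the goal, then block until it succeeds, aborts or is preempted
    move_base.send_goal(goal)
    move_base.wait_for_result(rospy.Duration(60))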



All of this could be accomplished using ROS actions, but we would have to create an action server for each task and then coordinate the tasks with a collection of if-then conditions or callbacks across the actions. While certainly possible, the result would be rather tedious. Fortunately, SMACH and behavior trees make these more complicated situations easier to handle.

 

3.4 A Patrol Bot Example

Suppose our robot's task is to patrol the perimeter of a square by navigating from corner to corner in sequence. If the battery level falls below a set threshold, the robot should stop its patrol and navigate to the docking station. After recharging, the robot should resume the patrol where it left off.

The basic patrol task looks like this:

  • Initialization:
    • set the waypoint coordinates
    • set the docking station coordinates
    • set the number of patrols to perform
  • Tasks (in order of priority):
    • CHECK_BATTERY
    • RECHARGE
    • PATROL
  • Sensors and actuators:
    • battery sensor; laser scanner, RGB-D camera, etc.
    • drive motors


The CHECK_BATTERY task simply sets a flag when the battery level falls below the set threshold.


The RECHARGE task can be broken down into the following subtasks:


RECHARGE: NAV_DOCK → CHARGE


where NAV_DOCK means navigate to the docking station.


The PATROL task can be broken down into a sequence of navigation subtasks:


PATROL: NAV_0 → NAV_1 → NAV_2 → NAV_3


Here each navigation task is given by the coordinates of the numbered waypoint (a corner of the square). The navigation tasks can be implemented using standard ROS MoveBaseAction goals and the navigation stack, just as we did in Volume 1.

Before we learn how to implement the Patrol Bot using SMACH or behavior trees, let's review how it could be done with a standard script.


3.5 The Patrol Bot using a Standard Script

Our script will subscribe to the battery level topic with a callback function that sets a low_battery flag to True if the level falls below our set threshold:


def battery_cb(self, msg):
    if msg.data < self.low_battery_threshold:
        self.low_battery = True
    else:
        self.low_battery = False

This check fires at the same rate as messages are received on the battery level topic.
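For completeness, the subscription itself might be set up as follows, assuming the simulator's /battery_level topic of type std_msgs/Float32:

from std_msgs.msg import Float32

# Fire battery_cb on every message published by the battery simulator
rospy.Subscriber('battery_level', Float32, self.battery_cb)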


Meanwhile, our main control loop might start out looking like this:


while n_patrols < max_patrols:
    if low_battery:
        recharge()
    else:
        patrol()

At the start of each patrol, we check the battery level and recharge if necessary; otherwise, we start patrolling. Of course, this strategy is too coarse to work in practice: since a patrol takes some time to complete, the battery could run dead between checks. Let's see how we can do this properly.


The patrol() routine moves the robot through the sequence of waypoints and would look something like this:


def patrol():
    for location in waypoints:
        nav_to_waypoint(location)

Writing it out this way, we realize that we should move the battery check into the nav_to_waypoint() function:


def nav_to_waypoint(location):
    if low_battery:
        recharge()
    else:
        move_to(location)

At least now we check the battery level before moving on to each waypoint. However, the move_to(location) function could take some time to complete, depending on how far away the next waypoint is. So what we really need is to push the battery check even deeper, into the move_to() process itself.


In ROS, the move_to() function would likely be implemented as a call to the MoveBaseAction server, so the battery check should be done in the feedback callback for the move_base client. The result would look something like this:

move_base.send_goal(goal, feedback_cb=self.nav_feedback_cb)

def nav_feedback_cb(self, msg):
    if self.low_battery:
        self.recharge()

Now we are checking the battery status every time we receive a feedback message from the MoveBaseAction server, which should be frequent enough to keep the battery from running dead. The recharge() function would cancel the current move_base goal, then send a new goal to the MoveBaseAction server to navigate the robot to the docking station for charging.
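A hedged sketch of what such a recharge() function might look like is shown below. The docking_station_pose variable and the set_battery_level service proxy are assumptions based on the setup described earlier in this chapter:

def recharge(self):
    rospy.loginfo("Battery low! Navigating to the docking station...")

    # Preempt the current navigation goal
    self.move_base.cancel_goal()

    # Send the robot to the docking station
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose = self.docking_station_pose
    self.move_base.send_goal(goal)
    self.move_base.wait_for_result()

    # Simulate a full recharge, then clear the flag
    self.set_battery_level(100)
    self.low_battery = False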


The full code can be found in the file patrol_script.py in the rbx2_tasks/nodes directory.


Source code link: patrol_script.py


The script is fairly straightforward and will not be described in detail. However, you can test it out as follows.


First, fire up the fake TurtleBot in the ArbotiX simulator using the fake_turtlebot.launch file in the rbx2_tasks/launch directory. This file will launch the fake TurtleBot, a move_base action server with a blank map, and the battery simulator node with a default battery runtime of 60 seconds:


$ roslaunch rbx2_tasks fake_turtlebot.launch

Next, bring up RViz with the nav_tasks.rviz configuration file:

$ rosrun rviz rviz -d `rospack find rbx2_tasks`/nav_tasks.rviz

Finally, run the patrol_script.py script:

$ rosrun rbx2_tasks patrol_script.py

The view in RViz should look something like this:

[Image: the fake TurtleBot patrolling the square in RViz]

The robot should make two loops around the square while monitoring its battery level. Whenever the battery falls below the threshold defined in the script (50), the robot will move to the circular disc in the middle of the square to recharge. Once recharged (the battery level is set back to 100), the robot should continue its patrol where it left off. For example, if it had not yet reached the second waypoint when it went off to recharge, it will return to the first waypoint before continuing the loop.

