ROS By Example _Hydro_volume1_ CN_4

4.12.4 Setting the ROS_MASTER_URI and ROS_HOSTNAME Variables
    In any ROS network, one machine is designated as the ROS master and it alone runs the roscore process. Other machines must then set the ROS_MASTER_URI environment variable to point to the master host. Each computer must also set its ROS hostname appropriately, as we will show.
    In general, it does not matter which machine you choose to be the master. However, for a completely autonomous robot, you'll probably want to make the robot computer the master so that it does not depend in any way on the desktop.
    If we want the robot to be the ROS master, we set its ROS_HOSTNAME to its Zeroconf name and run the roscore process:

On the robot:


$ export ROS_HOSTNAME=my_robot.local
$ roscore

    Next, move to your desktop, set the ROS_HOSTNAME to its Zeroconf name and then set the ROS_MASTER_URI variable to point to your robot's Zeroconf URI.

On the desktop:


$ export ROS_HOSTNAME=my_desktop.local
$ export ROS_MASTER_URI=http://my_robot.local:11311


    For an added check on time synchronization, we can run the ntpdate command to synchronize the desktop with the robot.

On the desktop:


$ sudo ntpdate -b my_robot.local
    If all goes well, you should be able to see the /rosout and /rosout_agg topics on your desktop as follows:
$ rostopic list


/rosout
/rosout_agg

4.12.5 Opening New Terminals
    In any new terminal window you open on your desktop or robot, you need to set the ROS_HOSTNAME variable to that machine's Zeroconf name. And for new desktop terminals, you also have to set the ROS_MASTER_URI to point to the robot. (Or more generally, on any non-master computer, new terminals must set the ROS_MASTER_URI to point to the master computer.)
    If you will use the same robot and desktop for a while, you can save yourself some time by adding the appropriate export lines to the end of the ~/.bashrc file on each computer. If the robot will always be the master, add the following line to the end of its ~/.bashrc file:
export ROS_HOSTNAME=my_robot.local
    And on the desktop, add the following two lines to the end of its ~/.bashrc file:
export ROS_HOSTNAME=my_desktop.local
export ROS_MASTER_URI=http://my_robot.local:11311
    (Of course, replace the Zeroconf names with those that match your setup.)
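If you manage several machines, the two export lines can be wrapped in a small helper function in each ~/.bashrc. This is only a sketch: the function name set_ros_env is our own convention, not part of ROS, though 11311 is indeed roscore's default port.

```shell
# Hypothetical ~/.bashrc helper: pass this machine's short name and the
# master's short name. It derives the Zeroconf names and, on non-master
# machines, points ROS_MASTER_URI at the master on roscore's default port.
set_ros_env() {
    local this_host=$1
    local master=$2
    export ROS_HOSTNAME="${this_host}.local"
    if [ "$this_host" != "$master" ]; then
        export ROS_MASTER_URI="http://${master}.local:11311"
    fi
}

# On the desktop, with the robot as master:
set_ros_env my_desktop my_robot
```

On the robot itself the call would be `set_ros_env my_robot my_robot`, which sets only ROS_HOSTNAME, matching the setup shown above.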
     You can also set your desktop up as the ROS master instead of the robot. In this case, simply reverse the roles and Zeroconf hostnames in the examples above.
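Concretely, with the desktop as master (still using our example Zeroconf names), the settings from above swap like this:

```shell
# On the desktop (now the master) -- roscore runs here:
export ROS_HOSTNAME=my_desktop.local

# On the robot:
export ROS_HOSTNAME=my_robot.local
export ROS_MASTER_URI=http://my_desktop.local:11311
```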

4.12.6 Running Nodes on both Machines
    Now that you have your ROS network set up between your robot and desktop computer, you can run ROS nodes on either machine and both will have access to all topics and services.
    While many nodes and launch files can be run on either computer, the robot's startup files must always be run on the robot since these nodes provide drivers to the robot's hardware. This includes the drivers for the robot base and any cameras, laser scanners or other sensors you want to use. On the other hand, the desktop is a good place to run RViz since it is very CPU-intensive and besides, you'll generally want to monitor your robot from your desktop anyway.
    Since the robot's computer may not always have a keyboard and monitor, you can use ssh to log into your robot and launch driver nodes from your desktop. Here's an example of how you might do this.
    From your desktop computer, use ssh to log in to your robot.
    On the desktop:
$ ssh my_robot.local
    Once logged in to the robot, fire up roscore and your robot's startup launch file(s).
    On the robot (via ssh):
$ export ROS_HOSTNAME=my_robot.local
$ roscore &
$ roslaunch my_robot startup.launch


    (You can omit the first export line above if you have already included it in the robot's ~/.bashrc file.)

    Notice how we send the roscore process into the background using the & symbol after the command. This brings back the command prompt so we can launch our robot's startup file without having to open another ssh session. If possible, launch all your robot's hardware drivers in one startup.launch file (it can be named anything you like). This way you will not have to open additional terminals to launch other drivers.
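The & mechanics are plain shell job control, so you can experiment with a harmless command in place of roscore; the special variable $! holds the PID of the most recent background job:

```shell
# Same pattern as roscore &, demonstrated with an innocuous command:
sleep 30 &          # run in the background; the prompt returns immediately
bg_pid=$!           # $! is the PID of the job we just backgrounded
kill -0 "$bg_pid"   # exit status 0 while the job is still running
kill "$bg_pid"      # stop it (for roscore you would normally leave it running)
```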
    Back on your desktop, open another terminal window, set the ROS_MASTER_URI to point to your robot, then fire up RViz:
    On the desktop:
$ export ROS_HOSTNAME=my_desktop.local
$ export ROS_MASTER_URI=http://my_robot.local:11311
$ rosrun rviz rviz -d `rospack find rbx1_nav`/nav.rviz


    (You can omit the two export lines if you have already included them in the desktop's ~/.bashrc file.)
    Here we are running RViz with one of the configuration files included in the ros-by-example navigation package, but you can also simply launch RViz without any configuration file.

4.12.7 ROS Networking across the Internet
    While outside the scope of this book, setting up ROS nodes to communicate over the Internet is similar to the instructions given above using Zeroconf. The main difference is that now you need to use fully qualified hostnames or IP addresses instead of local Zeroconf names. Furthermore, it is likely that one or more of the machines will be behind a firewall so that some form of VPN (e.g. OpenVPN) will have to be set up. Finally, since most machines on the ROS network will be connected to a local router (e.g. wifi access point), you will need to set up port forwarding on that router or use dynamic DNS. While all this is possible, it is definitely not trivial to set up.
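As a small illustration (with made-up VPN addresses), once a VPN is in place you would typically work with IP addresses rather than names. ROS provides the ROS_IP variable for exactly this case: use ROS_IP for numeric addresses and ROS_HOSTNAME for resolvable names.

```shell
# Hypothetical VPN setup: the robot is 10.8.0.2, this desktop is 10.8.0.10
export ROS_IP=10.8.0.10                      # this machine's VPN address
export ROS_MASTER_URI=http://10.8.0.2:11311  # the master (robot) over the VPN
```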

4.13 ROS Recap
    Since it might have been a while since you did the Beginner and tf Tutorials, here is a brief recap of the primary ROS concepts. The core entity in ROS is called a node. A node is generally a small program written in Python or C++ that executes some relatively simple task or process. Nodes can be started and stopped independently of one another and they communicate by passing messages. A node can publish messages on certain topics or provide services to other nodes.
    For example, a publisher node might report data from sensors attached to your robot's microcontroller. A message on the /head_sonar topic with a value of 0.5 would mean that the sensor is currently detecting an object 0.5 meters away. (Remember that ROS uses meters for distance and radians for angular measurements.) Any node that wants to know the reading from this sensor need only subscribe to the /head_sonar topic. To make use of these values, the subscriber node defines a callback function that gets executed whenever a new message arrives on the subscribed topic. How often this happens depends on the rate at which the publisher node updates its messages.
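With a roscore running, the command-line tools make this concrete. The transcript below is hypothetical (there is no real sonar behind it): we publish one reading ourselves and watch it arrive on the topic.

```shell
# Terminal 1: publish a single reading of 0.5 m on /head_sonar
# (requires a running roscore)
rostopic pub -1 /head_sonar std_msgs/Float32 -- 0.5

# Terminal 2: subscribe and print the next message on the topic
rostopic echo -n 1 /head_sonar
```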
    A node can also define one or more services. A ROS service produces some behavior or sends back a reply when sent a request from another node. A simple example would be a service that turns an LED on or off. A more complex example would be a service that returns a navigation plan for a mobile robot when given a goal location and the starting pose of the robot.
    Higher level ROS nodes will subscribe to a number of topics and services, combine the results in a useful way, and perhaps publish messages or provide services of their own. For example, the object tracker node we will develop later in the book subscribes to camera messages on a set of video topics and publishes movement commands on another topic that are read by the robot's base controller to move the robot in the appropriate direction.

4.14 What is a ROS Application?
    If you are not already familiar with a publish/subscribe architecture like ROS, programming your robot to do something useful might seem a little mysterious at first. For instance, when programming an Arduino-based robot using C, one usually creates a single large program that controls the robot's behavior. Moreover, the program will usually talk directly to the hardware, or at least, to a library specifically designed for the hardware you are using.
    When using ROS, the first step is to divide up the desired behavior into independent functions that can be handled by separate nodes. For example, if your robot uses a webcam or a depth camera like a Kinect or Xtion Pro, one node will connect to the camera and simply publish the image and/or depth data so that other nodes can use it. If your robot uses a mobile base, a base controller node will listen for motion commands on some topic and control the robot's motors to move the robot accordingly. These nodes can be used without modification in many different applications whenever the desired behavior requires vision and/or motion control.
    An example of a complete application is the "follower" application we will develop later in the book. (The original C++ version by Tony Pratkanis can be found in the turtlebot_follower package.) The goal of the follower app is to program a Kinect-equipped robot like the TurtleBot to follow the nearest person. In addition to the camera and base controller nodes, we need a third node that subscribes to the camera topic and publishes on the motion control topic. This "follower" node must process the image data (using OpenCV or PCL for example) to find the nearest person-like object, then command the base to steer in the appropriate direction. One might say that the follower node is our ROS application; however, to be more precise, the application really consists of all three nodes running together. To run the application, we use a ROS launch file to fire up the whole collection of nodes as a group. Remember that launch files can also include other launch files, allowing for even easier reuse of existing code in new applications.
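As a sketch, a top-level launch file for such an application might look like the following. The package and file names here are illustrative, not the actual ros-by-example files:

```xml
<launch>
  <!-- Hypothetical example: package and file names are illustrative -->

  <!-- Camera driver node(s) -->
  <include file="$(find my_robot)/launch/camera.launch" />

  <!-- Base controller and other robot hardware drivers -->
  <include file="$(find my_robot)/launch/startup.launch" />

  <!-- The follower node tying camera input to motion commands -->
  <node pkg="my_follower" type="follower.py" name="follower" output="screen" />
</launch>
```

Running `roslaunch` on this one file brings up all three pieces of the application as a group.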
    Once you get used to this style of programming, there are some significant advantages. As we have already mentioned, many nodes can be reused without modification in other applications. Indeed, some ROS applications are little more than launch files combining existing nodes in new ways or using different values for the parameters. Furthermore, many of the nodes in a ROS application can run on different robots without modification. For example, the TurtleBot follower application can run on any robot that uses a depth camera and a mobile base. This is because ROS allows us to abstract away the underlying hardware and work with more generic messages instead.
    Finally, ROS is a network-centric framework. This means that you can distribute the nodes of your application across multiple machines as long as they can all see each other on the network. For example, while the camera and motor control nodes have to run on the robot's computer, the follower node and RViz could run on any machine on the Internet. This allows the computational load to be distributed across multiple computers if necessary.

4.15 Installing Packages with SVN, Git, and Mercurial
    Once in a while, the ROS package you need won't be available as a Debian package and you will need to install it from source. There are three major source control systems popular with code developers: SVN, Git and Mercurial. The type of system used by the developer determines how you install the source. To make sure you are ready for all three systems, run the following install command on your Ubuntu machine:
$ sudo apt-get install git subversion mercurial
    For all three systems, there are two operations you will use most of the time. The first operation allows you to check out the software for the first time, while the second is used for getting updates that might be available later on. Since these commands are different for all three systems, let's look at each in turn.

4.15.1 SVN
Let's assume that the SVN source you would like to check out is located at http://repository/svn/package_name. To do the initial checkout and build the package in your personal catkin directory, run the following commands. (If necessary, change the first command to reflect the actual location of your catkin source directory.)



$ cd ~/catkin_ws/src
$ svn checkout http://repository/svn/package_name
$ cd ~/catkin_ws
$ catkin_make
$ source devel/setup.bash
$ rospack profile

To update the package later on, run the commands:

$ cd ~/catkin_ws/src/package_name
$ svn update
$ cd ~/catkin_ws
$ catkin_make
$ source devel/setup.bash


4.15.2 Git

Let's assume that the Git source you would like to check out is located at git://repository/package_name. To do the initial checkout and build the package in your personal catkin directory, run the following commands. (If necessary, change the first command to reflect the actual location of your personal catkin source directory.)

$ cd ~/catkin_ws/src
$ git clone git://repository/package_name
$ cd ~/catkin_ws
$ catkin_make
$ source devel/setup.bash
$ rospack profile

To update the package later on, run the commands:

$ cd ~/catkin_ws/src/package_name
$ git pull
$ cd ~/catkin_ws
$ catkin_make
$ source devel/setup.bash



4.15.3 Mercurial
Let's assume that the Mercurial source you'd like to check out is located at http://repository/package_name. To do the initial checkout and build the package in your personal catkin directory, run the following commands. (If necessary, change the first command to reflect the actual location of your personal catkin source directory.)

$ cd ~/catkin_ws/src
$ hg clone http://repository/package_name
$ cd ~/catkin_ws
$ catkin_make
$ source devel/setup.bash
$ rospack profile

(In case you are wondering why Mercurial uses hg for its main command name, Hg is the symbol for the element Mercury on the Periodic Table in chemistry.) To update the package later on, run the commands:

$ cd ~/catkin_ws/src/package_name
$ hg pull -u
$ cd ~/catkin_ws
$ catkin_make
$ source devel/setup.bash



4.16 Removing Packages from your Personal catkin Directory
    To remove a package installed in your personal catkin directory, first remove the entire package source directory or move it to a different location outside of your ROS_PACKAGE_PATH. For example, to remove a package called my_catkin_package from your ~/catkin_ws/src directory, run the commands:

$ cd ~/catkin_ws/src
$ \rm -rf my_catkin_package



    You also have to remove all catkin build objects. Unfortunately, there is no (easy) way to do this for just the package you removed—you have to remove all build objects for all packages and then rerun catkin_make:
CAUTION! Do not include the src directory in the rm command below or you will lose all your personal catkin source files!

$ cd ~/catkin_ws
$ \rm -rf devel build install
$ catkin_make
$ source devel/setup.bash



    You can test that the package has been removed using the roscd command:
$ roscd my_catkin_package

which should produce the output:
roscd: No such package 'my_catkin_package'

4.17 How to Find Third-Party ROS Packages

    Sometimes the hardest thing to know about ROS is what's available from other developers. For example, suppose you are interested in running ROS with an Arduino and want to know if someone else has already created a ROS package to do the job. There are a few ways to do the search.


4.17.1 Searching the ROS Wiki
    The ROS Wiki at wiki.ros.org includes a searchable index to many ROS packages and stacks. If a developer has created some ROS software they'd like to share with others, they tend to post an announcement to the ros-users mailing list together with the link to their repository. If they have also created documentation on the ROS Wiki, the package should show up in a search of the index shortly after the announcement.
    The end result is that you can often find what you are looking for by simply doing a keyword search on the ROS Wiki. Coming back to our Arduino example, if we type "Arduino" (without the quotes) into the Search box, we find links referring to two packages: rosserial_arduino and ros_arduino_bridge.

4.17.2 Using the roslocate Command
    If you know the exact package name you are looking for and you want to find the URL to the package repository, use the roslocate command. (This command is only available if you installed rosinstall as described earlier.) For example, to find the location of the ros_arduino_bridge package, run the command:

$ roslocate uri ros_arduino_bridge


which should yield the result:
Using ROS_DISTRO: indigo
Not found via rosdistro - falling back to information provided by rosdoc
https://github.com/hbrobotics/ros_arduino_bridge.git

    This means that we can install the package into our personal catkin directory using the git command:

$ cd ~/catkin_ws/src
$ git clone https://github.com/hbrobotics/ros_arduino_bridge.git
$ cd ~/catkin_ws
$ catkin_make
$ source devel/setup.bash
$ rospack profile



NOTE: Starting with ROS Groovy, the roslocate command will only return a result if the package or stack has been submitted to the indexer by the package maintainer for the ROS distribution you are currently using.

4.18 Getting Further Help with ROS
    There are several sources for additional help with ROS. Probably the best place to start is at the main ROS wiki at http://wiki.ros.org. As described in the previous section, be sure to use the Search box at the top right of the page.
    If you can't find what you are looking for on the Wiki, try the ROS Questions and Answers forum at http://answers.ros.org. The answers site is a great place to get help. You can browse through the list of questions, do keyword searches, look up topics based on tags, and even get email notifications when a topic is updated. But be sure to do some kind of search before posting a new question to avoid duplication.
    Next, you can search one of the ROS mailing list archives:
• ros-users: for general ROS news and announcements
• ros-kinect: for Kinect related issues
• pcl-users: for PCL related issues
    NOTE: Please do not use the ros-users mailing list to post questions about using ROS or debugging packages. Use http://answers.ros.org instead.
    If you want to subscribe to one or more of these lists, use the appropriate link listed below:
• ros-users: Subscription page
• ros-kinect – It appears that this list may no longer be active.
• pcl_users - Subscription page


  • 0
    点赞
  • 0
    收藏
    觉得还不错? 一键收藏
  • 0
    评论
当前大多数搞机器人开发的用户所装的ROS是indigo版本,而且是基于Ubuntu14.04的。如果你跑别的版本的rbx代码老出错,不用怀疑,就是代码版本问题!ros by example for indigo volume 1很多地方(包括CSDN)都可以下载,而volume 2则只此一家哦!下面是本书的目录: Contents Preface................................................................................................................vii Printed vs PDF Versions of the Book...............................................................ix 1. Scope of this Volume.......................................................................................1 2. Installing the ros-by-example Code...............................................................3 3. Task Execution using ROS.............................................................................7 3.1 A Fake Battery Simulator.....................................................................................8 3.2 A Common Setup for Running the Examples.....................................................10 3.3 A Brief Review of ROS Actions........................................................................11 3.4 A Patrol Bot Example.........................................................................................12 3.5 The Patrol Bot using a Standard Script...............................................................13 3.6 Problems with the Script Approach....................................................................16 3.7 SMACH or Behavior Trees?..............................................................................17 3.8 SMACH: Tasks as State Machines.....................................................................17 3.8.1 SMACH review.......................................................................................................18 3.8.2 Patrolling a square using SMACH..........................................................................19 3.8.3 Testing SMACH navigation in the ArbotiX simulator............................................23 3.8.4 Accessing results from a 
SimpleActionState...........................................................26 3.8.5 SMACH Iterators.....................................................................................................27 3.8.6 Executing commands on each transition.................................................................30 3.8.7 Interacting with ROS topics and services................................................................31 3.8.8 Callbacks and Introspection.....................................................................................36 3.8.9 Concurrent tasks: Adding the battery check to the patrol routine...........................36 3.8.10 Comments on the battery checking Patrol Bot......................................................44 3.8.11 Passing user data between states and state machines............................................44 3.8.12 Subtasks and hierarchical state machines..............................................................48 3.8.13 Adding the battery check to the house cleaning robot...........................................54 3.8.14 Drawbacks of state machines................................................................................54 3.9 Behavior Trees...................................................................................................55 3.9.1 Behavior Trees versus Hierarchical State Machines...............................................56 3.1.2 Key properties of behavior trees..............................................................................57 3.9.3 Building a behavior tree..........................................................................................58 3.9.4 Selectors and sequences...........................................................................................60 3.9.5 Customizing behaviors using decorators (meta-behaviors).....................................61 3.10 Programming with Behavior Trees and ROS....................................................63 3.10.1 
Installing the pi_trees library.................................................................................63 3.10.2 Basic components of the pi_trees library..............................................................63 3.10.3 ROS-specific behavior tree classes........................................................................68 3.10.4 A Patrol Bot example using behavior trees..........................................................72 3.10.5 A housing cleaning robot using behavior trees.....................................................79 3.10.6 Parallel tasks..........................................................................................................85 3.10.7 Adding and removing tasks...................................................................................87 4. Creating a URDF Model for your Robot....................................................89 4.1 Start with the Base and Wheels..........................................................................90 4.1.1 The robot_state_publisher and joint_state_publisher nodes....................................91 4.1.2 The base URDF/Xacro file......................................................................................92 4.1.3 Alternatives to using the /base_footprint frame......................................................97 4.1.4 Adding the base to the robot model.........................................................................97 4.1.5 Viewing the robot's transform tree..........................................................................98 4.1.6 Using a mesh for the base........................................................................................99 4.2 Simplifying Your Meshes.................................................................................104 4.3 Adding a Torso.................................................................................................104 4.3.1 Modeling the 
torso.................................................................................................105 4.3.2 Attaching the torso to the base..............................................................................106 4.3.3 Using a mesh for the torso.....................................................................................107 4.3.4 Adding the mesh torso to the mesh base...............................................................108 4.4 Measure, Calculate and Tweak.........................................................................110 4.5 Adding a Camera..............................................................................................110 4.5.1 Placement of the camera........................................................................................111 4.5.2 Modeling the camera.............................................................................................112 4.5.3 Adding the camera to the torso and base...............................................................114 4.5.4 Viewing the transform tree with torso and camera................................................115 4.5.5 Using a mesh for the camera.................................................................................116 4.5.6 Using an Asus Xtion Pro instead of a Kinect........................................................118 4.6 Adding a Laser Scanner (or other Sensors)......................................................119 4.6.1 Modeling the laser scanner....................................................................................119 4.6.2 Attaching a laser scanner (or other sensor) to a mesh base...................................120 4.6.3 Configuring the laser node launch file..................................................................121 4.7 Adding a Pan and Tilt Head..............................................................................122 4.7.1 Using an Asus Xtion Pro instead of a 
…Kinect
    4.7.2 Modeling the pan-and-tilt head
    4.7.3 Figuring out rotation axes
    4.7.4 A pan and tilt head using meshes on Pi Robot
    4.7.5 Using an Asus Xtion Pro mesh instead of a Kinect on Pi Robot
4.8 Adding One or Two Arms
    4.8.1 Placement of the arm(s)
    4.8.2 Modeling the arm
    4.8.3 Adding a gripper frame for planning
    4.8.4 Adding a second arm
    4.8.5 Using meshes for the arm servos and brackets
4.9 Adding a Telescoping Torso to the Box Robot
4.10 Adding a Telescoping Torso to Pi Robot
4.11 A Tabletop One-Arm Pi Robot
4.12 Testing your Model with the ArbotiX Simulator
    4.12.1 A fake Box Robot
    4.12.2 A fake Pi Robot
4.13 Creating your own Robot Description Package
    4.13.1 Using rosbuild
    4.13.2 Using catkin
    4.13.3 Copying files from the rbx2_description package
    4.13.4 Creating a test launch file
5. Controlling Dynamixel Servos: Take 2
5.1 Installing the ArbotiX Packages
5.2 Launching the ArbotiX Nodes
5.3 The ArbotiX Configuration File
5.4 Testing the ArbotiX Joint Controllers in Fake Mode
5.5 Testing the ArbotiX Joint Controllers with Real Servos
5.6 Relaxing All Servos
5.7 Enabling or Disabling All Servos
6. Robot Diagnostics
6.1 The DiagnosticStatus Message
6.2 The Analyzer Configuration File
6.3 Monitoring Dynamixel Servo Temperatures
    6.3.1 Monitoring the servos for a pan-and-tilt head
    6.3.2 Viewing messages on the /diagnostics topic
    6.3.3 Protecting servos by monitoring the /diagnostics topic
6.4 Monitoring a Laptop Battery
6.5 Creating your Own Diagnostics Messages
6.6 Monitoring Other Hardware States
7. Dynamic Reconfigure
7.1 Adding Dynamic Parameters to your own Nodes
    7.1.1 Creating the .cfg file
    7.1.2 Making the .cfg file executable
    7.1.3 Configuring the CMakeLists.txt file
    7.1.4 Building the package
7.2 Adding Dynamic Reconfigure Capability to the Battery Simulator Node
7.3 Adding Dynamic Reconfigure Client Support to a ROS Node
7.4 Dynamic Reconfigure from the Command Line
8. Multiplexing Topics with mux & yocs
8.1 Configuring Launch Files to Use mux Topics
8.2 Testing mux with the Fake TurtleBot
8.3 Switching Inputs using mux Services
8.4 A ROS Node to Prioritize mux Inputs
8.5 The YOCS Controller from Yujin Robot
    8.5.1 Adding input sources
9. Head Tracking in 3D
9.1 Tracking a Fictional 3D Target
9.2 Tracking a Point on the Robot
9.3 The 3D Head Tracking Node
    9.3.1 Real or fake head tracking
    9.3.2 Projecting the target onto the camera plane
9.4 Head Tracking with Real Servos
    9.4.1 Real servos and fake target
    9.4.2 Real servos, real target
    9.4.3 The nearest_cloud.py node and launch file
10. Detecting and Tracking AR Tags
10.1 Installing and Testing the ar_track_alvar Package
    10.1.1 Creating your own AR Tags
    10.1.2 Generating and printing the AR tags
    10.1.3 Launching the camera driver and ar_track_alvar node
    10.1.4 Testing marker detection
    10.1.5 Understanding the /ar_pose_marker topic
    10.1.6 Viewing the markers in RViz
10.2 Accessing AR Tag Poses in your Programs
    10.2.1 The ar_tags_cog.py script
    10.2.2 Tracking the tags with a pan-and-tilt head
10.3 Tracking Multiple Tags using Marker Bundles
10.4 Following an AR Tag with a Mobile Robot
    10.4.1 Running the AR follower script on a TurtleBot
10.5 Exercise: Localization using AR Tags
11. Arm Navigation using MoveIt!
11.1 Do I Need a Real Robot with a Real Arm?
11.2 Degrees of Freedom
11.3 Joint Types
11.4 Joint Trajectories and the Joint Trajectory Action Controller
11.5 Forward and Inverse Arm Kinematics
11.6 Numerical versus Analytic Inverse Kinematics
11.7 The MoveIt! Architecture
11.8 Installing MoveIt!
11.9 Creating a Static URDF Model for your Robot
11.10 Running the MoveIt! Setup Assistant
    11.10.1 Load the robot's URDF model
    11.10.2 Generate the collision matrix
    11.10.3 Add the base_odom virtual joint
    11.10.4 Adding the right arm planning group
    11.10.5 Adding the right gripper planning group
    11.10.6 Defining robot poses
    11.10.7 Defining end effectors
    11.10.8 Defining passive joints
    11.10.9 Generating the configuration files
11.11 Configuration Files Created by the MoveIt! Setup Assistant
    11.11.1 The SRDF file (robot_name.srdf)
    11.11.2 The fake_controllers.yaml file
    11.11.3 The joint_limits.yaml file
    11.11.4 The kinematics.yaml file
11.12 The move_group Node and Launch File
11.13 Testing MoveIt! in Demo Mode
    11.13.1 Exploring additional features of the Motion Planning plugin
    11.13.2 Re-running the Setup Assistant at a later time
11.14 Testing MoveIt! from the Command Line
11.15 Determining Joint Configurations and End Effector Poses
11.16 Using the ArbotiX Joint Trajectory Action Controllers
    11.16.1 Testing the ArbotiX joint trajectory action controllers in simulation
    11.16.2 Testing the ArbotiX joint trajectory controllers with real servos
11.17 Configuring MoveIt! Joint Controllers
    11.17.1 Creating the controllers.yaml file
    11.17.2 Creating the controller manager launch file
11.18 The MoveIt! API
11.19 Forward Kinematics: Planning in Joint Space
11.20 Inverse Kinematics: Planning in Cartesian Space
11.21 Pointing at or Reaching for a Visual Target
11.22 Setting Constraints on Planned Trajectories
    11.22.1 Executing Cartesian Paths
    11.22.2 Setting other path constraints
11.23 Adjusting Trajectory Speed
11.24 Adding Obstacles to the Planning Scene
11.25 Attaching Objects and Tools to the Robot
11.26 Pick and Place
11.27 Adding a Sensor Controller
11.28 Running MoveIt! on a Real Arm
    11.28.1 Creating your own launch files and scripts
    11.28.2 Running the robot's launch files
    11.28.3 Forward kinematics on a real arm
    11.28.4 Inverse kinematics on a real arm
    11.28.5 Cartesian paths on a real arm
    11.28.6 Pick-and-place on a real arm
    11.28.7 Pointing at or reaching for a visual target
11.29 Creating a Custom Fast IK Plugin
12. Gazebo: Simulating Worlds and Robots
12.1 Installing Gazebo
12.2 Hardware Graphics Acceleration
12.3 Installing the ROS Gazebo Packages
12.4 Installing the Kobuki ROS Packages
12.5 Installing the UBR-1 Files
12.6 Using the Gazebo GUI
12.7 Missing Model Bug in Gazebo 1.9
12.8 Testing the Kobuki Robot in Gazebo
    12.8.1 Accessing simulated sensor data
    12.8.2 Adding safety control to the Kobuki
    12.8.3 Running the nav_square.py script from Volume 1
12.9 Loading Other Worlds and Objects
12.10 Testing the UBR-1 Robot in Gazebo
    12.10.1 UBR-1 joint trajectories
    12.10.2 The UBR-1 and MoveIt!
12.11 Real Pick-and-Place using the UBR-1 Perception Pipeline
    12.11.1 Limitations of depth cameras
    12.11.2 Running the demo
    12.11.3 Understanding the real_pick_and_place.py script
12.12 Running Gazebo Headless + RViz
13. Rosbridge: Building a Web GUI for your Robot
13.1 Installing the rosbridge Packages
13.2 Installing the mjpeg_server Package
13.3 Installing a Simple Web Server (mini-httpd)
13.4 Starting mini-httpd, rosbridge and mjpeg_server
13.5 A Simple rosbridge HTML/Javascript GUI
13.6 Testing the GUI with a Fake TurtleBot
13.7 Testing the GUI with a Real Robot
13.8 Viewing the Web GUI on another Device on your Network
13.9 Using the Browser Debug Console
13.10 Understanding the Simple GUI
    13.10.1 The HTML layout: simple_gui.html
    13.10.2 The JavaScript code: simple_gui.js
13.11 A More Advanced GUI using jQuery, jqWidgets and KineticJS
13.12 Rosbridge Summary
Appendix: Plug and Play USB Devices for ROS: Creating udev Rules
13.13 Adding yourself to the dialout Group
13.14 Determining the Serial Number of a Device
13.15 UDEV Rules
13.16 Testing a UDEV Rule
13.17 Using a UDEV Device Name in a ROS Configuration File