A quick look at rospy


Before we start, let's first get to know the common sensor messages in ROS (sensor_msgs).

1 Imu.msg: IMU (gyro/accelerometer) data, with linear acceleration in m/s^2 and angular velocity in rad/s
(1) Header header: timestamp and coordinate frame
(2) geometry_msgs/Quaternion orientation: orientation as a quaternion
(3) float64[9] orientation_covariance: orientation covariance
(4) geometry_msgs/Vector3 angular_velocity: angular velocity
(5) float64[9] angular_velocity_covariance: angular velocity covariance
(6) geometry_msgs/Vector3 linear_acceleration: linear acceleration
(7) float64[9] linear_acceleration_covariance: linear acceleration covariance
2 Joy.msg: joystick/gamepad output data
(1) Header header: timestamp
(2) float32[] axes: axis readings
(3) int32[] buttons: button states
3 LaserScan.msg: a single planar laser scan
(1) Header header: timestamp of the first ray in the scan; in frame_id, angles are measured counter-clockwise around the z axis, with 0 along the positive x axis
(2) float32 angle_min, float32 angle_max, float32 angle_increment: start angle, end angle, and angular increment between measurements (rad)
(3) float32 time_increment: time between individual measurements (s), used to interpolate the 3D points when the scanner is moving
(4) float32 scan_time: time between two full scans (s)
(5) float32 range_min, float32 range_max: minimum and maximum range of the rangefinder (m)
(6) float32[] ranges: range data (m)
(7) float32[] intensities: intensity data (may be empty if the device does not report it)
4 JointState.msg: joint state (for torque-controlled joints)
(1) std_msgs/Header header
(2) string[] name: joint names
(3) float64[] position: positions (m or rad)
(4) float64[] velocity: velocities (m/s or rad/s)
(5) float64[] effort: efforts (N·m or N)
Source: http://blog.csdn.net/ji0525084/article/details/14134367
Since I only use plain laser scanning, I'm skipping the vision/image messages for now.
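Because LaserScan is the message I actually care about, here is a minimal rospy subscriber sketch to make the fields above concrete. The topic name '/scan' is an assumption and depends on your laser driver.

#!/usr/bin/env python
# Minimal sketch: subscribe to a LaserScan topic and report the closest valid reading.
# The topic name '/scan' is an assumption; it depends on the laser driver in use.
import rospy
from sensor_msgs.msg import LaserScan

def scan_callback(msg):
    # Keep only readings inside the sensor's valid range [range_min, range_max]
    valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
    if valid:
        rospy.loginfo("closest obstacle: %.2f m (out of %d beams)", min(valid), len(msg.ranges))

if __name__ == '__main__':
    rospy.init_node('scan_listener')
    rospy.Subscriber('/scan', LaserScan, scan_callback)
    rospy.spin()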

Next, let's get to know the geometry_msgs message types

Point: a point in 3D space
float64 x, float64 y, float64 z
Point32: float32 x, float32 y, float32 z. Use Point in general; Point32 is intended for large point clouds, where the smaller type saves space.
PointStamped
(1) std_msgs/Header header: coordinate frame and timestamp (2) geometry_msgs/Point point: the point
Polygon: a polygon, geometry_msgs/Point32[] points
PolygonStamped
(1) std_msgs/Header header
(2) geometry_msgs/Polygon polygon
Pose: a pose (1) geometry_msgs/Point position: position (2) geometry_msgs/Quaternion orientation: orientation
Pose2D: a pose in the 2D plane, float64 x, float64 y, float64 theta
PoseArray: an array of poses
(1) std_msgs/Header header
(2) geometry_msgs/Pose[] poses
PoseStamped: a stamped pose
(1) std_msgs/Header header
(2) geometry_msgs/Pose pose
PoseWithCovariance
(1) geometry_msgs/Pose pose (2) float64[36] covariance: 6x6 covariance over (x, y, z, rotation about X, rotation about Y, rotation about Z)
PoseWithCovarianceStamped
(1) std_msgs/Header header (2) geometry_msgs/PoseWithCovariance pose
Quaternion: a rotation as a quaternion, float64 x, float64 y, float64 z, float64 w
QuaternionStamped
Transform: a transform between coordinate frames
(1) geometry_msgs/Vector3 translation: translation vector (2) geometry_msgs/Quaternion rotation: rotation quaternion
TransformStamped
Twist: velocity
(1) geometry_msgs/Vector3 linear: linear velocity (2) geometry_msgs/Vector3 angular: angular velocity
TwistStamped
TwistWithCovariance
TwistWithCovarianceStamped
Vector3: a vector, float64 x, float64 y, float64 z
Vector3Stamped
Wrench: a force and torque pair
(1) geometry_msgs/Vector3 force: force (2) geometry_msgs/Vector3 torque: torque
WrenchStamped
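
As a quick illustration of using these types from rospy, here is a minimal sketch that fills a PoseStamped and publishes a Twist. The frame id 'map' and the topic '/cmd_vel' are assumptions for illustration only.

#!/usr/bin/env python
# Minimal sketch: populate a PoseStamped and publish a Twist built from geometry_msgs types.
# The frame id 'map' and the topic '/cmd_vel' are illustrative assumptions.
import rospy
from geometry_msgs.msg import PoseStamped, Twist, Quaternion

rospy.init_node('geometry_msgs_demo')

goal = PoseStamped()
goal.header.stamp = rospy.Time.now()
goal.header.frame_id = 'map'
goal.pose.position.x = 1.0
goal.pose.position.y = 0.5
goal.pose.orientation = Quaternion(0.0, 0.0, 0.0, 1.0)  # identity rotation (x, y, z, w)

cmd = Twist()
cmd.linear.x = 0.15   # m/s forward
cmd.angular.z = 0.3   # rad/s counter-clockwise

pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rospy.sleep(1.0)      # give the publisher time to connect before publishing once
pub.publish(cmd)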

While we're at it, let's look at the nav_msgs message types

1. Message types

1 GridCells: grid cells
(1) std_msgs/Header header: timestamp and coordinate frame
(2) float32 cell_width, float32 cell_height: cell width and height
(3) geometry_msgs/Point[] cells: array of cells
2 MapMetaData: metadata for an occupancy grid map
(1) time map_load_time: time the map was loaded
(2) float32 resolution: resolution (m/cell)
(3) uint32 width, uint32 height: map dimensions in cells
(4) geometry_msgs/Pose origin: real-world pose of the map's cell (0, 0) (m, m, rad)
3 OccupancyGrid: a 2D occupancy grid map
(1) Header header
(2) nav_msgs/MapMetaData info: map metadata
(3) int8[] data: occupancy values, probabilities in the range [0, 100], with -1 for unknown
4 Odometry: estimated pose and velocity
(1) std_msgs/Header header: coordinate frame of the pose
(2) string child_frame_id: coordinate frame of the twist
(3) geometry_msgs/PoseWithCovariance pose: pose
(4) geometry_msgs/TwistWithCovariance twist: velocity
5 Path: a path
(1) Header header
(2) geometry_msgs/PoseStamped[] poses: array of stamped poses that make up the path
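
A short sketch of reading an OccupancyGrid from Python. The '/map' topic name and the occupancy threshold of 65 are assumptions (map_server publishes '/map' by convention).

#!/usr/bin/env python
# Minimal sketch: read one OccupancyGrid from '/map' and report its size and contents.
import rospy
from nav_msgs.msg import OccupancyGrid

rospy.init_node('map_inspector')
grid = rospy.wait_for_message('/map', OccupancyGrid)

info = grid.info
occupied = sum(1 for cell in grid.data if cell >= 65)   # values 0..100, -1 means unknown
unknown = sum(1 for cell in grid.data if cell == -1)
rospy.loginfo("map %dx%d cells at %.3f m/cell, origin (%.2f, %.2f)",
              info.width, info.height, info.resolution,
              info.origin.position.x, info.origin.position.y)
rospy.loginfo("occupied cells (>=65): %d, unknown: %d", occupied, unknown)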
2. Services
1 GetMap.srv: request the map
Request: empty (no fields)
Response: nav_msgs/OccupancyGrid map
2 GetPlan.srv: plan a path from the current position to a goal
Request:
(1) geometry_msgs/PoseStamped start: start pose
(2) geometry_msgs/PoseStamped goal: goal pose
(3) float32 tolerance: if the goal cannot be reached, how far from it the end of the plan is allowed to be
Response:
nav_msgs/Path plan: the planned path
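
And a minimal sketch of calling GetMap as a service client. The service name 'static_map' is an assumption (it is the name map_server conventionally advertises).

#!/usr/bin/env python
# Minimal sketch: call the GetMap service and inspect the returned OccupancyGrid.
import rospy
from nav_msgs.srv import GetMap

rospy.init_node('get_map_client')
rospy.wait_for_service('static_map')
get_map = rospy.ServiceProxy('static_map', GetMap)
resp = get_map()                      # GetMap takes no request arguments
grid = resp.map                       # nav_msgs/OccupancyGrid
rospy.loginfo("received a %dx%d map", grid.info.width, grid.info.height)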

Now we can take a look at rospy

There really isn't much to it.
Python's math module: http://blog.csdn.net/iamaiearner/article/details/9381347
math.copysign(x, y): returns a value with the magnitude of x and the sign of y
math.sqrt(x): returns the square root of x
math.pow(x, y): returns x raised to the power y
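
A quick check of these three functions, since they do all the distance and direction math in the demo:

# Quick check of the math functions used below
from math import copysign, sqrt, pow

print(copysign(0.15, -3.0))   # -0.15: magnitude of x, sign of y
print(sqrt(2.0))              # 1.4142...
print(pow(3.0, 2))            # 9.0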

Let's walk through the demo program below, which checks the straight-line distance travelled (a linear odometry calibration script).

#!/usr/bin/env python

import rospy 
from geometry_msgs.msg import Twist, Point  # import the Twist (velocity) and Point message types
from math import copysign, sqrt, pow
import tf
# Calibration class
class CalibrateLinear():
    def __init__(self):
        # Give the node a name
        rospy.init_node('calibrate_linear', anonymous=False)
        # Set rospy to execute a shutdown function when terminating the script
        rospy.on_shutdown(self.shutdown)

        # How fast will we check the odometry values? (rate for reading the odometry)
        self.rate = rospy.get_param('~rate', 20)
        r = rospy.Rate(self.rate)

        # Set the distance to travel 
        self.test_distance = rospy.get_param('~test_distance', 1.0) # meters
        self.speed = rospy.get_param('~speed', 0.15) # meters per second
        self.tolerance = rospy.get_param('~tolerance', 0.01) # meters
        self.odom_linear_scale_correction = rospy.get_param('~odom_linear_scale_correction', 1.0)
        self.start_test = rospy.get_param('~start_test', True)

        # Publisher to control the robot's speed
        self.cmd_vel = rospy.Publisher('/cmd_vel', Twist, queue_size=5)


        # The base frame is base_footprint for the TurtleBot but base_link for Pi Robot
        self.base_frame = rospy.get_param('~base_frame', '/base_footprint')

        # The odom frame is usually just /odom
        self.odom_frame = rospy.get_param('~odom_frame', '/odom')

        # Initialize the tf listener
        self.tf_listener = tf.TransformListener()

        # Give tf some time to fill its buffer
        rospy.sleep(2)

        # Make sure we see the odom and base frames
        self.tf_listener.waitForTransform(self.odom_frame, self.base_frame, rospy.Time(), rospy.Duration(60.0))        

        rospy.loginfo("Bring up rqt_reconfigure to control the test.")

        self.position = Point()

        # Get the starting position from the tf transform between the odom and base frames
        self.position = self.get_position()

        x_start = self.position.x
        y_start = self.position.y

        move_cmd = Twist()

        while not rospy.is_shutdown():
            # Stop the robot by default
            move_cmd = Twist()

            if self.start_test:
                # Get the current position from the tf transform between the odom and base frames
                self.position = self.get_position()

                # Compute the Euclidean distance from the target point
                distance = sqrt(pow((self.position.x - x_start), 2) +
                                pow((self.position.y - y_start), 2))

                # Correct the estimated distance by the correction factor
                distance *= self.odom_linear_scale_correction

                # How close are we?
                error =  distance - self.test_distance

                # Are we close enough?
                if not self.start_test or abs(error) <  self.tolerance:
                    self.start_test = False
                    params = {'start_test': False}
                    rospy.loginfo(params)
                else:
                    # If not, move in the appropriate direction
                    move_cmd.linear.x = copysign(self.speed, -1 * error)
            else:
                self.position = self.get_position()
                x_start = self.position.x
                y_start = self.position.y

            self.cmd_vel.publish(move_cmd)
            r.sleep()

        # Stop the robot
        self.cmd_vel.publish(Twist())

    def get_position(self):
        # Get the current transform between the odom and base frames
        try:
            (trans, rot)  = self.tf_listener.lookupTransform(self.odom_frame, self.base_frame, rospy.Time(0))
        except (tf.Exception, tf.ConnectivityException, tf.LookupException):
            rospy.loginfo("TF Exception")
            return

        return Point(*trans)

    def shutdown(self):
        # Always stop the robot when shutting down the node
        rospy.loginfo("Stopping the robot...")
        self.cmd_vel.publish(Twist())
        rospy.sleep(1)

if __name__ == '__main__':
    try:
        CalibrateLinear()
        rospy.spin()
    except rospy.ROSInterruptException:
        rospy.loginfo("Calibration terminated.")