The goal of this experiment is to capture mouse coordinates, publish a target point whenever the left or right mouse button is clicked, and have Baxter track that target point.
terminal 1 – start the action server:
in the folder containing baxter.sh
Check this machine's IP, and update your_ip inside baxter.sh accordingly.
./baxter.sh, or run 'baxter' in the terminal --> see details in ~/.bashrc
rosrun baxter_tools enable_robot -e    # enable the robot
rosrun baxter_tools enable_robot -s    # check the robot's state
rosrun baxter_interface joint_trajectory_action_server.py    # start the joint trajectory action server
terminal 2 – start the Kinect video stream and capture mouse clicks:
./baxter.sh
rosrun kinect2_bridge kinect2_bridge
rosrun kinect2_viewer click_PublishPosition
Here, click_PublishPosition.cpp is based on Viewer.cpp with the following modifications:
1. Create onMouse(), an external callback that samples the mouse coordinates:
void onMouse(int event, int x, int y, int flags, void *ustc) {
    mouseX = x;
    mouseY = y;
    mouseBtnType = event;
}
2. Register the callback in imageViewer(), so that the constantly changing mouse position is picked up:
cv::setMouseCallback(window_name, onMouse, nullptr);
3. The sampled mouse coordinates must be published on topics so that the robot-control script can subscribe to them, so create the Publishers:
ros::Publisher leftBtnPointPub = nh.advertise<geometry_msgs::PointStamped>
    ("/kinect2/click_point/left", 1);
ros::Publisher rightBtnPointPub = nh.advertise<geometry_msgs::PointStamped>
    ("/kinect2/click_point/right", 1);
4. Modify imageViewer(), the main color-image display function, so that it publishes the 3D point under the mouse:
void imageViewer()
{
    cv::Mat color, depth, depthDisp, combined;
    std::chrono::time_point<std::chrono::high_resolution_clock> start, now;
    double fps = 0;
    size_t frameCount = 0;
    std::ostringstream oss;
    std::ostringstream ossXYZ;
    const cv::Point pos(5, 15);
    const cv::Scalar colorText = CV_RGB(255, 0, 0);
    const double sizeText = 0.5;
    const int lineText = 1;
    const int font = cv::FONT_HERSHEY_SIMPLEX;