Most online tutorials for running ORB_SLAM with a depth camera use a Kinect as the sensor. One of Gao Bo's (高博) blog posts runs orb_slam2 under ROS with a Kinect2. I don't have a Kinect at hand, but I figure everyone at least has an Android phone, so I use the phone's camera as a monocular sensor: capture images on the phone, send them into the ROS system, and let orb_slam build the map.
1. Installing orb-slam under ROS
Open https://github.com/raulmur/ORB_SLAM and read the README.md carefully; it covers what ORB_SLAM is, the dependencies, and the installation steps.
1. Installation steps
Building and running ORB_SLAM depends on: OpenCV 2.4, Eigen3 (3.1.0 or newer), boost, blas, lapack, g2o (no download needed, bundled with the source), DBoW2 (no download needed, bundled with the source).
1.1 Install the dependencies
$ sudo apt-get install libeigen3-dev
Follow the steps below; this normally goes smoothly.
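Since the Eigen version matters later (3.1.0 or newer is required), it can be worth checking which version the distro package installed. Eigen stores its version in three `#define` macros in its `Macros.h` header (on Ubuntu, typically under `/usr/include/eigen3/` — that path is an assumption for your system). A small sketch that parses them:

```python
import re

def parse_eigen_version(header_text):
    """Return (world, major, minor) parsed from the contents of Eigen's Macros.h."""
    version = {}
    for part in ("WORLD", "MAJOR", "MINOR"):
        m = re.search(r"#define EIGEN_%s_VERSION (\d+)" % part, header_text)
        version[part] = int(m.group(1))
    return (version["WORLD"], version["MAJOR"], version["MINOR"])

# Sample macro lines as they appear in the header (values here are illustrative):
sample = """
#define EIGEN_WORLD_VERSION 3
#define EIGEN_MAJOR_VERSION 2
#define EIGEN_MINOR_VERSION 0
"""
print(parse_eigen_version(sample))  # prints (3, 2, 0), which satisfies the >= 3.1.0 requirement
```

In practice you would read the real header file instead of the `sample` string.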
Installation
Make sure you have installed ROS and all library dependencies (boost, eigen3, opencv, blas, lapack).
Clone the repository:
git clone https://github.com/raulmur/ORB_SLAM.git ORB_SLAM
Add the path where you cloned ORB-SLAM to the ROS_PACKAGE_PATH environment variable. To do this, modify your .bashrc and add the following line at the bottom (replace PATH_TO_PARENT_OF_ORB_SLAM):
export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:PATH_TO_PARENT_OF_ORB_SLAM
Build g2o. Go into Thirdparty/g2o/ and execute:
mkdir build
cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make
Tip: To achieve the best performance on your computer, set your favorite compilation flags in lines 61 and 62 of Thirdparty/g2o/CMakeLists.txt (by default -O3 -march=native).
Build DBoW2. Go into Thirdparty/DBoW2/ and execute:
mkdir build
cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make
Tip: Set your favorite compilation flags in lines 4 and 5 of Thirdparty/DBoW2/CMakeLists.txt (by default -O3 -march=native).
Build ORB_SLAM. If you use ROS Indigo, remove the dependency on opencv2 in the manifest.xml. In the ORB_SLAM root execute:
mkdir build
cd build
cmake .. -DROS_BUILD_TYPE=Release
make
Tip: Set your favorite compilation flags in lines 12 and 13 of ./CMakeLists.txt (by default -O3 -march=native).
The cmake .. -DROS_BUILD_TYPE=Release step may fail with an error.
Fix: add the following line at the very end of your .bashrc file (it must be at the end):
export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:PATH_TO_PARENT_OF_ORB_SLAM
Running make may then fail with an Eigen-related error: newer Eigen versions changed the SparseMatrix::Index type, so the PermutationMatrix typedef in g2o's LinearSolverEigen no longer compiles. Change:

template <typename MatrixType>
class LinearSolverEigen: public LinearSolver<MatrixType>
{
public:
  typedef Eigen::SparseMatrix<double, Eigen::ColMajor> SparseMatrix;
  typedef Eigen::Triplet<double> Triplet;
  typedef Eigen::PermutationMatrix<Eigen::Dynamic, Eigen::Dynamic, SparseMatrix::Index> PermutationMatrix;

so that the last typedef uses int as the index type instead of SparseMatrix::Index:

  typedef Eigen::PermutationMatrix<Eigen::Dynamic, Eigen::Dynamic, int> PermutationMatrix;

Run make again and the build succeeds.
Usage
See section 5 to run the Example Sequence.
Launch ORB-SLAM from the terminal (roscore should have been already executed):
rosrun ORB_SLAM ORB_SLAM PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE
You have to provide the path to the ORB vocabulary and to the settings file. The paths must be absolute or relative to the ORB_SLAM directory.
We already provide the vocabulary file we use, in ORB_SLAM/Data/ORBvoc.txt.tar.gz. Uncompress the file, as it will be loaded much faster.
The last processed frame is published to the topic /ORB_SLAM/Frame. You can visualize it using image_view:
rosrun image_view image_view image:=/ORB_SLAM/Frame _autosize:=true
The map is published to the topic /ORB_SLAM/Map; the current camera pose and the global world coordinate origin are sent through /tf in the frames /ORB_SLAM/Camera and /ORB_SLAM/World respectively. Run rviz to visualize the map:
in ROS Fuerte:
rosrun rviz rviz -d Data/rviz.vcg
in ROS Groovy or a newer version:
rosrun rviz rviz -d Data/rviz.rviz
ORB_SLAM will receive the images from the topic /camera/image_raw. You can now play your rosbag or start your camera node. If you have a sequence of individual image files, you will need to generate a bag from them. We provide a tool to do that: https://github.com/raulmur/BagFromImages.
Tip: Use a roslaunch to launch ORB_SLAM, image_view and rviz from just one instruction. We provide an example:
in ROS Fuerte:
roslaunch ExampleFuerte.launch
in ROS Groovy or a newer version:
roslaunch ExampleGroovyOrNewer.launch
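Such a combined launch file can be sketched roughly as below. The node arguments and file layout here are my assumptions, not the repository's actual file — compare against the ExampleGroovyOrNewer.launch that ships with ORB_SLAM:

```xml
<!-- Hypothetical sketch of a launch file starting ORB_SLAM, image_view and rviz together. -->
<launch>
  <!-- ORB_SLAM node; vocabulary and settings paths are relative to the package directory -->
  <node pkg="ORB_SLAM" type="ORB_SLAM" name="ORB_SLAM"
        args="Data/ORBvoc.txt Data/Settings.yaml" cwd="node"/>
  <!-- Viewer for the last processed frame -->
  <node pkg="image_view" type="image_view" name="image_view">
    <remap from="image" to="/ORB_SLAM/Frame"/>
    <param name="autosize" value="true"/>
  </node>
  <!-- rviz with the map configuration -->
  <node pkg="rviz" type="rviz" name="rviz" args="-d Data/rviz.rviz"/>
</launch>
```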
5. Example Sequence
We provide the settings and the rosbag of an example sequence recorded in our lab. In this sequence you will see a loop closure and two relocalisations after big viewpoint changes.
Download the rosbag file:
http://webdiis.unizar.es/~raulmur/orbslam/downloads/Example.bag.tar.gz
Alternative link: https://drive.google.com/file/d/0B8Qa2__-sGYgRmozQ21oRHhUZWM/view?usp=sharing
Uncompress the file.
Launch ORB_SLAM with the settings for the example sequence. You should have already uncompressed the vocabulary file (/Data/ORBvoc.txt.tar.gz).
in ROS Fuerte:
roslaunch ExampleFuerte.launch
in ROS Groovy or newer versions:
roslaunch ExampleGroovyHydro.launch
Once the ORB vocabulary has been loaded, play the rosbag (press space to start):
rosbag play --pause Example.bag
6. The Settings File
ORB_SLAM reads the camera calibration and setting parameters from a YAML file. We provide an example in Data/Settings.yaml, where you will find all parameters and their description. We use the camera calibration model of OpenCV.
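To make those calibration parameters concrete, here is a small sketch of the OpenCV pinhole-plus-distortion projection that they feed (fx, fy, cx, cy for the intrinsics; k1, k2, p1, p2 for distortion). The numeric values are illustrative placeholders, not a real calibration:

```python
# Sketch of the OpenCV camera model used by the settings file.
def project(X, Y, Z, fx, fy, cx, cy, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    """Project a 3D point in the camera frame to pixel coordinates."""
    x, y = X / Z, Y / Z                      # normalized image coordinates
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2      # radial distortion factor
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return fx * xd + cx, fy * yd + cy        # apply the intrinsics

# With zero distortion, a point on the optical axis lands at the principal point:
u, v = project(0.0, 0.0, 1.0, fx=517.3, fy=516.5, cx=318.6, cy=255.3)
# u == 318.6, v == 255.3
```

This is why the calibration must match your actual camera: with wrong intrinsics, every feature is reprojected to the wrong pixel and tracking degrades.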
Please make sure you write and use your own settings file for your camera (copy the example file and modify the calibration).
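As an illustration, the camera block of such a settings file might look like the sketch below. The key names follow the bundled example; every numeric value is a placeholder from a hypothetical calibration — substitute your own results, and check Data/Settings.yaml for the full parameter list (ORB extractor settings, viewer settings, etc.):

```yaml
%YAML:1.0

# Camera intrinsics (pinhole model) -- placeholder values, replace with your calibration
Camera.fx: 517.3
Camera.fy: 516.5
Camera.cx: 318.6
Camera.cy: 255.3

# OpenCV distortion coefficients -- placeholder values
Camera.k1: 0.26
Camera.k2: -0.95
Camera.p1: -0.0005
Camera.p2: 0.0018

# Camera frames per second
Camera.fps: 30.0
```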
Testing orb_slam
Run in the terminal:
$ roscore
ORB_SLAM$ roslaunch ./ExampleGroovyOrNewer.launch   (from the ORB_SLAM directory)
$ rosbag play --pause Example.bag
Once it is running, you should see the current frame and the map being built.
Once all of the above works without problems, the next step is to stop using the bag file and instead capture images with an Android phone camera and publish them into the ROS environment.
For sending images from an Android phone to ROS, see the blog post https://www.cnblogs.com/hitcm/p/5616364.html.
Install the apk from this repository:
git clone https://github.com/hitcm/Android_Camera-IMU.git
For the visualization later on, you also need to run: sudo apt-get install ros-indigo-imu-tools
1. First make sure the PC and the Android phone are on the same LAN, then start roscore on the ROS side. Open the app on the phone and, under IP Port, change the IP address to the PC's IP address; the port does not need to be changed.
2. After the change, tap Connect; once connected, you enter the camera view.
3. On the PC, run rostopic list; the camera and IMU data topics appear — the first three topics are the ones we need.
4. cd into the directory cloned above and run the launch file provided by that repository: roslaunch android_cam-imu.launch
The result looks like this:
OK, at this point ROS can capture images from the phone.
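One wiring detail to watch: ORB_SLAM subscribes to /camera/image_raw, while the phone driver publishes under whatever topic names rostopic list showed. If the two differ, a remap in the launch file bridges them. The /android/image_raw name below is a made-up placeholder — use the actual topic from your rostopic list output:

```xml
<!-- Hypothetical fragment: feed the phone's image topic into ORB_SLAM via a remap. -->
<node pkg="ORB_SLAM" type="ORB_SLAM" name="ORB_SLAM"
      args="Data/ORBvoc.txt Data/Settings.yaml" cwd="node">
  <!-- ORB_SLAM subscribes to /camera/image_raw; redirect that to the phone's topic -->
  <remap from="/camera/image_raw" to="/android/image_raw"/>
</node>
```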
Next, use the phone as a monocular camera and get orb-slam running:
1. In a terminal, start: roslaunch android_cam-imu.launch
2. On the phone, start the apk