CU-ICAR
Autonomy
Dr. Krovi's course, presentations
TurtleBot3: https://www.turtlebot.com/
costmap: http://wiki.ros.org/costmap_2d
Exploring TurtleBot3 and ROS:
Ubuntu 16.04, install ROS Kinetic, assemble the TurtleBot, and install ROS on the TurtleBot3.
SLAM simulations: gmapping in a Gazebo environment, visualization in RViz and the RQT graph.
Mover package to move the TurtleBot3 in a straight line, simulated in Gazebo.
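A minimal sketch of the straight-line mover logic (names and rates are illustrative, not the team's actual package). The command-generation step is shown as a pure function; in the real node each command would be packed into a geometry_msgs/Twist and published on /cmd_vel:

```python
def straight_line_cmds(distance_m, speed=0.1, rate_hz=10):
    """Return the (linear_x, angular_z) commands a straight-line mover
    would publish, one per control tick, ending with a stop command."""
    ticks = int(distance_m / speed * rate_hz)   # ticks needed to cover the distance
    cmds = [(speed, 0.0)] * ticks               # drive straight: zero angular velocity
    cmds.append((0.0, 0.0))                     # final stop command
    return cmds

# In the ROS node each tuple becomes a geometry_msgs/Twist published on
# /cmd_vel at rate_hz (e.g. via rospy.Publisher and rospy.Rate).
```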
SLAM: turtlebot3_world.launch, gmapping method.
Raspberry Pi camera.
Line tracking:
Challenges faced: light reflecting off the line caused the camera to miss the line.
Blob tracking:
Detect the R/B color by RGB intensity and a YUV threshold, set a bounding box around the detected color, and determine the centroid.
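A toy sketch of the blob-tracking step described above, in pure Python instead of the actual camera pipeline (the thresholds are illustrative): pixels whose red channel clearly dominates are kept, then a bounding box and centroid are computed over them.

```python
def detect_red_blob(image, r_min=150, margin=50):
    """image: 2D list of (R, G, B) tuples. Returns (bbox, centroid) where
    bbox = (row_min, col_min, row_max, col_max), or (None, None) if no blob."""
    hits = [(r, c) for r, row in enumerate(image)
            for c, (R, G, B) in enumerate(row)
            if R >= r_min and R - max(G, B) >= margin]  # red clearly dominant
    if not hits:
        return None, None
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    bbox = (min(rows), min(cols), max(rows), max(cols))
    centroid = (sum(rows) / len(rows), sum(cols) / len(cols))
    return bbox, centroid
```

The centroid is what the follower steers toward; in the real system the same idea runs per-frame on the Raspberry Pi camera images.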
Demo: Mira
Line following, blob tracking, leg detection, obstacle detection.
Open ground: ambient light affects the color.
Leg detection: use the people_detection package from Fetch Robotics to determine the shape of legs from 2D LiDAR scans.
Carrot controller: the (x, y) of the detected leg is used as the goal point.
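A hedged sketch of the carrot controller idea: the detected leg position (x, y) in the robot frame is the "carrot"; the robot turns toward it and drives forward until it is within a standoff distance. The gains, standoff, and velocity cap here are illustrative, not the team's values:

```python
import math

def carrot_cmd(x, y, standoff=0.5, k_lin=0.5, k_ang=1.0, v_max=0.22):
    """Return (linear_x, angular_z) that steers toward the point (x, y)
    in the robot frame, stopping at the standoff distance."""
    dist = math.hypot(x, y)
    heading = math.atan2(y, x)                  # bearing to the carrot
    angular = k_ang * heading                   # turn to face the carrot
    linear = k_lin * max(0.0, dist - standoff)  # zero once inside the standoff
    return min(linear, v_max), angular          # cap forward speed
```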
Terrain Obstacle Detection and Analysis using LIDAR: https://pdfs.semanticscholar.org/a795/2062e73af37a9cdb21e6581e8f9a2ab786ee.pdf
Obstacle avoidance: get a map of the environment via SLAM.
Motion planner: A*
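A minimal grid A* sketch for the planner mentioned above. The real planner runs on the SLAM-built costmap; here the map is simplified to a 0/1 occupancy grid with 4-connected moves and a Manhattan heuristic:

```python
import heapq

def astar(grid, start, goal):
    """4-connected A* on a 2D occupancy grid (0 = free, 1 = occupied).
    Returns the path as a list of (row, col) cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]    # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:                   # already expanded with a better cost
            continue
        came_from[cur] = parent
        if cur == goal:                        # walk parents back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                if g + 1 < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = g + 1
                    heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None
```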
Code: Python
Tune the robot base footprint, inflation radius, and cost scaling factor in the navigation stack.
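For reference, the three tuned quantities correspond to standard costmap_2d parameters. A sketch of the relevant fragment, with illustrative values rather than the team's actual tuning:

```yaml
# costmap_2d parameters (values are examples, not the presented tuning)
footprint: [[-0.105, -0.105], [-0.105, 0.105], [0.041, 0.105], [0.041, -0.105]]
inflation_radius: 0.6      # how far cost is propagated out from obstacles
cost_scaling_factor: 5.0   # larger value -> cost decays faster with distance
```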
Challenges: localization, and time sync between the ROS master and slave.
Practice in Gazebo, then integrate and transfer to the TurtleBot3.
Learned: ROS, open source, Python.
Group 2
Blob tracking: 20 cm distance, by color and centroid.
HSV: use a single value (hue) to detect the color; the mean value is taken from the image dataset; a particle filter works.
Use color and shape.
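A sketch of the HSV idea above: convert RGB to HSV and threshold on the hue channel alone, around a mean hue estimated from sample images. The tolerance and saturation/value floors here are illustrative assumptions:

```python
import colorsys

def is_target_color(rgb, mean_hue, tol=0.05, s_min=0.4, v_min=0.2):
    """rgb: (R, G, B) in 0..255. True if the pixel's hue lies within tol of
    mean_hue (hues in [0, 1)) and is saturated/bright enough to be reliable."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    dh = min(abs(h - mean_hue), 1.0 - abs(h - mean_hue))  # hue wraps around
    return dh <= tol and s >= s_min and v >= v_min
```

Thresholding on hue alone is what makes the single-value approach more robust to lighting than raw RGB: brightness changes move v, not h.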
leg_detector package: coordinates of the human with respect to the odom frame.
http://wiki.ros.org/leg_detector
A PID controller was used to steer the TurtleBot3.
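A minimal PID sketch for that steering control. The error could be, e.g., the lateral offset of the tracked person from the image centre; the gains are illustrative, not the team's values:

```python
class PID:
    def __init__(self, kp, ki=0.0, kd=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        """Return the control output (e.g. angular velocity) for one tick."""
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```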
Challenge: multiple people.
nav2d: error with Gazebo.
amcl package:
Group 3