Introduction
The workflow of this project is:
- Detect 2D human joints from the color image with OpenPose.
- Compute 3D human joints from the 2D joints and the depth image.
- Detect target objects with YOLO.
- If the right arm is fully stretched, the person is performing a "pointing" action. (The arm counts as stretched if the angle between the upper arm and the forearm is smaller than a threshold.)
- The pointing direction (a ray) is defined as the vector from the shoulder to the wrist.
- The 3D point that lies in front of the wrist and closest to the pointing ray is where the person is pointing.
- Project this 3D point back onto the 2D image. If the resulting pixel falls inside one of the objects' bounding boxes, then we know the person is pointing at that object! Done!
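The geometric steps above (arm-stretch test, pointing ray, closest point along the ray, and back-projection) can be sketched roughly as follows. This is a minimal NumPy sketch, not the repository's actual code; the function names, the 30-degree threshold, the 0.1 m ray-distance cutoff, and the pinhole intrinsics are all illustrative assumptions:

```python
import numpy as np

# Hypothetical thresholds; tune for your camera and setup.
ANGLE_THRESH_DEG = 30.0   # max upper-arm/forearm angle for a "stretched" arm
MAX_RAY_DIST = 0.1        # max distance (m) from the pointing ray

def is_arm_stretched(shoulder, elbow, wrist, thresh_deg=ANGLE_THRESH_DEG):
    """Arm counts as stretched when the angle between the upper-arm and
    forearm direction vectors is below the threshold."""
    upper = np.asarray(elbow, float) - np.asarray(shoulder, float)
    fore = np.asarray(wrist, float) - np.asarray(elbow, float)
    cos_a = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return angle < thresh_deg

def pointed_3d_point(shoulder, wrist, cloud_points, max_dist=MAX_RAY_DIST):
    """Among 3D points in front of the wrist, return the one closest to the
    pointing ray (shoulder -> wrist), or None if no point qualifies."""
    shoulder = np.asarray(shoulder, float)
    wrist = np.asarray(wrist, float)
    ray = wrist - shoulder
    ray /= np.linalg.norm(ray)
    pts = np.asarray(cloud_points, float)
    rel = pts - wrist
    t = rel @ ray                               # signed distance along the ray
    perp = np.linalg.norm(rel - np.outer(t, ray), axis=1)  # distance to ray
    mask = (t > 0) & (perp < max_dist)          # in front of wrist, near ray
    if not mask.any():
        return None
    idx = np.flatnonzero(mask)[np.argmin(perp[mask])]
    return pts[idx]

def project_to_pixel(p3d, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame 3D point back to pixel coords."""
    x, y, z = p3d
    return (fx * x / z + cx, fy * y / z + cy)
```

With a straight arm along the camera's z-axis, `is_arm_stretched` returns True, `pointed_3d_point` picks the cloud point lying on the ray, and `project_to_pixel` gives the pixel to test against the YOLO bounding boxes.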
https://github.com/felixchenfy/ros_3d_pointing_detection
Dependency: the 3D version of OpenPose.