MediaPipe Hand-Tracking Usage Notes

  • Web demo: https://storage.googleapis.com/tfjs-models/demos/handpose/index.html
  • Source code: https://github.com/tensorflow/tfjs-models/tree/master/handpose

Related Documentation

Environment Setup

  • OS: macOS Catalina 10.15.5

bazel

$ brew install bazel
$ bazel --version
$ brew upgrade bazel

Reference: Installing Bazel on macOS

opencv

1. Install OpenCV

$ brew install opencv  # installs under /usr/local/Cellar/opencv

Reference: Installing OpenCV on macOS with Homebrew (and the pitfalls)

2. Fix fatal error: opencv2/core/version.hpp: No such file or directory

# Symlink the opencv4 header directory so opencv2/ resolves at the path the build expects
$ ln -s /usr/local/Cellar/opencv/4.3.0_1/include/opencv4/opencv2 /usr/local/Cellar/opencv/4.3.0_1/include/opencv2

Edit ./mediapipe/third_party/opencv_macos.BUILD so the rule points at OpenCV 4 instead of OpenCV 3:

# Description:
#   OpenCV libraries for video/image processing on MacOS

licenses(["notice"])  # BSD license

exports_files(["LICENSE"])

# The following build rule assumes that OpenCV 4 is installed by
# 'brew install opencv' on macOS (the stock rule targeted opencv@3).
# If you install OpenCV separately, please modify the build rule accordingly.
cc_library(
    name = "opencv",
    srcs = glob(
        [
            # "local/opt/opencv@3/lib/libopencv_core.dylib",
            # "local/opt/opencv@3/lib/libopencv_calib3d.dylib",
            # "local/opt/opencv@3/lib/libopencv_features2d.dylib",
            # "local/opt/opencv@3/lib/libopencv_highgui.dylib",
            # "local/opt/opencv@3/lib/libopencv_imgcodecs.dylib",
            # "local/opt/opencv@3/lib/libopencv_imgproc.dylib",
            # "local/opt/opencv@3/lib/libopencv_video.dylib",
            # "local/opt/opencv@3/lib/libopencv_videoio.dylib",

            "local/opt/opencv@4/lib/libopencv_core.dylib",
            "local/opt/opencv@4/lib/libopencv_calib3d.dylib",
            "local/opt/opencv@4/lib/libopencv_features2d.dylib",
            "local/opt/opencv@4/lib/libopencv_highgui.dylib",
            "local/opt/opencv@4/lib/libopencv_imgcodecs.dylib",
            "local/opt/opencv@4/lib/libopencv_imgproc.dylib",
            "local/opt/opencv@4/lib/libopencv_video.dylib",
            "local/opt/opencv@4/lib/libopencv_videoio.dylib",
        ],
    ),
    # hdrs = glob(["local/opt/opencv@3/include/opencv2/**/*.h*"]),
    # includes = ["local/opt/opencv@3/include/"],
    hdrs = glob(["local/opt/opencv@4/include/opencv2/**/*.h*"]),
    includes = ["local/opt/opencv@4/include/"],
    linkstatic = 1,
    visibility = ["//visibility:public"],
)

Reference: OpenCV 2 headers not found on Arch Linux with OpenCV 4 #496

Usage

Option 1: Running on CPU

hand_tracking_cpu
$ bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/hand_tracking:hand_tracking_cpu
# $ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/hand_tracking/hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/hand_tracking_desktop_live.pbtxt --input_video_path=<input_video_path>
$ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/hand_tracking/hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/hand_tracking_desktop_live.pbtxt --input_video_path=/Users/snorlaxse/Desktop/multi-hand-demo.mp4 --output_video_path=/Users/snorlaxse/Desktop/multi-hand-demo-output.mp4

The first build downloads a number of dependencies; subsequent builds do not.
After the build completes, four symlinked directories appear in the project root (symlink → target):

1. bazel-bin → /private/var/tmp/_bazel_snorlaxse/8b6206b871f2e541142be86f99764a24/execroot/mediapipe/bazel-out/darwin-opt/bin
2. bazel-mediapipe-master → /private/var/tmp/_bazel_snorlaxse/8b6206b871f2e541142be86f99764a24/execroot/mediapipe
3. bazel-out → /private/var/tmp/_bazel_snorlaxse/8b6206b871f2e541142be86f99764a24/execroot/mediapipe/bazel-out
4. bazel-testlogs → /private/var/tmp/_bazel_snorlaxse/8b6206b871f2e541142be86f99764a24/execroot/mediapipe/bazel-out/darwin-opt/testlogs

Reference: Macos Catalina Hand tracking · Issue #477 · google/mediapipe

multi_hand_tracking_cpu
$ bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/multi_hand_tracking:multi_hand_tracking_cpu
$ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/multi_hand_tracking_desktop_live.pbtxt --input_video_path=<input_video_path> --output_video_path=<output_video_path>
  

To try detection on a still image:

$ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/multi_hand_tracking_desktop_live.pbtxt --input_video_path=/Users/snorlaxse/Desktop/A-1.JPG --output_video_path=/Users/snorlaxse/Desktop/A-1-output.JPG

This does not work; the demo expects video input, not a single image…

Option 2: Running on GPU

Not attempted…

Further Reading

  • Accessing landmarks, tracking multiple hands, and enabling depth on desktop #200

    @JECBello For the C++ desktop example, what you described is the correct way to get the result protos.
    I have created an example for the Object Detection desktop CPU demo showing how to print out the detection protos:
    https://github.com/mgyong/mediapipe-issue200
    I copied demo_run_graph_main.cc into demo_run_graph_main_out.cc and then:

    1. Created a listener for the detection stream
    2. Polled the detection stream kDetectionsStream using Next() for the detection packet
    3. Loaded the detection packet into the variable output_detections
    4. In my case, the output of the packet is of type std::vector<::mediapipe::Detection>. In your case, it should be a landmark proto.

    Once you have an example of how to do this for hand tracking, we would appreciate it if you could share your implementation with the community.
    A. For multi-hands, we are working on an example that will hopefully be available in early-to-mid November.
    B. Can you create a separate issue for the question about desktop implementations of hand tracking capturing depth (similar to how the Android/iOS 3D builds output z coordinates)?

  • Extracting landmarks #703

    Call CalculatorGraph::ObserveOutputStream to register a packet callback, then use Packet::Get, specifically Get<NormalizedLandmarkList>, to read the data.
    If you want to interfere with the graph's execution, write a calculator node and add it to the graph's pbtxt.
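The two answers above describe the same pattern. Below is a pseudocode-style C++ sketch of it, modeled loosely on MediaPipe's demo_run_graph_main.cc; it is not buildable standalone, and the stream name "multi_hand_landmarks" plus the surrounding boilerplate are assumptions to verify against your graph's .pbtxt, not verbatim MediaPipe code:

```cpp
// Pseudocode-style sketch; requires the MediaPipe framework.
// "multi_hand_landmarks" must match the output stream declared in the .pbtxt.
mediapipe::CalculatorGraph graph;
MP_RETURN_IF_ERROR(graph.Initialize(config));

// 1. Attach a poller to the landmark output stream before starting the run.
ASSIGN_OR_RETURN(mediapipe::OutputStreamPoller poller,
                 graph.AddOutputStreamPoller("multi_hand_landmarks"));
MP_RETURN_IF_ERROR(graph.StartRun({}));

// 2. Poll the stream with Next(); each packet corresponds to one frame.
mediapipe::Packet packet;
while (poller.Next(&packet)) {
  // 3. Read the payload: one NormalizedLandmarkList per detected hand.
  const auto& hands =
      packet.Get<std::vector<mediapipe::NormalizedLandmarkList>>();
  for (const auto& hand : hands) {
    for (const auto& lm : hand.landmark()) {
      // x/y are normalized to [0, 1]; z is relative depth.
      LOG(INFO) << lm.x() << ", " << lm.y() << ", " << lm.z();
    }
  }
}
```

This uses a poller rather than ObserveOutputStream's callback; both deliver the same packets, and the poller keeps the control flow inside main().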

Multi-hand landmark output

$ bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/multi_hand_tracking:multi_hand_tracking_cpu
# $ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/multi_hand_tracking_desktop_live.pbtxt --input_video_path=/Users/snorlaxse/Desktop/multi-hand-demo.mp4 --output_video_path=/Users/snorlaxse/Desktop/multi-hand-demo-output.mp4
$ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu --input_video_path=/Users/snorlaxse/Desktop/multi-hand-demo.mp4 --output_video_path=/Users/snorlaxse/Desktop/multi-hand-demo-output.mp4


Final note: if this article helped you, please give it a like ٩(๑•̀ω•́๑)۶
