The first author of this algorithm is Shanchuan Lin, who completed the work during his internship at ByteDance. He also published, as first author, the paper "Real-Time High-Resolution Background Matting", which proposed the Background Matting V2 method. You can experience the model's effect in real time here:
Demo: https://peterl1n.github.io/RobustVideoMatting/#/demo
GitHub: https://github.com/PeterL1n/RobustVideoMatting
Paper: https://arxiv.org/abs/2108.11515
Speed
Speed is measured with `inference_speed_test.py` for reference.
| GPU | dType | HD (1920x1080) | 4K (3840x2160) |
|---|---|---|---|
| RTX 3090 | FP16 | 172 FPS | 154 FPS |
| RTX 2060 Super | FP16 | 134 FPS | 108 FPS |
| GTX 1080 Ti | FP32 | 104 FPS | 74 FPS |
- Note 1: HD uses `downsample_ratio=0.25`, 4K uses `downsample_ratio=0.125`. All tests use batch size 1 and frame chunk 1.
- Note 2: GPUs before the Turing architecture do not support FP16 inference, so the GTX 1080 Ti uses FP32.
- Note 3: We only measure tensor throughput. The provided video conversion script in this repo is expected to be much slower, because it does not utilize hardware video encoding/decoding and does not have the tensor transfer done on parallel threads. If you are interested in implementing hardware video encoding/decoding in Python, please refer to PyNvCodec.
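The notes above pick `downsample_ratio` by resolution: the higher the input resolution, the smaller the ratio, so that the downsampled frame fed to the recurrent network stays around a few hundred pixels on its longer side. As a hedged illustration (the helper name and the 512-pixel target are my assumptions, modeled on the auto-ratio heuristic in the repo's inference script), this choice can be sketched as:

```python
def auto_downsample_ratio(h: int, w: int, target: int = 512) -> float:
    """Pick a downsample_ratio so the longer side of the downsampled
    frame is about `target` pixels (capped at 1.0, i.e. never upsample)."""
    return min(target / max(h, w), 1.0)

# Frames at the resolutions from the table above:
print(round(auto_downsample_ratio(1080, 1920), 3))  # 0.267, close to the 0.25 used for HD
print(round(auto_downsample_ratio(2160, 3840), 3))  # 0.133, close to the 0.125 used for 4K
```

The clamp at 1.0 matters for low-resolution inputs: a 512x512 clip should be processed at full resolution rather than enlarged.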
Third-Party Projects
- NCNN C++ Android (@FeiGeChuanShu)
- lite.ai.toolkit (@DefTruth)
- Gradio Web Demo (@AK391)
- Unity Engine demo with NatML (@natsuite)
- MNN C++ Demo (@DefTruth)
- TNN C++ Demo (@DefTruth)
On Windows, you can run RobustVideoMatting examples with the Lite.AI.ToolKit C++ toolbox (GitHub - DefTruth/lite.ai.toolkit: 🛠 A lite C++ toolkit of awesome AI models with ONNXRuntime, NCNN, MNN and TNN), which provides four backend versions: ONNXRuntime, MNN, NCNN, and TNN.
2. Blue/Green-Screen Background Matting
Referring to the paper "Software Chroma Keying in an Immersive Virtual Environment", the alpha-generation step uses the following approach:
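The original post does not reproduce the method's details at this point. As a hedged sketch only (the function name, the Euclidean color distance, and the two thresholds are my own illustrative assumptions, not taken from the cited paper), a common chroma-key alpha formulation measures each pixel's distance from the key color and maps it linearly to [0, 1] between a lower and an upper threshold:

```python
import numpy as np

def chroma_key_alpha(img_rgb: np.ndarray, key_rgb=(0, 255, 0),
                     t_min: float = 40.0, t_max: float = 120.0) -> np.ndarray:
    """Illustrative chroma-key alpha: Euclidean distance of each pixel
    from the key color, ramped linearly between t_min and t_max.
    Pixels close to the key color become transparent (alpha near 0)."""
    img = img_rgb.astype(np.float32)
    key = np.array(key_rgb, np.float32)
    dist = np.linalg.norm(img - key, axis=-1)   # per-pixel distance to key color
    alpha = (dist - t_min) / (t_max - t_min)    # linear ramp between thresholds
    return np.clip(alpha, 0.0, 1.0)

# A pure-green pixel becomes fully transparent; a red pixel stays opaque.
frame = np.array([[[0, 255, 0], [255, 0, 0]]], np.uint8)
print(chroma_key_alpha(frame)[0])  # → [0. 1.]
```

In practice the distance is often computed in a chroma-only space (e.g. the Cr/Cb channels of YCrCb) so that shadows on the green screen, which differ mainly in luminance, still key out cleanly.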
Blue/green-screen matting demo:
Link: https://pan.baidu.com/s/1Yw0Mqm4O8WfaCcYaPplwNg
Extraction code: ji7a
The above is purely my own research; I hope it can be of help. If you have questions, feel free to email me at 187100248@qq.com.
References: