mmdeploy installation
See my other setup guide: installing and configuring mmdeploy in a conda virtual environment
Official tutorial: official documentation
Reference: https://zhuanlan.zhihu.com/p/484842986
Model conversion usage
tools/deploy.py converts a .pth model directly into a backend inference model such as TensorRT (trt) or ONNX Runtime (ort):
python tools/deploy.py \
configs/mmdet3d/voxel-detection/voxel-detection_tensorrt_dynamic-kitti.py \
${MMDET3D_DIR}/configs/pointpillars/hv_pointpillars_secfpn_6x8_160e_kitti-3d-3class.py \
${MMDET3D_DIR}/checkpoints/hv_pointpillars_secfpn_6x8_160e_kitti-3d-3class_20200620_230421-aa0f3adb.pth \
${MMDET3D_DIR}/demo/data/kitti/kitti_000008.bin \
--work-dir work-dir \
--device cuda:0 \
--show
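The demo input kitti_000008.bin is a raw KITTI point cloud: a flat float32 buffer that reshapes to N x 4 columns (x, y, z, reflectance). A minimal sketch of that layout using NumPy (the file name sample_kitti.bin is made up for illustration):

```python
import numpy as np

# Write a small synthetic point cloud in the KITTI .bin layout:
# a flat float32 buffer, 4 values (x, y, z, reflectance) per point.
points = np.random.rand(100, 4).astype(np.float32)
points.tofile("sample_kitti.bin")

# Read it back the same way mmdet3d's KITTI loading pipeline does:
loaded = np.fromfile("sample_kitti.bin", dtype=np.float32).reshape(-1, 4)
print(loaded.shape)  # (100, 4)
```

Any point cloud saved in this layout can be passed to deploy.py or the demo scripts in place of the bundled sample.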
When the terminal prints mmdeploy - INFO - ALL process success. the conversion has succeeded, and work-dir will contain end2end.onnx, end2end.engine, or another backend inference model.
Torch model inference
Use the API provided by mmdet3d to run inference with the original torch model:
from mmdet3d.apis import init_model, inference_detector, show_result_meshlab
config_file = "./configs/pointpillars/hv_pointpillars_secfpn_6x8_160e_kitti-3d-3class.py"
checkpoint_file = './work_dirs/hv_pointpillars_secfpn_6x8_160e_kitti-3d-3class/latest.pth'
model = init_model(config_file, checkpoint_file, device='cuda:0')
pcd = './demo/data/kitti/kitti_000008.bin'
result, data = inference_detector(model, pcd)
out_dir = './work_dirs'
show_result_meshlab(data, result, out_dir, show=True)
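In mmdet3d, result is a list with one entry per sample, each holding boxes_3d (rows of x, y, z, dx, dy, dz, yaw), scores_3d, and labels_3d. A minimal sketch of score-threshold filtering before visualization, on synthetic NumPy stand-ins (the 0.5 threshold is an arbitrary choice for illustration):

```python
import numpy as np

# Synthetic stand-ins for one sample's detections:
# each boxes_3d row is (x, y, z, dx, dy, dz, yaw); scores_3d in [0, 1].
boxes_3d = np.array([[0.0, 1.0, -1.0, 3.9, 1.6, 1.5, 0.1],
                     [5.0, 2.0, -1.0, 0.8, 0.6, 1.7, 0.0],
                     [9.0, -3.0, -1.0, 1.7, 0.6, 1.7, 1.5]])
scores_3d = np.array([0.91, 0.35, 0.62])
labels_3d = np.array([0, 1, 2])

# Keep only confident detections before drawing them.
keep = scores_3d > 0.5
print(boxes_3d[keep].shape, labels_3d[keep])  # (2, 7) [0 2]
```

show_result_meshlab accepts a score_thr argument that applies the same kind of cutoff internally.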
ONNX model inference
Evaluate the converted ONNX model with tools/test.py:
python tools/test.py \
configs/mmdet3d/voxel-detection/voxel-detection_onnxruntime_dynamic.py \
${MMDET3D_DIR}/configs/pointpillars/hv_pointpillars_secfpn_6x8_160e_kitti-3d-3class.py \
--model ../work-dir/end2end.onnx \
--metrics bbox \
--device cuda:0
TensorRT model inference
Evaluation works the same way, but with the TensorRT deploy config and the serialized engine file:
python tools/test.py \
configs/mmdet3d/voxel-detection/voxel-detection_tensorrt_dynamic-kitti.py \
${MMDET3D_DIR}/configs/pointpillars/hv_pointpillars_secfpn_6x8_160e_kitti-3d-3class.py \
--model ../work-dir/end2end.engine \
--metrics bbox \
--device cuda:0