Testing YOLOv5 TensorRT INT8 inference speed

Reference: https://github.com/wang-xinyu/tensorrtx/tree/master/yolov5
Compiling the C++ TensorRT demo failed with errors, so I tested with the Python version instead.

// Install python-tensorrt, pycuda, etc.
// Ensure the yolov5s.engine and libmyplugins.so have been built
python yolov5_det_trt.py

// Another version of python script, which is using CUDA Python instead of pycuda.
python yolov5_det_trt_cuda_python.py
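
To compare FP16 and INT8 speeds independently of the demo scripts, a minimal timing harness can wrap whatever inference callable the engine exposes. This is a sketch: the `infer` argument is a placeholder for the actual TensorRT execution call inside yolov5_det_trt.py, not part of the tensorrtx API.

```python
import time

def benchmark(infer, warmup=10, iters=100):
    """Time an inference callable; return (avg latency in ms, FPS).

    `infer` is any zero-argument callable that runs one forward pass;
    substitute the real TensorRT inference call from yolov5_det_trt.py.
    """
    # Warm-up runs: the first few calls include CUDA context and
    # kernel-autotuning overhead and would skew the measurement.
    for _ in range(warmup):
        infer()
    start = time.perf_counter()
    for _ in range(iters):
        infer()
    elapsed = time.perf_counter() - start
    latency_ms = elapsed / iters * 1000
    return latency_ms, iters / elapsed
```

Running this once against the FP16 engine and once against the INT8 engine gives a like-for-like latency comparison on the same input.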

This step requires a yolov5s.engine. The official export script can only produce an FP16 yolov5s.engine and cannot do INT8 quantization, so I looked for a separate quantization project.
Reference: https://www.likecs.com/show-308293829.html
Quantization then failed with:

# python convert_trt_quant.py
*** onnx to tensorrt begin ***
found all 230 images to calib.
Reading engine from file models_save/yolov5s_int8.trt
[05/05/2023-10:10:02] [TRT] [I] [MemUsageChange] Init CUDA: CPU +325, GPU +0, now: CPU 477, GPU 6922 (MiB)
[05/05/2023-10:10:02] [TRT] [I] Loaded engine size: 10 MiB
[05/05/2023-10:10:02] [TRT] [E] 1: [stdArchiveReader.cpp::StdArchiveReader::35] Error Code 1: Serialization (Serialization assertion safeVersionRead == safeSerializationVersion failed.Version tag does not match. Note: Current Version: 0, Serialized Engine Version: 96)
[05/05/2023-10:10:02] [TRT] [E] 4: [runtime.cpp::deserializeCudaEngine::50] Error Code 4: Internal Error (Engine deserialization failed.)
Traceback (most recent call last):
  File "convert_trt_quant.py", line 104, in <module>
    main()
  File "convert_trt_quant.py", line 100, in main
    assert engine_fixed, 'Broken engine_fixed'
AssertionError: Broken engine_fixed

The cause turned out to be a TensorRT version mismatch: my environment has TensorRT 8.x while the project requires 7.x, so I pulled the TRT 7.2 image instead:
docker pull nvcr.io/nvidia/pytorch:20.11-py3
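
The "Version tag does not match" error above is exactly this check failing inside TensorRT: a serialized engine is generally only deserializable by a TensorRT build matching the one that produced it. A small guard like the hypothetical helper below (comparing `tensorrt.__version__` strings from the build and runtime environments) can surface the mismatch before deserialization is even attempted:

```python
def same_trt_version(build_version: str, runtime_version: str) -> bool:
    """Return True if two TensorRT version strings match on major.minor.

    Serialized engines are generally not portable across TensorRT
    versions; comparing major.minor catches the common case of, e.g.,
    an engine built under 7.2 being loaded by an 8.x runtime.
    """
    def major_minor(version: str):
        parts = version.split(".")
        return parts[0], parts[1] if len(parts) > 1 else "0"
    return major_minor(build_version) == major_minor(runtime_version)
```

In practice the safest fix is the one taken here: rebuild or re-run inside a container whose TensorRT version matches the one the engine was serialized with.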
