Problem description:
A model-inference .so built with MNN for the Android platform crashes with a null-pointer error when its inference function is called from Android Studio (full error output at the end). The faulting line is the check on an element of the pointer array holding the MNN inference output (code below). A similar check with earlier models never crashed, so the problem was suspected to lie in the model itself.
int* output_tensor_array = output_tensor->host<int>();
output_tensor_array[col + 1024 * row] == 2  // crash site reported by addr2line
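For reference, the crash-site expression addresses the output buffer row-major with a fixed width of 1024 (the width is taken from the snippet above; the function name below is just for illustration). A minimal Python sketch of the same flat addressing:

```python
def flat_index(row: int, col: int, width: int = 1024) -> int:
    """Row-major flat index, matching the C++ expression col + width * row."""
    return col + width * row

# A 2 x 1024 "tensor" flattened into one buffer, as host<int>() exposes it.
buffer = [0] * (2 * 1024)
buffer[flat_index(1, 3)] = 2

# Element (row=1, col=3) lands at offset 3 + 1024 * 1 in the flat buffer.
assert buffer[1024 + 3] == 2
```

The crash itself, though, comes from `host<int>()` returning an invalid pointer for the problematic model, not from the arithmetic: the index computation is valid as long as the buffer really holds `width * rows` elements.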
Solution:
After extensive experimentation, it turned out that when converting the PyTorch model to ONNX, models exported with opset_version=10 worked correctly through all subsequent .so calls, while those exported with opset_version=11 all failed. However, the model implementation in use contains a new PyTorch operator that can only be converted to ONNX at opset_version=11. The fix was to replace that new operator with an equivalent built from simpler operators and retrain the model.
torch.onnx.export(net,                       # model being run
                  x,                         # model input (or a tuple for multiple inputs)
                  output_path,               # where to save the model (can be a file or file-like object)
                  export_params=True,        # store the trained parameter weights inside the model file
                  opset_version=10,          # the ONNX version to export the model to
                  do_constant_folding=True,  # whether to execute constant folding for optimization
                  input_names=['input'],     # the model's input names
                  output_names=['output'])   # the model's output names
Full error output:
2021-07-11 16:07:46.463 29653-29653/? A/DEBUG: *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
2021-07-11 16:07:46.463 29653-29653/? A/DEBUG: Build fingerprint: 'HUAWEI/MRX-W09/HWMRX:10/HUAWEIMRX-W09/11.0.0.180C00:user/release-keys'
2021-07-11 16:07:46.463 29653-29653/? A/DEBUG: Revision: '0&#