【OnnxRuntime】Compiling and Installing the C++ Version of ONNX Runtime on Linux

Installing ONNX Runtime with the C++ interface

Install dependencies:

Install CMake: CMake can be installed through a package manager (e.g. apt or yum), for example sudo apt-get install cmake on Ubuntu.

Install a C++ compiler: make sure a C++ compiler such as GCC or Clang is installed on the system.

Download the source

Clone the ONNX Runtime repository (a mirror is used here); version v1.5.2 is pinned for compatibility with Python 3.8:

git clone --branch v1.5.2 --recursive https://gitee.com/lee-zq/onnxruntime.git

Build ONNX Runtime

Enter the ONNX Runtime source directory and build:

# First create and activate a conda environment, because ./build.sh actually invokes ./tools/ci_build/build.py
cd onnxruntime
conda create -n onnxruntime python=3.8
conda activate onnxruntime
./build.sh --skip_tests --config Release --build_shared_lib --parallel

Note: to build with CUDA support, append the following to the ./build.sh command above (adjust the paths to match your CUDA installation):

--use_cuda --cuda_home /usr/local/cuda-11.3 --cudnn_home /usr/local/cuda-11.3

Note: before building, make sure the basic build toolchain (gcc, g++, make) is installed:

sudo apt-get install build-essential

The --parallel flag uses all available CPU cores for compilation; omit it to build serially if the machine is short on memory.

Install ONNX Runtime

From the ONNX Runtime source directory, run the following commands to install:

cd build/Linux/Release
sudo make install

By following the steps above, you should be able to install ONNX Runtime on Linux successfully. Note that you may need to adjust the process for your particular system and requirements. A successful sudo make install produces output similar to the following:

Install the project...
-- Install configuration: "Release"
-- Installing: /usr/local/include/onnxruntime/core/common
-- Installing: /usr/local/include/onnxruntime/core/common/code_location.h
-- Installing: /usr/local/include/onnxruntime/core/common/common.h
-- Installing: /usr/local/include/onnxruntime/core/common/const_pointer_container.h
-- Installing: /usr/local/include/onnxruntime/core/common/eigen_common_wrapper.h
-- Installing: /usr/local/include/onnxruntime/core/common/exceptions.h
-- Installing: /usr/local/include/onnxruntime/core/common/logging
-- Installing: /usr/local/include/onnxruntime/core/common/logging/capture.h
-- Installing: /usr/local/include/onnxruntime/core/common/logging/isink.h
-- Installing: /usr/local/include/onnxruntime/core/common/logging/logging.h
-- Installing: /usr/local/include/onnxruntime/core/common/logging/macros.h
-- Installing: /usr/local/include/onnxruntime/core/common/logging/severity.h
-- Installing: /usr/local/include/onnxruntime/core/common/make_unique.h
-- Installing: /usr/local/include/onnxruntime/core/common/optional.h
-- Installing: /usr/local/include/onnxruntime/core/common/status.h
-- Installing: /usr/local/include/onnxruntime/core/graph
-- Installing: /usr/local/include/onnxruntime/core/graph/basic_types.h
-- Installing: /usr/local/include/onnxruntime/core/graph/constants.h
-- Installing: /usr/local/include/onnxruntime/core/graph/function.h
-- Installing: /usr/local/include/onnxruntime/core/graph/graph.h
-- Installing: /usr/local/include/onnxruntime/core/graph/graph_nodes.h
-- Installing: /usr/local/include/onnxruntime/core/graph/graph_viewer.h
-- Installing: /usr/local/include/onnxruntime/core/graph/indexed_sub_graph.h
-- Installing: /usr/local/include/onnxruntime/core/graph/node_arg.h
-- Installing: /usr/local/include/onnxruntime/core/graph/onnx_protobuf.h
-- Installing: /usr/local/include/onnxruntime/core/graph/schema_registry.h
-- Installing: /usr/local/include/onnxruntime/core/framework
-- Installing: /usr/local/include/onnxruntime/core/framework/alloc_kind.h
-- Installing: /usr/local/include/onnxruntime/core/framework/allocator.h
-- Installing: /usr/local/include/onnxruntime/core/framework/customregistry.h
-- Installing: /usr/local/include/onnxruntime/core/framework/data_types.h
-- Installing: /usr/local/include/onnxruntime/core/framework/data_types_internal.h
-- Installing: /usr/local/include/onnxruntime/core/framework/endian.h
-- Installing: /usr/local/include/onnxruntime/core/framework/execution_provider.h
-- Installing: /usr/local/include/onnxruntime/core/framework/fence.h
-- Installing: /usr/local/include/onnxruntime/core/framework/framework_common.h
-- Installing: /usr/local/include/onnxruntime/core/framework/func_api.h
-- Installing: /usr/local/include/onnxruntime/core/framework/kernel_def_builder.h
-- Installing: /usr/local/include/onnxruntime/core/framework/kernel_registry.h
-- Installing: /usr/local/include/onnxruntime/core/framework/ml_value.h
-- Installing: /usr/local/include/onnxruntime/core/framework/op_kernel.h
-- Installing: /usr/local/include/onnxruntime/core/framework/op_kernel_info.h
-- Installing: /usr/local/include/onnxruntime/core/framework/op_node_proto_helper.h
-- Installing: /usr/local/include/onnxruntime/core/framework/ortdevice.h
-- Installing: /usr/local/include/onnxruntime/core/framework/ortmemoryinfo.h
-- Installing: /usr/local/include/onnxruntime/core/framework/run_options.h
-- Installing: /usr/local/include/onnxruntime/core/framework/sparse_tensor.h
-- Installing: /usr/local/include/onnxruntime/core/framework/tensor.h
-- Installing: /usr/local/include/onnxruntime/core/framework/tensor_shape.h
-- Installing: /usr/local/include/onnxruntime/core/providers/cpu
-- Installing: /usr/local/include/onnxruntime/core/providers/cpu/cpu_provider_factory.h
-- Installing: /usr/local/include/onnxruntime/core/optimizer
-- Installing: /usr/local/include/onnxruntime/core/optimizer/graph_transformer.h
-- Installing: /usr/local/include/onnxruntime/core/optimizer/graph_transformer_level.h
-- Installing: /usr/local/include/onnxruntime/core/optimizer/graph_transformer_utils.h
-- Installing: /usr/local/include/onnxruntime/core/optimizer/rewrite_rule.h
-- Installing: /usr/local/include/onnxruntime/core/optimizer/rule_based_graph_transformer.h
-- Installing: /usr/local/include/onnxruntime/core/session
-- Installing: /usr/local/include/onnxruntime/core/session/automl_data_containers.h
-- Installing: /usr/local/include/onnxruntime/core/session/environment.h
-- Installing: /usr/local/include/onnxruntime/core/session/experimental_onnxruntime_cxx_api.h
-- Installing: /usr/local/include/onnxruntime/core/session/experimental_onnxruntime_cxx_inline.h
-- Installing: /usr/local/include/onnxruntime/core/session/onnxruntime_c_api.h
-- Installing: /usr/local/include/onnxruntime/core/session/onnxruntime_cxx_api.h
-- Installing: /usr/local/include/onnxruntime/core/session/onnxruntime_cxx_inline.h
-- Installing: /usr/local/include/onnxruntime/core/session/onnxruntime_session_options_config_keys.h
-- Installing: /usr/local/lib/libonnxruntime.so.1.5.2
-- Installing: /usr/local/lib/libonnxruntime.so
-- Installing: /usr/local/bin/onnx_test_runner

Installation complete.
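To confirm that the installed headers and shared library are usable, a small C++ program can be compiled against them. The following is a minimal sketch; the file name check_ort.cpp and the -std=c++14 flag are only examples, and the include/library paths assume the default /usr/local prefix shown in the log above.

// check_ort.cpp -- minimal check that the installed ONNX Runtime headers
// and shared library work.
// Build (assuming the default /usr/local install prefix used above):
//   g++ -std=c++14 check_ort.cpp -I/usr/local/include/onnxruntime/core/session -L/usr/local/lib -lonnxruntime -o check_ort
#include <onnxruntime_cxx_api.h>
#include <iostream>

int main() {
    // Print the version string reported by the installed shared library.
    std::cout << "ONNX Runtime version: "
              << OrtGetApiBase()->GetVersionString() << std::endl;

    // Constructing an Ort::Env exercises the C++ API and the logging subsystem.
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "check");
    std::cout << "Ort::Env created successfully." << std::endl;
    return 0;
}

If the program builds but cannot find libonnxruntime.so at runtime, running sudo ldconfig or adding /usr/local/lib to LD_LIBRARY_PATH usually resolves it.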

The following are the steps to install onnxruntime-gpu with C++ on Windows:

1. Install Visual Studio 2019, making sure the C++ workload is selected during installation.
2. Install the CUDA Toolkit and cuDNN. Choose versions that match your GPU model and operating system; when installing the CUDA Toolkit, a custom installation of only the required components is recommended.
3. Download the onnxruntime-gpu source code. You can clone it from the onnxruntime GitHub repository or download the precompiled binaries.
4. Open the solution file onnxruntime\onnxruntime.sln in Visual Studio.
5. In Visual Studio's Solution Explorer, right-click the "onnxruntime" project and choose "Build" -> "Build Solution".
6. Open onnxruntime\cmake\windows\CMakeSettings.json and set the "CUDA_TOOLKIT_ROOT_DIR" and "CUDNN_HOME" variables to the paths where the CUDA Toolkit and cuDNN are installed.
7. In Solution Explorer, right-click the "onnxruntime" project and choose "Properties".
8. Under "Configuration Properties" -> "VC++ Directories", add the include directories of the CUDA Toolkit and cuDNN.
9. Under "Configuration Properties" -> "Linker" -> "General", add the library directories of the CUDA Toolkit and cuDNN.
10. Under "Configuration Properties" -> "Linker" -> "Input", add the following libraries:
- cublas.lib
- cudnn.lib
- cudart.lib
- nvinfer.lib
- nvinfer_plugin.lib
- onnxruntime.lib
11. Rebuild the solution in Visual Studio.
12. Test whether onnxruntime-gpu was installed successfully. You can use the C++ API provided by onnxruntime-gpu to load and run ONNX models, as in the sketch after this list.

I hope these steps help you install onnxruntime-gpu successfully.
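Step 12 mentions loading and running an ONNX model through the C++ API. The following is a minimal sketch of creating an inference session; "model.onnx" is a placeholder path, and the CUDA-provider call shown in the comment applies only to GPU builds and may differ between ONNX Runtime versions.

// run_model.cpp -- a minimal sketch of creating an inference session with the
// ONNX Runtime C++ API. "model.onnx" is a placeholder path.
#include <onnxruntime_cxx_api.h>
#include <iostream>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "demo");
    Ort::SessionOptions session_options;
    session_options.SetIntraOpNumThreads(1);

    // For a GPU build, the CUDA execution provider is usually appended here,
    // e.g. OrtSessionOptionsAppendExecutionProvider_CUDA(session_options, 0);
    // (the exact call and header depend on the ONNX Runtime version).
    // Without it, the session falls back to the default CPU provider.

    // ORT_TSTR expands to a wide-character literal on Windows (MSVC)
    // and to a plain narrow literal elsewhere.
    Ort::Session session(env, ORT_TSTR("model.onnx"), session_options);

    std::cout << "model loaded, inputs: " << session.GetInputCount()
              << ", outputs: " << session.GetOutputCount() << std::endl;
    return 0;
}

On Windows, link against onnxruntime.lib as listed in step 10; on Linux, link with -lonnxruntime as shown earlier.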