Unreal Engine NeuralNetworkInference (Translation)

Unreal Engine's NeuralNetworkInference (NNI) framework provides an efficient, easy-to-use, and complete solution for running deep learning and neural network inference. It supports accelerators such as DirectML, AVX, and CoreML, and achieves interoperability with the major deep learning frameworks through ONNX. The UNeuralNetwork class is the main interface: it loads ONNX models and runs inference on them. The example code shows how to create and load a network and how to run a forward pass.
/**
 * NeuralNetworkInference (NNI) is Unreal Engine's framework for running deep learning and neural network inference. It is focused on:
 * - Efficiency: Underlying state-of-the-art accelerators (DirectML, AVX, CoreML, etc).
 * - Ease-of-use: Simple but powerful API.
 * - Completeness: All the functionality of any state-of-the-art deep learning framework.
 *
 * UNeuralNetwork is the key class of NNI, and the main one users should interact with. It represents the deep neural model itself. It is capable of
 * loading and running inference (i.e., a forward pass) on any ONNX (Open Neural Network eXchange) model. ONNX is the industry standard for ML
 * interoperability, and all major frameworks (PyTorch, TensorFlow, MXNet, Caffe2, etc.) provide converters to ONNX.
 */
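The original header comment continues with a usage example that is cut off in this excerpt. As a stand-in, the following is a minimal sketch of how a UNeuralNetwork is typically created, loaded from an ONNX file, and run. Member names such as Load, IsGPUSupported, SetDeviceType, SetInputFromArrayCopy, Run, and GetOutputTensor follow the UE 5.0-era NNI header; the exact signatures should be verified against your engine version, and the model path is a placeholder.

// Minimal usage sketch (assumes the experimental NeuralNetworkInference plugin is enabled;
// the ONNX file path below is a placeholder).
#include "NeuralNetwork.h"

void RunNNIExample()
{
	// Create the UNeuralNetwork object.
	UNeuralNetwork* Network = NewObject<UNeuralNetwork>((UObject*)GetTransientPackage(), UNeuralNetwork::StaticClass());

	// Load an ONNX model from disk.
	const FString OnnxFilePath = TEXT("SomeFolder/SomeModel.onnx");
	if (!Network->Load(OnnxFilePath))
	{
		UE_LOG(LogTemp, Warning, TEXT("UNeuralNetwork could not be loaded from %s."), *OnnxFilePath);
		return;
	}

	// Prefer the GPU back end (e.g., DirectML) when supported, otherwise stay on CPU.
	if (Network->IsGPUSupported())
	{
		Network->SetDeviceType(ENeuralDeviceType::GPU);
	}
	else
	{
		Network->SetDeviceType(ENeuralDeviceType::CPU);
	}

	// Fill the input tensor; the array size must match the model's input dimensions.
	TArray<float> InputData;
	InputData.Init(0.f, static_cast<int32>(Network->GetInputTensor().Num()));
	Network->SetInputFromArrayCopy(InputData);

	// Run a forward pass and read the result.
	Network->Run();
	const FNeuralTensor& OutputTensor = Network->GetOutputTensor();
	UE_LOG(LogTemp, Display, TEXT("Output tensor has %d elements."), (int32)OutputTensor.Num());
}

In this sketch Run() blocks the calling thread and error handling is limited to logging a failed load; the full header comment also covers further details (such as asynchronous execution) that are omitted here.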