RuntimeError: Exporting the operator var to ONNX opset version 11 is not supported. Please open a bug to request ONNX export support for the missing operator.

When converting a .pth model to ONNX for deployment on a Jetson TX2, the export fails with:
RuntimeError: Exporting the operator var to ONNX opset version 11 is not supported. Please open a bug to request ONNX export support for the missing operator.
The cause is that torch.var has no ONNX symbolic function registered for opset 11.
Solution:
Add the following code to /anaconda3/envs/torch1.7/lib/python3.7/site-packages/torch/onnx/symbolic_opset11.py (adjust the path to your own environment) and the export completes successfully. A short export sketch for verifying the fix follows the patch code below.


```python
# ONNX symbolic functions for aten::var, aten::var_mean, aten::std and
# aten::std_mean, appended to torch/onnx/symbolic_opset11.py
import torch
from torch.onnx import symbolic_helper

@symbolic_helper.parse_args("v", "is", "i", "i")
def _var_mean(g, input, dim, correction, keepdim):
    if dim is None:
        # Reduce over all dimensions
        mean = g.op("ReduceMean", input, keepdims_i=0)
        t_mean = mean
        # Total element count: product of all entries of the shape
        # (computed inline so the snippet does not depend on a numel() helper
        # being defined in this file)
        shape = g.op("Shape", input)
        num_elements = g.op("ReduceProd", shape, keepdims_i=0)
    else:
        mean = g.op("ReduceMean", input, axes_i=dim, keepdims_i=keepdim)
        t_mean = g.op("ReduceMean", input, axes_i=dim, keepdims_i=1)
        reduced_dims = g.op("Shape", input)
        # dim could contain one or multiple dimensions
        reduced_dims = g.op(
            "Gather",
            reduced_dims,
            g.op("Constant", value_t=torch.tensor(dim)),
            axis_i=0,
        )
        num_elements = g.op("ReduceProd", reduced_dims, keepdims_i=0)
    sub_v = g.op("Sub", input, t_mean)
    sqr_sub = g.op("Mul", sub_v, sub_v)
    keepdim_mean = 0 if dim is None else keepdim
    var = g.op("ReduceMean", sqr_sub, axes_i=dim, keepdims_i=keepdim_mean)
    # Correct the bias of the variance estimate by dividing by (N - correction)
    # instead of N (correction=1 reproduces unbiased=True, i.e. Bessel's correction)
    if correction is None:
        correction = 1
    if correction != 0:
        num_elements = g.op(
            "Cast", num_elements, to_i=symbolic_helper.cast_pytorch_to_onnx["Float"]
        )
        one = g.op("Constant", value_t=torch.tensor(correction, dtype=torch.float))
        mul = g.op("Mul", var, num_elements)
        var = g.op("Div", mul, g.op("Sub", num_elements, one))
    return var, mean


def std(g, input, *args):
    var, _ = var_mean(g, input, *args)
    return g.op("Sqrt", var)


def var(g, input, *args):
    var, _ = var_mean(g, input, *args)
    return var


# var_mean (and all variance-related functions) has multiple signatures, so we need to
# figure out the correct arguments manually:
# aten::var_mean(Tensor self, bool unbiased)
# aten::var_mean(Tensor self, int[1] dim, bool unbiased, bool keepdim=False)
# aten::var_mean(Tensor self, int[1]? dim=None, *, int? correction=None, bool keepdim=False)
def var_mean(g, input, *args):
    if len(args) == 1:
        # aten::var_mean(Tensor self, bool unbiased): reduce over all dims
        return _var_mean(g, input, None, args[0], None)
    else:
        return _var_mean(g, input, *args)


def std_mean(g, input, *args):
    var, mean = var_mean(g, input, *args)
    return g.op("Sqrt", var), mean
```
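
To confirm the patch takes effect, a minimal export sketch like the one below can be run. The VarNet module, the tensor shapes and the var_net.onnx file name are illustrative assumptions, not part of the original post; the only things that matter are the torch.var call in the model and opset_version=11.

```python
import torch
import torch.nn as nn


class VarNet(nn.Module):
    """Toy model whose forward calls torch.var, which triggers the
    "Exporting the operator var ... is not supported" error before the patch."""

    def forward(self, x):
        # Unbiased variance over the channel dimension, keepdim for broadcasting
        v = torch.var(x, dim=1, unbiased=True, keepdim=True)
        return x / (v + 1e-5)


model = VarNet().eval()
dummy = torch.randn(1, 3, 224, 224)  # illustrative input shape

# Before patching symbolic_opset11.py this raises the RuntimeError;
# afterwards it should write var_net.onnx without complaint.
torch.onnx.export(
    model,
    dummy,
    "var_net.onnx",
    opset_version=11,
    input_names=["input"],
    output_names=["output"],
)
print("export finished")
```

If the error persists, make sure the patched symbolic_opset11.py is the copy inside the active environment; printing torch.onnx.__file__ shows which site-packages is actually being used.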
