Building MindSpore on Ascend fails with "has no member named 'update_output_desc_dpse'; did you mean 'update_output_desc_dq'?"

System Environment

Hardware environment (Ascend/GPU/CPU): Ascend

MindSpore version: 2.1.1

Execution mode (PyNative/Graph): Any

Error Message

2.1 Problem Description

Building MindSpore on Ascend fails with compile errors caused by a stale operator prototype header; the error output is as follows:

In file included from /home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.h:21:0,
                 from /home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc:17:
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc: In lambda function:
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/op_declare_macro.h:191:18: error: 'using element_type = class ge::op::FlashAttentionScore {aka class ge::op::FlashAttentionScore}' has no member named 'set_attr_input_layout'; did you mean 'set_attr_scale_value'?
   (void)p->set_attr_##name(ConvertAny(value, __VA_ARGS__));
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc:33:20: note: in expansion of macro 'ATTR_DESC'
   {"input_layout", ATTR_DESC(input_layout, AnyTraits<std::string>())},
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc: In lambda function:
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/op_declare_macro.h:196:32: error: 'using element_type = class ge::op::FlashAttentionScore {aka class ge::op::FlashAttentionScore}' has no member named 'get_attr_input_layout'; did you mean 'get_attr_scale_value'?
   auto real_value = p->get_attr_##name();
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc:33:20: note: in expansion of macro 'ATTR_DESC'
   {"input_layout", ATTR_DESC(input_layout, AnyTraits<std::string>())},
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc: In lambda function:
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/op_declare_macro.h:191:18: error: 'using element_type = class ge::op::FlashAttentionScoreGrad {aka class ge::op::FlashAttentionScoreGrad}' has no member named 'set_attr_input_layout'; did you mean 'set_attr_scale_value'?
   (void)p->set_attr_##name(ConvertAny(value, __VA_ARGS__));
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc:52:20: note: in expansion of macro 'ATTR_DESC'
   {"input_layout", ATTR_DESC(input_layout, AnyTraits<std::string>())},
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc: In lambda function:
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/op_declare_macro.h:196:32: error: 'using element_type = class ge::op::FlashAttentionScoreGrad {aka class ge::op::FlashAttentionScoreGrad}' has no member named 'get_attr_input_layout'; did you mean 'get_attr_scale_value'?
   auto real_value = p->get_attr_##name();
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc:52:20: note: in expansion of macro 'ATTR_DESC'
   {"input_layout", ATTR_DESC(input_layout, AnyTraits<std::string>())},
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/experiment_ops_declare.cc: In lambda function:
/home/dev/mindspore/mindspore/ccsrc/transform/graph_ir/op_declare/op_declare_macro.h:223:18: error: 'using element_type = class ge::op::FlashAttentionScoreGrad {aka class ge::op::FlashAttentionScoreGrad}' has no member named 'update_output_desc_dpse'; did you mean 'update_output_desc_dq'?
   (void)p->update_output_desc_##name(desc);

Root Cause Analysis

The MindSpore and CANN package versions do not match: the experiment_ops.h operator prototype header shipped with the installed CANN package is older than what this MindSpore source expects, so the FlashAttentionScore/FlashAttentionScoreGrad classes lack the newer attributes and output descriptors that the op_declare macros reference.
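To illustrate why a stale experiment_ops.h surfaces as these compile errors, here is a minimal, self-contained C++ sketch (not MindSpore source: the FlashAttentionScore stub and the SET_ATTR macro are simplified stand-ins for the operator prototype in experiment_ops.h and for ATTR_DESC in op_declare_macro.h). The macro pastes the attribute name into a setter call, so the call only compiles if the operator class declared by the CANN header actually has that setter:

#include <memory>
#include <string>

namespace ge {
namespace op {
// Hypothetical stand-in for the operator prototype declared in experiment_ops.h.
// An up-to-date CANN header declares set_attr_input_layout; a stale one only has
// older attributes such as set_attr_scale_value.
class FlashAttentionScore {
 public:
  FlashAttentionScore &set_attr_scale_value(float v) { scale_value_ = v; return *this; }
  FlashAttentionScore &set_attr_input_layout(const std::string &v) { input_layout_ = v; return *this; }
 private:
  float scale_value_{1.0f};
  std::string input_layout_;
};
}  // namespace op
}  // namespace ge

// Simplified version of the token pasting done by ATTR_DESC in op_declare_macro.h.
#define SET_ATTR(p, name, value) (void)(p)->set_attr_##name(value)

int main() {
  auto p = std::make_shared<ge::op::FlashAttentionScore>();
  // Expands to p->set_attr_input_layout("BSH"). With a stale header that lacks
  // this member, g++ stops here with:
  //   error: ... has no member named 'set_attr_input_layout';
  //          did you mean 'set_attr_scale_value'?
  SET_ATTR(p, input_layout, "BSH");
  return 0;
}

With an old experiment_ops.h that predates the input_layout attribute, the pasted call set_attr_input_layout simply does not exist on the class, which is exactly the "has no member named" diagnostic in the log above.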

Solution

Copy latest/opp/built-in/op_proto/inc/experiment_ops.h from the matching CANN package over graphengine/910b/third_party/fwkacllib/inc/ops/experiment_ops.h in the mindspore source tree. There is no need to clear the build cache; just re-run the build command.
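For reference, the copy step can also be scripted. The sketch below assumes the CANN package is installed under /usr/local/Ascend (an assumed install prefix; adjust it to your environment) and that the MindSpore source tree is at /home/dev/mindspore, the path that appears in the error log:

#include <filesystem>
#include <iostream>

int main() {
  namespace fs = std::filesystem;
  // Assumed CANN install prefix; replace with your actual installation path.
  const fs::path src = "/usr/local/Ascend/latest/opp/built-in/op_proto/inc/experiment_ops.h";
  // MindSpore source tree path taken from the error log above.
  const fs::path dst = "/home/dev/mindspore/graphengine/910b/third_party/fwkacllib/inc/ops/experiment_ops.h";
  std::error_code ec;
  // overwrite_existing replaces the stale header in place.
  fs::copy_file(src, dst, fs::copy_options::overwrite_existing, ec);
  if (ec) {
    std::cerr << "copy failed: " << ec.message() << "\n";
    return 1;
  }
  std::cout << "overwrote " << dst << "\n";
  return 0;
}

After the header has been overwritten, re-running the same build command as before is enough to pick up the new declarations.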
