Notes on GMac and FLOPs

FLOPs (lowercase s): the number of floating-point operations, i.e. the amount of computation. It is used to measure the complexity of an algorithm or model. Papers most often report it in GFLOPs.
1 GFLOPs = 10^9 FLOPs (one billion floating-point operations)
1 MFLOPs = 10^6 FLOPs (one million floating-point operations)
By contrast, FLOPS (uppercase S) is a hardware throughput rate, in floating-point operations per second:
1 MFLOPS (megaFLOPS) = 10^6 floating-point operations per second
1 GFLOPS (gigaFLOPS) = 10^9 floating-point operations per second
1 TFLOPS (teraFLOPS) = 10^12 floating-point operations per second
1 PFLOPS (petaFLOPS) = 10^15 floating-point operations per second
1 EFLOPS (exaFLOPS) = 10^18 floating-point operations per second
1 ZFLOPS (zettaFLOPS) = 10^21 floating-point operations per second
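To see how the count (FLOPs) and the rate (FLOPS) relate, here is a back-of-the-envelope sketch; both numbers below are assumed for illustration, not measurements:

# Ideal (bandwidth- and overhead-free) forward-pass latency estimate.
model_gflops = 15.4          # assumed compute per forward pass, in GFLOPs
gpu_tflops = 10.0            # assumed GPU peak throughput, in TFLOPS

latency_s = (model_gflops * 1e9) / (gpu_tflops * 1e12)
print(f'{latency_s * 1e3:.2f} ms per forward pass (theoretical lower bound)')
# -> 1.54 ms; real latency is higher due to memory traffic and kernel launch overhead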
The measurements below use the ptflops counter: https://github.com/sovrasov/flops-counter.pytorch

import torch
from ptflops import get_model_complexity_info

# `model` must already be constructed (e.g. an HRNet instance for pose estimation)
with torch.cuda.device(0):
    macs, params = get_model_complexity_info(model, (3, 256, 192), as_strings=True,
                                             print_per_layer_stat=True, verbose=True)
    print('{:<30}  {:<8}'.format('Computational complexity: ', macs))
    print('{:<30}  {:<8}'.format('Number of parameters: ', params))
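If you just want a self-contained run to verify the tool works, here is a minimal sketch using a stock torchvision model (resnet50 is only a stand-in, not the HRNet measured below):

import torch
import torchvision.models as models
from ptflops import get_model_complexity_info

with torch.cuda.device(0):
    net = models.resnet50()
    macs, params = get_model_complexity_info(net, (3, 224, 224), as_strings=True,
                                             print_per_layer_stat=False, verbose=False)
    print('{:<30}  {:<8}'.format('Computational complexity: ', macs))
    print('{:<30}  {:<8}'.format('Number of parameters: ', params))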

On whether GMACs = GFLOPs, see the discussion at https://github.com/sovrasov/flops-counter.pytorch/issues/16. One answer from that thread:
is there a typo here? I did a little reading and it seems that @snownus has it right. In general a multiply-accumulate is one multiplication and one addition, which can each be floating point operations. So 1 GMAC counts as 2 GFLOPs, meaning GMACs = .5 * GFLOPs (I’m not sure if this is what was already meant).

As for fused multiply-add (FMA) it seems that (if it is supported on a given chip/system) the two FLOPs are indeed computed “in a single step” (see here) or “at once” (see here). But this confuses our conversion. Perhaps in the case of FMA it is more accurate to say 1 GMACs = 1 GFLOPs? Hopefully someone with more expertise than me can clarify!
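To make the 1 MAC = 2 FLOPs convention concrete, here is a hand count for a single Conv2d layer (a sketch using the common convention of one MAC per kernel weight per output element; bias and other ops ignored):

def conv2d_macs(c_in, c_out, k, h_out, w_out):
    # one multiply-accumulate per kernel weight per output element
    return c_in * c_out * k * k * h_out * w_out

macs = conv2d_macs(3, 64, 3, 128, 96)   # e.g. a 3x3 stem conv at 256x192 input, stride 2
print(f'{macs / 1e9:.3f} GMac')         # 0.021 GMac
print(f'{2 * macs / 1e9:.3f} GFLOPs')   # twice that under 1 MAC = 2 FLOPs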
Measured with the tool above on HRNet-W32 at a 256×192 input:
Computational complexity: 7.7 GMac
Number of parameters: 28.54 M
The HRNet paper reports 7.10 GFLOPs for HRNet-W32 at 256×192 (its tables give Params in M and MACs in G). That figure is close to the measured GMac value rather than to twice it, which suggests the paper counts one (fused) multiply-add as a single FLOP.
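A quick conversion of the measured number under both conventions (value taken from the output above):

measured_gmacs = 7.7
print(measured_gmacs * 2)   # 15.4 GFLOPs if 1 MAC = 2 FLOPs
print(measured_gmacs * 1)   # 7.7 GFLOPs if one fused multiply-add counts as one FLOP
# The paper's 7.10 GFLOPs matches the second convention much more closely.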
