# Fixing the error: 'torch.dtype' object has no attribute 'element_size'

When loading a large model, you may sometimes hit the error: 'torch.dtype' object has no attribute 'element_size'.
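The code involved sits in the parameter-counting path of transformers' modeling_utils.py and is only reached for models loaded in 4-bit with bitsandbytes, so one way the error surfaces is counting parameters right after loading such a model. Below is a minimal reproduction sketch; the model name and quantization settings are only examples, and whether it actually fails depends on the installed torch/transformers versions.

    # Hedged reproduction sketch: requires bitsandbytes and a CUDA GPU; the model name is
    # illustrative, and the failure depends on the torch / transformers versions installed.
    from transformers import AutoModelForCausalLM, BitsAndBytesConfig

    bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_quant_type="nf4")
    model = AutoModelForCausalLM.from_pretrained(
        "facebook/opt-125m",            # any causal LM works; this one is small
        quantization_config=bnb_config,
        device_map="auto",
    )

    # On affected versions this line raises:
    # AttributeError: 'torch.dtype' object has no attribute 'element_size'
    print(model.num_parameters())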

This can be fixed by editing the file named in the traceback, /miniconda3/lib/python3.8/site-packages/transformers/modeling_utils.py. Commenting out 5 lines of code and adding 9 new ones is enough to get the model loading again; the modified section looks like this:

        for param in total_parameters:
            if param.requires_grad or not only_trainable:
                # For 4bit models, we need to multiply the number of parameters by 2 as half of the parameters are
                # used for the 4bit quantization (uint8 tensors are stored)
                if is_loaded_in_4bit and isinstance(param, bnb.nn.Params4bit):
                    # quant_storage = self.hf_quantizer.quantization_config.bnb_4bit_quant_storage
                    # # For compatibility with older PT version - see: https://github.com/huggingface/peft/pull/1635
                    # nb_params = (
                    #     quant_storage.itemsize if hasattr(quant_storage, "itemsize") else quant_storage.element_size()
                    # )
                    # total_numel.append(param.numel() * 2 * nb_params)
                    if hasattr(param, "element_size"):
                        num_bytes = param.element_size()
                    elif not hasattr(param, "quant_storage"):
                        num_bytes = 1
                    else:
                        num_bytes = param.quant_storage.itemsize
                    total_numel.append(
                        param.numel() * 2 * num_bytes
                    )
                else:
                    total_numel.append(param.numel())
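The replacement no longer touches self.hf_quantizer at all: bnb.nn.Params4bit is a tensor subclass, so param.element_size() normally gives the per-element byte size directly, with the quant_storage dtype's itemsize and a plain 1 byte as fallbacks. After saving the change, the call that failed before should simply return a count; a quick check, continuing the sketch above:

    # Sanity check after patching modeling_utils.py (model from the sketch above).
    total = model.num_parameters()
    trainable = model.num_parameters(only_trainable=True)
    print(f"total params: {total:,}  trainable params: {trainable:,}")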

References:

winglian, "fix for itemsize => element_size() for torch backwards compat", Pull Request #30133, huggingface/transformers, GitHub.
