ImportError: /usr/local/app/.local/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

Problem description

Environment:

Linux
transformers 4.39.0
tokenizers 0.15.2
torch 2.1.2+cu121
flash-attn 2.3.3
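
Mismatches between these packages are easier to spot when the installed versions are printed programmatically. This is a small standard-library-only sketch; the package names are simply the ones from the environment list above:

```python
from importlib.metadata import PackageNotFoundError, version

def installed_versions(packages):
    """Map each distribution name to its installed version, or None if absent."""
    result = {}
    for pkg in packages:
        try:
            result[pkg] = version(pkg)
        except PackageNotFoundError:
            result[pkg] = None
    return result

if __name__ == "__main__":
    for pkg, ver in installed_versions(
        ["torch", "transformers", "tokenizers", "flash-attn"]
    ).items():
        print(f"{pkg}: {ver or 'not installed'}")
```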

When running xverse/XVERSE-13B-256K with vLLM (code below):

import torch
from transformers import AutoModelForSequenceClassification

qwen_model = AutoModelForSequenceClassification.from_pretrained(
    args.pre_train,
    trust_remote_code=True,
    attn_implementation="flash_attention_2",
    torch_dtype=torch.bfloat16,
    device_map="auto",   # or "balanced_low_0"
    num_labels=5
)

The following error is raised:

Traceback (most recent call last):
  File "/usr/local/app/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1364, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/data/miniconda3/envs/xxx/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/app/.local/lib/python3.10/site-packages/transformers/models/qwen2/modeling_qwen2.py", line 49, in <module>
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "/usr/local/app/.local/lib/python3.10/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/usr/local/app/.local/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /usr/local/app/.local/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/cfs/xxx/xxx/long-context/xxx/train.py", line 434, in <module>
    qwen_model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/app/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 565, in from_pretrained
    model_class = _get_model_class(config, cls._model_mapping)
  File "/usr/local/app/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 387, in _get_model_class
    supported_models = model_mapping[type(config)]
  File "/usr/local/app/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 740, in __getitem__
    return self._load_attr_from_module(model_type, model_name)
  File "/usr/local/app/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 754, in _load_attr_from_module
    return getattribute_from_module(self._modules[module_name], attr)
  File "/usr/local/app/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 698, in getattribute_from_module
    if hasattr(module, attr):
  File "/usr/local/app/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1354, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/usr/local/app/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1366, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.qwen2.modeling_qwen2 because of the following error (look up to see its traceback):
/usr/local/app/.local/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

Fix

The undefined symbol `_ZN3c104cuda9SetDeviceEi` demangles to `c10::cuda::SetDevice(int)`: the prebuilt flash_attn_2_cuda extension was compiled against a different PyTorch C++ ABI than the torch installed in this environment. Installing a flash-attn release built to match this torch version resolves the mismatch:

pip install flash-attn==2.5.9.post1
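
If the pinned version above still fails, one way to confirm the ABI mismatch from a shell (assuming binutils' `c++filt` is available) is to demangle the missing symbol, check which torch build is actually on the path, and rebuild flash-attn against it instead of pulling a prebuilt wheel:

```shell
# Demangle the missing symbol reported by the ImportError
echo '_ZN3c104cuda9SetDeviceEi' | c++filt
# -> c10::cuda::SetDevice(int)

# Confirm which torch build flash-attn must match
python -c "import torch; print(torch.__version__, torch.version.cuda)"

# Build flash-attn against the locally installed torch
pip install flash-attn==2.5.9.post1 --no-build-isolation
```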
