[linux] Qwen2Tokenizer error: a transformers version problem

Everything worked fine in the morning; after pulling the latest code in the afternoon, this error appeared.

It turned out to be a transformers version problem. But... I always install the latest version by default...

Maybe that's just life.

The error:

File "/Pai-Megatron-Patch/megatron_patch/tokenizer/__init__.py", line 213, in build_tokenizer
    tokenizer = _Qwen2Tokenizer(args.load, args.extra_vocab_size)
  File "/Pai-Megatron-Patch/megatron_patch/tokenizer/__init__.py", line 174, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/tokenization_auto.py", line 784, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class Qwen2Tokenizer does not exist or is not currently imported.

Solution:

Originally:

pip install -U transformers -i https://pypi.doubanio.com/simple/

Changed to:

pip install transformers==4.40 -i https://mirrors.aliyun.com/pypi/simple/

Successfully installed huggingface-hub-0.23.3 safetensors-0.4.3 tokenizers-0.19.1 transformers-4.41.2
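To double-check whether the installed transformers actually exposes the tokenizer class before launching a long training job, a quick sketch like this can help (the helper name `has_tokenizer_class` is mine, not from the original post; it degrades gracefully if transformers is not installed at all):

```python
import importlib

def has_tokenizer_class(class_name: str, module_name: str = "transformers") -> bool:
    """Return True if `module_name` is importable and exposes `class_name`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # Package not installed at all.
        return False
    return hasattr(module, class_name)

if __name__ == "__main__":
    if has_tokenizer_class("Qwen2Tokenizer"):
        print("Qwen2Tokenizer is available")
    else:
        print("Qwen2Tokenizer missing -- check your transformers version")
```

Running this right after `pip install` makes the version mismatch obvious immediately, instead of surfacing it mid-launch as the ValueError above.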
