Checking torch/transformers versions; different transformers versions produce different bugs

1. Checking the versions

# -*- encoding:utf-8 -*-
# Print the installed torch and transformers versions
import torch
import transformers

print(torch.__version__)         # 1.7.1
print(transformers.__version__)  # 2.1.1

2. A widely used transformers version

BERT/GPT-2 calls work fine with this version, paired with Python 3.6+.

Install a pinned version of transformers:

pip install transformers==3.4.0
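After installing, a quick sanity check (a minimal sketch; BertModel and GPT2LMHeadModel are the standard top-level exports in transformers 3.x) confirms the version and that the BERT/GPT-2 entry points import cleanly:

# -*- encoding:utf-8 -*-
# Sanity check for a pinned transformers install (minimal sketch)
import transformers
from transformers import BertModel, GPT2LMHeadModel  # top-level exports in 3.x

print(transformers.__version__)  # expected: 3.4.0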

3. Different transformers versions produce different bugs

Key point:

Version 2.1.1 is too old, so some open-source pretrained models cannot be loaded. Installing from GitHub with pip install git+https://github.com/huggingface/transformers (shown below) then introduces a new problem: ModuleNotFoundError: No module named 'transformers.modeling_gpt2'. Checking a previously configured environment showed its transformers version was 3.4.0, and that version can load the pretrained models listed at https://huggingface.co/models?search=chinese.

I. Version = 2.1.1: errors because the version is too old

When the version is too low, models from the Hugging Face model hub (https://huggingface.co/models) cannot be loaded:

Traceback (most recent call last):
  File "train.py", line 461, in <module>
    main()
  File "train.py", line 419, in main
    model, n_ctx = create_model(args, vocab_size)
  File "train.py", line 116, in create_model
    model = GPT2LMHeadModel.from_pretrained(args.pretrained_model)
  File "/Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages/transformers/modeling_utils.py", line 337, in from_pretrained
    **kwargs
  File "/Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages/transformers/configuration_utils.py", line 146, in from_pretrained
    raise EnvironmentError(msg)
OSError: Model name 'ckiplab/gpt2-base-chinese' was not found in model name list (gpt2, gpt2-medium, gpt2-large, gpt2-xl, distilgpt2). We assumed 'ckiplab/gpt2-base-chinese' was a path or url to a configuration file named config.json or a directory containing such a file but couldn't find any such file at this path or url.
(torch) localhost:generate_comment wang$ 
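The failing call boils down to passing a hub model ID that 2.1.1 cannot resolve; a minimal repro (using the model name from the traceback above):

from transformers import GPT2LMHeadModel

# On transformers 2.1.1 this raises the OSError shown above: the library only resolves
# its built-in shortcut names (gpt2, gpt2-medium, ...) and cannot fetch
# 'ckiplab/gpt2-base-chinese' from the Hugging Face hub. On 3.4.0 the same call works.
model = GPT2LMHeadModel.from_pretrained("ckiplab/gpt2-base-chinese")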

Installing from the git source then causes a new problem when calling the GPT-2 interface: ModuleNotFoundError: No module named 'transformers.modeling_gpt2' (see the import sketch after the install log below).


(torch) localhost:generate_comment wang$ pip install git+https://github.com/huggingface/transformers
Collecting git+https://github.com/huggingface/transformers
  Cloning https://github.com/huggingface/transformers to /private/var/folders/6w/y9sqyr2x49n1tjdhmn9q_71c0000gn/T/pip-req-build-yjo40ahe
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
    Preparing wheel metadata ... done
Requirement already satisfied: filelock in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from transformers==4.2.0.dev0) (3.0.12)
Requirement already satisfied: numpy in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from transformers==4.2.0.dev0) (1.19.4)
Requirement already satisfied: packaging in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from transformers==4.2.0.dev0) (20.8)
Requirement already satisfied: regex!=2019.12.17 in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from transformers==4.2.0.dev0) (2020.11.13)
Requirement already satisfied: tokenizers==0.9.4 in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from transformers==4.2.0.dev0) (0.9.4)
Requirement already satisfied: dataclasses in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from transformers==4.2.0.dev0) (0.8)
Requirement already satisfied: requests in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from transformers==4.2.0.dev0) (2.25.1)
Requirement already satisfied: sacremoses in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from transformers==4.2.0.dev0) (0.0.43)
Requirement already satisfied: tqdm>=4.27 in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from transformers==4.2.0.dev0) (4.54.1)
Requirement already satisfied: pyparsing>=2.0.2 in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from packaging->transformers==4.2.0.dev0) (2.4.7)
Requirement already satisfied: chardet<5,>=3.0.2 in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from requests->transformers==4.2.0.dev0) (4.0.0)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from requests->transformers==4.2.0.dev0) (1.26.2)
Requirement already satisfied: certifi>=2017.4.17 in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from requests->transformers==4.2.0.dev0) (2020.12.5)
Requirement already satisfied: idna<3,>=2.5 in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from requests->transformers==4.2.0.dev0) (2.10)
Requirement already satisfied: click in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from sacremoses->transformers==4.2.0.dev0) (7.1.2)
Requirement already satisfied: joblib in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from sacremoses->transformers==4.2.0.dev0) (1.0.0)
Requirement already satisfied: six in /Users/wang/anaconda3/envs/torch/lib/python3.6/site-packages (from sacremoses->transformers==4.2.0.dev0) (1.15.0)
Building wheels for collected packages: transformers
  Building wheel for transformers (PEP 517) ... done
  Created wheel for transformers: filename=transformers-4.2.0.dev0-py3-none-any.whl size=1522337 sha256=aef572d1f465e6b4faf7ff591dcb51d6433144afc4eb01e199b825217b0dfd03
  Stored in directory: /private/var/folders/6w/y9sqyr2x49n1tjdhmn9q_71c0000gn/T/pip-ephem-wheel-cache-ke6q7ies/wheels/5a/0a/d0/eb8d0ea1d7d02156f8675d6e5dfa52c03601cbe377290db8dc
Successfully built transformers
Installing collected packages: transformers
  Attempting uninstall: transformers
    Found existing installation: transformers 2.2.0
    Uninstalling transformers-2.2.0:
      Successfully uninstalled transformers-2.2.0
Successfully installed transformers-4.2.0.dev0
(torch) localhost:generate_comment wang$
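The git install pulls in a 4.2.0.dev0 build, and the 4.x series no longer ships the transformers.modeling_gpt2 submodule, which is why the old import path breaks. A version-tolerant sketch (the top-level export is available in 2.x, 3.x and 4.x alike):

# Old-style import: the submodule only exists before transformers 4.0
# from transformers.modeling_gpt2 import GPT2LMHeadModel

# Preferred import: the top-level export works across 2.x, 3.x and 4.x
from transformers import GPT2Config, GPT2LMHeadModel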

Solution:

Checking a previously configured environment, its transformers version was 3.4.0; that version can load the pretrained models at https://huggingface.co/models?search=chinese, so pin the version with pip install transformers==3.4.0.
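With transformers pinned to 3.4.0, the hub checkpoint loads as expected (a minimal sketch of the call made in train.py above):

# pip install transformers==3.4.0
from transformers import GPT2LMHeadModel

# On 3.4.0 the hub ID resolves and the weights are downloaded automatically.
model = GPT2LMHeadModel.from_pretrained("ckiplab/gpt2-base-chinese")
n_ctx = model.config.n_ctx  # context length reported by the loaded GPT-2 config
print(n_ctx)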

4. Installing different torch builds on different machines

For example, to install a GPU-enabled torch build on a development machine (dev machines are mostly Linux), the options are as follows:

1. PyTorch official site: https://pytorch.org/
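The selector on that page generates the install command for your platform; for reference, the pip command it produced for torch 1.7.1 with CUDA 11.0 on Linux looked like the line below (the exact package versions and CUDA tag depend on your machine, so treat this as an example rather than the current command):

pip install torch==1.7.1+cu110 torchvision==0.8.2+cu110 torchaudio==0.7.2 -f https://download.pytorch.org/whl/torch_stable.html

After installing, torch.cuda.is_available() should report True on a machine with a working GPU and driver:

import torch
print(torch.__version__)          # e.g. 1.7.1+cu110
print(torch.cuda.is_available())  # True if the GPU build and driver match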
