Fixing the "AutoConfig is not defined" error when calling a model through the pipeline API in Transformers 4.3.3

Scenario:

This bug appeared while I was using the transformers pipeline API for sentiment classification and loading my own local pretrained model.


Problem description

The error output:

D:\python36\lib\site-packages\OpenSSL\crypto.py:8: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography and will be removed in a future release.
  from cryptography import utils, x509
Traceback (most recent call last):
  File "D:/python36/pythonProject/study_pytorch/Bert_Study/管线调用Bert.py", line 4, in <module>
    nlp = pipeline("sentiment-analysis",model='D:\model\chinese_bert_models',config='D:\model\chinese_bert_models',tokenizer=tokenizers)
  File "D:\python36\lib\site-packages\transformers\pipelines\__init__.py", line 377, in pipeline
    config = AutoConfig.from_pretrained(config, revision=revision)
NameError: name 'AutoConfig' is not defined

The relevant section of transformers/pipelines/__init__.py:

    # Instantiate config if needed
    if isinstance(config, str):
        config = AutoConfig.from_pretrained(config, revision=revision)

    # Instantiate modelcard if needed
    if isinstance(modelcard, str):
        modelcard = ModelCard.from_pretrained(modelcard, revision=revision)

    # Instantiate model if needed
    if isinstance(model, str):
        # Handle transparent TF/PT model conversion
        model_kwargs = {}

Cause analysis:

The error means that AutoConfig is not in scope inside pipelines/__init__.py. My guess is that transformers 4.3.3 is recent enough that this code path is still incomplete; the exact root cause is unclear to me.


Solution:

Rather than blindly adding an import for AutoConfig, I first looked at ModelCard, a class used in the same way just below it, and found that it is imported at the top of the file. By analogy, I imported AutoConfig from transformers in the same place and marked the change with a comment.

# limitations under the License.
from transformers import AutoConfig  # added manually: import AutoConfig
import warnings

With that, the problem was solved and the code runs cleanly. As for why transformers did not import this class itself, I have no idea yet; anyone interested is welcome to dig into it with me.
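Before editing files under site-packages, it may also be worth confirming whether the installed transformers actually exposes AutoConfig at the top level, since the one-line patch above depends on that import succeeding. A quick check using only the standard library (it does not require transformers to be installed):

```python
import importlib.util

# Look up the transformers package without importing it first.
spec = importlib.util.find_spec("transformers")

if spec is None:
    print("transformers is not installed in this environment")
else:
    import transformers
    # The one-line patch relies on this attribute existing.
    print("AutoConfig available:", hasattr(transformers, "AutoConfig"))
```

If this prints False, the patch above would fail with an ImportError, and upgrading transformers is probably the better route.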

Output after the fix:

D:\python36\lib\site-packages\OpenSSL\crypto.py:8: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography and will be removed in a future release.
  from cryptography import utils, x509
Some weights of the model checkpoint at D:\model\chinese_bert_models were not used when initializing BertForSequenceClassification: ['cls.predictions.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.weight', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight', 'cls.seq_relationship.bias']
- This IS expected if you are initializing BertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of BertForSequenceClassification were not initialized from the model checkpoint at D:\model\chinese_bert_models and are newly initialized: ['classifier.weight', 'classifier.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
[{'label': 'LABEL_1', 'score': 0.5313035249710083}]
