I recently ran into a problem while working on a knowledge-graph-completion model: when importing AutoTokenizer from the transformers library with
from transformers import AutoTokenizer
from transformers import AutoModel, AutoConfig
the server kept raising this error:
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like bert-base-uncased is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
I tried the fixes suggested online, such as downloading the required model files myself and changing the path or model name to point at the local copy, or switching package versions, but none of them solved my problem:
- Switching package versions: "Unable to load weights from pytorch checkpoint file for 'bert-base-uncased' at ..." (Ray Mond's CSDN blog)
- Downloading the files manually: "OSError: We couldn't connect to 'https://huggingface.co' to load this file" (应龙与巨蜥's CSDN blog)
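For reference, the "download the files manually" approach means placing the model's files in a local directory and passing that directory to from_pretrained() instead of the model name. A minimal sketch, assuming a hypothetical local directory ./bert-base-uncased (the file names are the ones from_pretrained() looks for; config.json is the one the error message complains about):

```python
import os

# Hypothetical local copy of the model, downloaded from the hub by hand.
local_dir = "./bert-base-uncased"

# Files a BERT checkpoint directory needs; newer downloads may ship
# model.safetensors instead of pytorch_model.bin.
required = ["config.json", "vocab.txt", "pytorch_model.bin"]
missing = [f for f in required if not os.path.exists(os.path.join(local_dir, f))]
print("missing files:", missing)

# Once the directory is complete, point the loaders at the path instead of
# the model name:
#
#   from transformers import AutoTokenizer, AutoModel
#   tokenizer = AutoTokenizer.from_pretrained(local_dir)
#   model = AutoModel.from_pretrained(local_dir)
```

As noted above, this route did not fix my case, but it is worth checking that no file is missing before ruling it out.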
Solution:
After some digging, I found that adding the following lines at the top of main.py, before transformers is imported, makes it run normally:

# enables offline mode; must run before `import transformers`
import os
os.environ["TRANSFORMERS_OFFLINE"] = "1"

(Note that a bare `TRANSFORMERS_OFFLINE=1` line inside a Python file only creates an ordinary Python variable; the flag has to reach the process environment, either via os.environ as above or by setting it in the shell before launching the script.)
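Offline mode only works if the model files already landed in the local Hugging Face cache during an earlier online run. A small sketch to check this before flipping the flag on; the cache path below is the library default and is overridden by HF_HOME / TRANSFORMERS_CACHE if those are set:

```python
import os

# Default huggingface_hub cache location (adjust if HF_HOME is set).
cache_dir = os.path.expanduser("~/.cache/huggingface/hub")

# Cached model repos live in directories named "models--<repo-name>".
target = "models--bert-base-uncased"
cached = os.path.isdir(os.path.join(cache_dir, target))
print("bert-base-uncased cached:", cached)
```

If this prints False, run the script once with network access (or copy the cache directory over from a machine that has it) before enabling TRANSFORMERS_OFFLINE=1.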