Using bert_base_uncased offline

While reproducing the RSTNet code [1], the BERT language-model training step needs to download pretrained weights from Hugging Face. Because of network restrictions I could never connect to Hugging Face, so I loaded the model offline instead. The steps are as follows:

1. Go to Hugging Face (https://huggingface.co, the page titled "Hugging Face – The AI community building the future"; a VPN may be required), type bert-base-uncased into the search box at the top left, and click the result.

2. On the model page, click Files and versions.

3. Download the files: grab everything in the red-circled area (the configuration and tokenizer files, e.g. config.json, vocab.txt, tokenizer_config.json); the remaining files are the weights for the different frameworks. Since my code uses PyTorch, I downloaded pytorch_model.bin (the file ticked in the figure).

4. Once everything is downloaded, put the files into a single folder and place that folder in the code root directory, as shown in the figure below (the bert_base_uncased folder). A quick sanity check of the folder contents is sketched after step 5.

5. Next, modify line 42 of the language_model.py file in models/rstnet:

self.language_model = BertModel.from_pretrained('bert-base-uncased', return_dict=True)

Change it to

self.language_model = BertModel.from_pretrained(BERT_PATH, return_dict=True)

where BERT_PATH is the path to the folder we created, for example:

BERT_PATH = './bert_base_uncased'

This is a relative path (on Linux it means the bert_base_uncased folder under the current working directory), so the training command must be launched from the code root:

python train_language.py --exp_name bert_language --batch_size 50 --features_path /path/to/features --annotation_folder /path/to/annotations

An alternative that does not depend on the working directory is sketched further below.
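Before launching training, it is worth confirming that the local folder actually contains the files from_pretrained will look for (the files downloaded in steps 3 and 4). This is a minimal sketch; the file list assumes the usual bert-base-uncased download for PyTorch, so adjust it to what you actually saved:

import os

BERT_PATH = './bert_base_uncased'  # the folder created in step 4, relative to the code root
# Files a PyTorch BERT checkpoint is normally loaded from.
expected = ['config.json', 'vocab.txt', 'tokenizer_config.json', 'pytorch_model.bin']
missing = [f for f in expected if not os.path.isfile(os.path.join(BERT_PATH, f))]
print('missing files:', missing if missing else 'none')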

After this change, training runs normally.
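As noted in step 5, './bert_base_uncased' is resolved against the working directory, which is why the command has to be run from the code root. If you would rather make the load independent of where the script is launched, the path can be resolved relative to language_model.py itself. This is an optional sketch, not part of the original repository; it assumes the file sits at models/rstnet/language_model.py, two levels below the code root:

import os
from transformers import BertModel  # already imported in language_model.py

# Assumption: this file is <code root>/models/rstnet/language_model.py,
# so the code root is two directories above it.
CODE_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..'))
BERT_PATH = os.path.join(CODE_ROOT, 'bert_base_uncased')

# Line 42 inside the model's __init__ then stays the same apart from the path:
# self.language_model = BertModel.from_pretrained(BERT_PATH, return_dict=True)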

References:

[1] RSTNet: zhangxuying1004/RSTNet on GitHub (https://github.com/zhangxuying1004/RSTNet), official code for "RSTNet: Captioning with Adaptive Attention on Visual and Non-Visual Words" (CVPR 2021).
