Debug log
Running the following code raises: LocalEntryNotFoundError: Connection error, and we cannot find the requested files in the disk cache. Please try again or make sure your Internet connection is on.
net = timm.create_model(name, pretrained=True)
The root cause is that huggingface.co is blocked, and the school server cannot go through a VPN either. So the workaround is to download the model on my own machine over a VPN, then upload it to the server and load it from there.
Main references:
https://blog.csdn.net/joe199996/article/details/134208754
https://blog.csdn.net/m0_63977857/article/details/132921375
I need the pretrained weights for deit_small_patch16_224. To find out where timm gets them, run:
import timm
print(timm.models.create_model('deit_small_patch16_224').default_cfg)
which prints:
{'url': 'https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth',
'hf_hub_id': 'timm/deit_small_patch16_224.fb_in1k',
'architecture': 'deit_small_patch16_224',
'tag': 'fb_in1k',
'custom_load': False,
'input_size': (3, 224, 224),
'fixed_input_size': True,
'interpolation': 'bicubic',
'crop_pct': 0.9,
'crop_mode': 'center',
'mean': (0.485, 0.456, 0.406),
'std': (0.229, 0.224, 0.225),
'num_classes': 1000,
'pool_size': None,
'first_conv': 'patch_embed.proj',
'classifier': 'head'}
The url field is the download link used by older timm versions; hf_hub_id is the Hugging Face repo id added in newer versions, and it takes priority. Prepending the domain gives the full address: https://huggingface.co/timm/deit_small_patch16_224.fb_in1k
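How that full address is assembled can be sketched with plain string handling (the cfg dict below is the one printed above, trimmed to the two relevant keys):

```python
# Build the Hugging Face repo URL from the hf_hub_id in timm's default_cfg.
cfg = {
    'url': 'https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth',
    'hf_hub_id': 'timm/deit_small_patch16_224.fb_in1k',
}

# Newer timm versions prefer hf_hub_id over url when both are present.
hub_url = 'https://huggingface.co/' + cfg['hf_hub_id']
print(hub_url)  # https://huggingface.co/timm/deit_small_patch16_224.fb_in1k
```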
Open that page to reach the files:
Most tutorials online only download the .bin file; here I downloaded everything. Save the files into a folder on your machine, type cmd in the Windows search bar to open a terminal, and run:
scp -r path1 user23215425@172.25.4.35:/home/user23215425/.cache/huggingface/hub/models--timm--deit_small_patch16_224.fb_in1k
(Note: path1 is the local folder holding the downloaded files; replace user23215425 with your own username on the server, and replace the IP address with your server's IP.
/home/user23215425/.cache/huggingface/hub/models--timm--deit_small_patch16_224.fb_in1k is the destination path on the server; the models--timm--deit_small_patch16_224.fb_in1k folder was already created automatically by the earlier failed run.)
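That destination folder name is not arbitrary: the Hugging Face hub cache names each repo folder "models--" plus the repo id with "/" replaced by "--". A quick sketch of how the name is derived (string handling only, no network access):

```python
import posixpath

hf_hub_id = 'timm/deit_small_patch16_224.fb_in1k'

# Hugging Face's hub cache layout: "models--{org}--{name}" under .../hub/.
folder = 'models--' + hf_hub_id.replace('/', '--')
cache_dir = posixpath.join('/home/user23215425/.cache/huggingface/hub', folder)
print(cache_dir)
# /home/user23215425/.cache/huggingface/hub/models--timm--deit_small_patch16_224.fb_in1k
```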
Then change the failing line
net = timm.create_model(name, pretrained=True)
to
net = timm.create_model(name, pretrained=True,
                        pretrained_cfg_overlay=dict(file='/home/user23215425/.cache/huggingface/hub/models--timm--deit_small_patch16_224.fb_in1k/pytorch_model.bin'))
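A complementary safeguard (not from the referenced tutorials, so treat it as an optional extra) is huggingface_hub's HF_HUB_OFFLINE environment variable, which forces the hub client to read only the disk cache and never attempt a connection. It must be set before timm (which imports huggingface_hub) is imported:

```python
import os

# Force huggingface_hub into offline mode: only the local disk cache is used,
# and no connection attempt is made. Set this BEFORE importing timm.
os.environ['HF_HUB_OFFLINE'] = '1'

# import timm  # import after the env var is set
# net = timm.create_model(name, pretrained=True, pretrained_cfg_overlay=dict(
#     file='/home/user23215425/.cache/huggingface/hub/'
#          'models--timm--deit_small_patch16_224.fb_in1k/pytorch_model.bin'))
```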
The tutorials online generally end here, and judging from the comment sections this solves the problem for everyone else. For some reason my code still raised the same error, which baffled me, until I opened the source of timm.create_model
and found:
def create_model(
        model_name: str,
        pretrained: bool = False,
        pretrained_cfg: Optional[Union[str, Dict[str, Any], PretrainedCfg]] = None,
        pretrained_cfg_overlay: Optional[Dict[str, Any]] = None,
        checkpoint_path: str = '',
        scriptable: Optional[bool] = None,
        exportable: Optional[bool] = None,
        no_jit: Optional[bool] = None,
        **kwargs,
):
    kwargs = {k: v for k, v in kwargs.items() if v is not None}

    model_source, model_name = parse_model_name(model_name)
    if model_source == 'hf-hub':
        assert not pretrained_cfg, 'pretrained_cfg should not be set when sourcing model from Hugging Face Hub.'
        pretrained_cfg, model_name, model_args = load_model_config_from_hf(model_name)
        if model_args:
            for k, v in model_args.items():
                kwargs.setdefault(k, v)
    else:
        model_name, pretrained_tag = split_model_name_tag(model_name)
        if pretrained_tag and not pretrained_cfg:
            pretrained_cfg = pretrained_tag

    if not is_model(model_name):
        raise RuntimeError('Unknown model (%s)' % model_name)

    create_fn = model_entrypoint(model_name)
    with set_layer_config(scriptable=scriptable, exportable=exportable, no_jit=no_jit):
        model = create_fn(
            pretrained=pretrained,
            # pretrained_cfg=pretrained_cfg,
            # pretrained_cfg_overlay=pretrained_cfg_overlay,
            **kwargs,
        )

    if checkpoint_path:
        load_checkpoint(model, checkpoint_path)

    return model
The pretrained_cfg_overlay=pretrained_cfg_overlay argument had been
commented out ❗❗❗
model = create_fn(
    pretrained=pretrained,
    # pretrained_cfg=pretrained_cfg,
    # pretrained_cfg_overlay=pretrained_cfg_overlay,
    **kwargs,
)
Uncommenting those lines made everything run normally.
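An alternative that avoids editing the library source at all, suggested by the checkpoint_path branch visible at the end of create_model above: pass pretrained=False so no download is attempted, and load the local weights via checkpoint_path. This is a sketch I have not verified against this exact checkpoint, so test it on your own setup (the guards keep it from running where timm or the file is absent):

```python
import importlib.util
import os

# Local path to the uploaded weights (same path as in the scp step above).
ckpt = ('/home/user23215425/.cache/huggingface/hub/'
        'models--timm--deit_small_patch16_224.fb_in1k/pytorch_model.bin')

if importlib.util.find_spec('timm') is not None and os.path.exists(ckpt):
    import timm
    # pretrained=False skips the Hugging Face lookup entirely;
    # create_model then calls load_checkpoint(model, ckpt) itself.
    net = timm.create_model('deit_small_patch16_224',
                            pretrained=False, checkpoint_path=ckpt)
else:
    print('timm not installed or checkpoint missing; sketch not executed')
```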