Stable Diffusion / Hugging Face Configuration Issues: A Summary


While setting up Stable Diffusion I ran into a number of bugs; here is a summary of them and their fixes:

1 OSError: Can't load tokenizer for 'openai/clip-vit-large-patch14'.

Error

The full error message:

OSError: Can't load tokenizer for 'openai/clip-vit-large-patch14'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'openai/clip-vit-large-patch14' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.

This happens because huggingface.co cannot be reached from your network.

Solutions

Method 1: Manual download

Create a local folder openai/clip-vit-large-patch14 and download every file from the corresponding page of the official repo into it.[1][2][3]
Official page: https://huggingface.co/openai/clip-vit-large-patch14/tree/main
Mirrors in mainland China:

  1. https://www.modelscope.cn/models/AI-ModelScope/clip-vit-large-patch14/files
  2. https://hf-mirror.com/openai/clip-vit-large-patch14/tree/main

Or via Baidu Netdisk:
https://pan.baidu.com/s/1pmOuyaRnLcc8ee-8_jtb1g?pwd=ukyi (extraction code: ukyi)
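
Once the files are in place, transformers resolves the relative directory on disk before trying the network, so the original model ID keeps working. A minimal sketch, assuming the folder sits directly under your working directory and the download is complete:

    # Minimal sketch: load the tokenizer from the local folder created above.
    # "openai/clip-vit-large-patch14" now resolves as a relative path on disk,
    # so transformers never contacts huggingface.co.
    from transformers import CLIPTokenizer

    tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
    print(tokenizer("a photo of a cat"))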

Method 2: Automatic download

Alternatively, create a local folder openai and, inside it, run git clone https://www.modelscope.cn/AI-ModelScope/clip-vit-large-patch14.git to fetch the files automatically.[4] Note that loading may then fail with safetensors_rust.SafetensorError:

  File "/home/xxx/.conda/envs/xxx/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3503, in from_pretrained
    with safe_open(resolved_archive_file, framework="pt") as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge

The cause is that git clone did not fully download some of the larger files (the model weights); re-download those manually and overwrite the incomplete copies.
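
To find out which files are broken before re-downloading everything, you can try opening each weight file; truncated files or Git-LFS pointer stubs fail with the same error as the traceback above. A minimal sketch using the safetensors package (already a transformers dependency):

    # Minimal sketch: detect .safetensors files that git clone left incomplete.
    from pathlib import Path
    from safetensors import safe_open

    repo = Path("openai/clip-vit-large-patch14")  # path to the cloned repo
    for f in repo.glob("*.safetensors"):
        try:
            with safe_open(str(f), framework="pt"):
                print(f"{f.name}: OK")
        except Exception as err:  # SafetensorError, e.g. HeaderTooLarge
            print(f"{f.name}: incomplete, re-download and overwrite ({err})")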

Other methods (to be investigated)

  1. In the transformers source file transformers/utils/hub.py, change the value of '_default_endpoint' to the mirror https://hf-mirror.com.[5] (Did not work when I tried it.)
  2. Install the dependency with pip install -U huggingface_hub, then set the HF_ENDPOINT environment variable.[6] (Not verified; a Python equivalent follows this list.)
    export HF_ENDPOINT=https://hf-mirror.com  # Linux
    set HF_ENDPOINT=https://hf-mirror.com  # Windows
    huggingface-cli download --resume-download InstantX/InstantID --local-dir checkpoints
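
The same endpoint switch also works from Python. A sketch (equally unverified here) that mirrors the CLI command above using huggingface_hub's snapshot_download:

    # Minimal sketch: download a whole repo through the mirror.
    # HF_ENDPOINT must be set before huggingface_hub is imported.
    import os
    os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

    from huggingface_hub import snapshot_download

    snapshot_download(
        repo_id="openai/clip-vit-large-patch14",
        local_dir="openai/clip-vit-large-patch14",  # same layout as Method 1
    )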
    

2 huggingface_hub.utils._errors.LocalEntryNotFoundError:

Error

The following error appears:

 File "/home/xxx/.conda/envs/xxx/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1371, in hf_hub_download
    raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.

Again, a network problem.

Solutions

Per the post 【秒解决!!huggingface_hub.utils._errors.LocalEntryNotFoundError】, point the huggingface.co endpoint at its mirror (a combined sketch follows the list):

  • Method 1: set the environment variable at the top of your script, before any huggingface_hub or transformers import (the endpoint is read at import time):
    import os
    os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

  • Method 2: set the environment variable on the command line: HF_ENDPOINT=https://hf-mirror.com python xxx.py
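
Putting it together: the variable must be assigned before the import, otherwise huggingface_hub has already fixed its endpoint. A minimal sketch (tokenizer.json is a hypothetical example file; substitute whatever your script actually fetches):

    # Minimal sketch: HF_ENDPOINT is read when huggingface_hub is imported.
    import os
    os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"  # set BEFORE the import

    from huggingface_hub import hf_hub_download

    # Hypothetical example file; replace with the repo/file your code needs.
    path = hf_hub_download(repo_id="openai/clip-vit-large-patch14",
                           filename="tokenizer.json")
    print(path)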

References:

  1. OSError: Can't load tokenizer for 'openai/clip-vit-large-patch14'.
  2. Stable-diffusion安装时Can't load tokenizer for 'openai/clip-vit-large-patch14'问题解决
  3. 【debug】OSError: Can't load tokenizer for 'XXX'. If you were trying to load it from 'https://huggingf
  4. StableDiffusion搭建[报错] OSError openai/clip-vit-large-patch14
  5. 解决diffusion部署时,无法从'huggingface.co'下载'openai/clip-vit-large-patch14'导致的报错
  6. Huggingface 镜像站使用方法!
