ChatGLM3-6B Error Handling

Table of Contents

AttributeError: 'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'

AttributeError: 'ChatGLMConfig' object has no attribute 'max_sequence_length'

AttributeError: 'ChatGLMConfig' object has no attribute 'position_encoding_2d'

AttributeError: 'ChatGLMConfig' object has no attribute 'inner_hidden_size'

AttributeError: 'ChatGLMConfig' object has no attribute 'world_size'

ValueError: 150001 is not in list

OSError: We couldn't connect to 'https://huggingface.co' to load this file

Appendix: a config.json file that works


AttributeError: 'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'

This is caused by a Transformers version incompatibility. The quickest fix is to edit the tokenization script directly; see:

'ChatGLMTokenizer' object has no attribute 'sp_tokenizer' (CSDN blog post)
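For reference, the fix described in that post boils down to changing the order of operations in the tokenizer's __init__ inside the checkpoint's tokenization_chatglm.py: newer Transformers versions call vocabulary methods from PreTrainedTokenizer.__init__, so the SentencePiece wrapper has to exist before super().__init__() runs. A minimal sketch of the reordered constructor (names follow the original ChatGLM release; the file shipped with your checkpoint may differ, so treat this as an illustration rather than a drop-in patch):

# tokenization_chatglm.py -- only the constructor order changes
class ChatGLMTokenizer(PreTrainedTokenizer):
    def __init__(self, vocab_file, **kwargs):
        # Create the SentencePiece wrapper before the parent constructor runs:
        # PreTrainedTokenizer.__init__ in newer Transformers versions calls
        # get_vocab()/vocab_size, which need self.sp_tokenizer to exist.
        self.sp_tokenizer = SPTokenizer(vocab_file)
        super().__init__(**kwargs)  # originally called before sp_tokenizer was set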

AttributeError: 'ChatGLMConfig' object has no attribute 'max_sequence_length'

This is caused by a Transformers version incompatibility. The quickest fix is to edit the config file directly:

vim config.json
"max_sequence_length": 2048

AttributeError: 'ChatGLMConfig' object has no attribute 'position_encoding_2d'

This is caused by a Transformers version incompatibility. The quickest fix is to edit the config file directly:

vim config.json
"position_encoding_2d": true

AttributeError: 'ChatGLMConfig' object has no attribute 'inner_hidden_size'

This is caused by a Transformers version incompatibility. The quickest fix is to edit the config file directly:

vim config.json
"inner_hidden_size": 16384

AttributeError: 'ChatGLMConfig' object has no attribute 'world_size'

This is caused by a Transformers version incompatibility. The quickest fix is to edit the config file directly:

vim config.json
"world_size": 1

ValueError: 150001 is not in list

This seems related to the input. I did not track it down directly; switching to a different set of demo code made the problem go away. See:

https://github.com/THUDM/ChatGLM-6B/issues/394
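The workaround was simply swapping in different demo code. For reference, the standard chat usage from the official ChatGLM3-6B README looks roughly like the following (the local path is a placeholder, and a CUDA GPU is assumed):

from transformers import AutoTokenizer, AutoModel

model_path = "/path/to/chatglm3-6b"  # placeholder: local checkpoint directory
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda()
model = model.eval()

# model.chat() builds the chat prompt and handles special tokens internally.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)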

OSError: We couldn't connect to 'https://huggingface.co' to load this file

OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like None is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
/opt/anaconda3/envs/python39/lib/python3.9/tempfile.py:830: ResourceWarning: Implicitly cleaning up <TemporaryDirectory '/tmp/tmpq02ihr1b'>
  _warnings.warn(warn_message, ResourceWarning)

The local model path was wrong, so the code fell through to downloading the model from Hugging Face; since the server has no internet access, the connection failed. Double-check that the model path you passed in is correct!
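A quick way to catch this early is to verify the path and force offline mode before loading, so a bad path fails immediately instead of attempting to reach huggingface.co. A minimal sketch (the path is a placeholder):

import os

model_path = "/path/to/chatglm3-6b"  # placeholder: local checkpoint directory

# Force offline mode so Transformers never tries to reach huggingface.co.
os.environ["TRANSFORMERS_OFFLINE"] = "1"
os.environ["HF_HUB_OFFLINE"] = "1"

# Fail fast with a clear message if the path is wrong.
assert os.path.isfile(os.path.join(model_path, "config.json")), \
    f"{model_path} does not contain config.json -- check the model path"

from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(model_path, trust_remote_code=True)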

Appendix: a config.json file that works

{
  "_name_or_path": "THUDM/chatglm3-6b",
  "model_type": "chatglm",
  "architectures": [
    "ChatGLMModel"
  ],
  "auto_map": {
    "AutoConfig": "configuration_chatglm.ChatGLMConfig",
    "AutoModel": "modeling_chatglm.ChatGLMForConditionalGeneration",
    "AutoModelForCausalLM": "modeling_chatglm.ChatGLMForConditionalGeneration",
    "AutoModelForSeq2SeqLM": "modeling_chatglm.ChatGLMForConditionalGeneration",
    "AutoModelForSequenceClassification": "modeling_chatglm.ChatGLMForSequenceClassification"
  },
  "add_bias_linear": false,
  "add_qkv_bias": true,
  "apply_query_key_layer_scaling": true,
  "apply_residual_connection_post_layernorm": false,
  "attention_dropout": 0.0,
  "attention_softmax_in_fp32": true,
  "bias_dropout_fusion": true,
  "ffn_hidden_size": 13696,
  "fp32_residual_connection": false,
  "hidden_dropout": 0.0,
  "hidden_size": 4096,
  "kv_channels": 128,
  "layernorm_epsilon": 1e-05,
  "multi_query_attention": true,
  "multi_query_group_num": 2,
  "num_attention_heads": 32,
  "num_layers": 28,
  "original_rope": true,
  "padded_vocab_size": 65024,
  "post_layer_norm": true,
  "rmsnorm": true,
  "seq_length": 32768,
  "use_cache": true,
  "torch_dtype": "float16",
  "transformers_version": "4.27.1",
  "tie_word_embeddings": false,
  "eos_token_id": 2,
  "pad_token_id": 0,
  "max_sequence_length": 2048,
  "position_encoding_2d": true,
  "inner_hidden_size": 16384
}

Summary: ChatGLM3-6B itself has a few minor issues, and its compatibility across Transformers versions is an absolute mess!
