Notes on getting ChatGLM-6B running on Ubuntu with CUDA 10.2
- The Python version must be at least 3.7
Otherwise pip installs an outdated transformers, which fails with KeyError: 'chatglm':
Traceback (most recent call last):
  File "test.py", line 2, in <module>
    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
  File "/usr/local/lib/python3.6/dist-packages/transformers/models/auto/tokenization_auto.py", line 390, in from_pretrained
    config = AutoConfig.from_pretrained(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/transformers/models/auto/configuration_auto.py", line 400, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
KeyError: 'chatglm'
https://github.com/THUDM/ChatGLM-6B/issues/913
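A minimal sanity check for the constraint above, run before installing transformers. The (3, 7) floor is taken from the traceback (the failing install was on Python 3.6); the exact minimum your transformers release needs may differ, so treat it as an assumption:

```python
import sys

# Sanity-check the interpreter before installing transformers.
# On Python 3.6, pip resolves to an old transformers whose CONFIG_MAPPING
# has no 'chatglm' entry, which is exactly the KeyError shown above.
def check_python(min_version=(3, 7)):
    if sys.version_info < min_version:
        raise RuntimeError(
            "Python %d.%d is too old; need >= %d.%d for a recent transformers"
            % (tuple(sys.version_info[:2]) + min_version)
        )
    return True

print(check_python())
```

Running this under the Python 3.6 interpreter from the traceback would raise the RuntimeError instead of printing True.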
- The CUDA and PyTorch versions must match each other
For example, torch 1.12.1 works with CUDA 10.2; install it with:
pip install torch==1.12.1+cu102 -f https://download.pytorch.org/whl/torch_stable.html
Reference:
https://blog.csdn.net/xu_guo_jie/article/details/130172610
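The pip command above pins a CUDA-specific wheel by appending a +cu102 local version tag. A small sketch of how that pairing can be checked before installing; the version table here is an assumption covering only a few releases I believe shipped cu102 wheels, so consult the official PyTorch previous-versions page for the authoritative list:

```python
# Torch releases assumed (not verified exhaustively) to have cu102 wheels.
CU102_TORCH = {"1.9.1", "1.10.2", "1.11.0", "1.12.1"}

def wheel_spec(torch_version, cuda="cu102"):
    """Build the pip version spec for a CUDA-specific torch wheel."""
    if torch_version not in CU102_TORCH:
        raise ValueError(
            f"torch {torch_version} has no {cuda} wheel (per the table above)"
        )
    return f"torch=={torch_version}+{cuda}"

print(wheel_spec("1.12.1"))  # torch==1.12.1+cu102
```

The resulting spec is what gets passed to pip together with -f https://download.pytorch.org/whl/torch_stable.html, which tells pip where to find these CUDA-tagged wheels.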
To be continued…