  File "/home/finetunellm/main.py", line 18, in <module>
    tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
  File "/home/anaconda3/envs//lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 733, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class ChatGLMTokenizer does not exist or is not currently imported.
The line that triggers the error:
tokenizer = AutoTokenizer.from_pretrained("yourpath")
The fix is to pass trust_remote_code=True, so that transformers is allowed to load the custom tokenizer class shipped inside the checkpoint:
tokenizer = AutoTokenizer.from_pretrained("yourpath", trust_remote_code=True)
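The flag is needed because ChatGLM does not register ChatGLMTokenizer inside the transformers library itself; the tokenizer and model classes live in Python files bundled with the checkpoint, and trust_remote_code=True tells the Auto classes to import and run that code. A minimal loading sketch, where "yourpath" remains a placeholder for your local ChatGLM checkpoint directory:

```python
from transformers import AutoTokenizer, AutoModel

MODEL_DIR = "yourpath"  # placeholder: local ChatGLM checkpoint directory

# trust_remote_code=True lets transformers import the custom
# ChatGLMTokenizer / model classes bundled inside the checkpoint repo,
# instead of looking for them in the transformers package itself.
tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_DIR, trust_remote_code=True)
```

Note that this flag executes arbitrary Python shipped with the model, so only set it for checkpoints you trust.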
One more point worth mentioning: when installing the following packages, it is best to install them straight from their GitHub repositories, which avoids some odd version-compatibility problems:
pip install git+https://github.com/huggingface/peft.git
pip install git+https://github.com/huggingface/accelerate.git
pip install git+https://github.com/huggingface/transformers.git
pip install git+https://github.com/TimDettmers/bitsandbytes.git