Code:
from modelscope import AutoTokenizer, AutoModel, snapshot_download
# Download the model from ModelScope and load it in fp16 on the GPU
model_dir = snapshot_download("ZhipuAI/chatglm3-6b", revision="v1.0.0")
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
model = model.eval()
# First turn: "你好" ("Hello")
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
# Second turn, continuing the conversation: "晚上睡不着应该怎么办"
# ("What should I do if I can't sleep at night?")
response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history)
print(response)
Modified to load from a local path:
from modelscope import AutoTokenizer, AutoModel  # snapshot_download is no longer needed
model_dir = 'path/to/your/model'
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
model = model.eval()
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history)
print(response)
Error:
RuntimeError: Internal: src/sentencepiece_processor.cc(1101) [model_proto->ParseFromArray(serialized.data(), serialized.size())]
Possible causes & solutions:
1. Cause: the tokenizer.model file under chatglm3-6b is corrupted or incomplete.
Solution: verify its SHA-256 checksum (replace the path with the path to your local tokenizer.model):
sha256sum chatglm3-6b/tokenizer.model
The SHA-256 value should be:
e7dc4c393423b76e4373e5157ddc34803a0189ba96b21ddbb40269d31468a6f2
2. Cause: wrong local path.
Solution: check that the local model path `model_dir` points at the chatglm3-6b folder itself, e.g. path/chatglm3-6b.