Problem
When fine-tuning a large model with LoRA, the terminal reports the following error:
Traceback (most recent call last):
  File "/usr/local/bin/llamafactory-cli", line 33, in <module>
    sys.exit(load_entry_point('llamafactory==0.7.2.dev0', 'console_scripts', 'llamafactory-cli')())
  File "/usr/local/lib/python3.10/site-packages/llamafactory-0.7.2.dev0-py3.10.egg/llamafactory/cli.py", line 65, in main
    run_exp()
  File "/usr/local/lib/python3.10/site-packages/llamafactory-0.7.2.dev0-py3.10.egg/llamafactory/train/tuner.py", line 34, in run_exp
    run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
  File "/usr/local/lib/python3.10/site-packages/llamafactory-0.7.2.dev0-py3.10.egg/llamafactory/train/sft/workflow.py", line 33, in run_sft
    dataset = get_dataset(model_args, data_args, training_args, stage="sft", **tokenizer_module)
  File "/usr/local/lib/python3.10/site-packages/llamafactory-0.7.2.dev0-py3.10.egg/llamafactory/data/loader.py", line 123, in get_dataset
    template = get_template_and_fix_tokenizer(tokenizer, data_args.template)
  File "/usr/local/lib/python3.10/site-packages/llamafactory-0.7.2.dev0-py3.10.egg/llamafactory/data/template.py", line 352, in get_template_and_fix_tokenizer
    raise ValueError("Template {} does not exist.".format(name))
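As the traceback shows, the error is raised in get_template_and_fix_tokenizer when the name passed through the `template` argument (data_args.template) is not found in the template registry. The lookup pattern behind the error can be sketched as follows; the names TEMPLATES, register_template, and get_template here are illustrative, not LLaMA-Factory's actual internals:

```python
# Illustrative registry lookup mirroring the failure in the traceback.
# These names are hypothetical, not LLaMA-Factory's real implementation.
TEMPLATES = {}

def register_template(name, **kwargs):
    # Store the template configuration under its name.
    TEMPLATES[name] = kwargs

def get_template(name):
    # An unregistered name produces the same ValueError as above.
    if name not in TEMPLATES:
        raise ValueError("Template {} does not exist.".format(name))
    return TEMPLATES[name]

register_template("model_name", format_user="<用户>{{content}}<AI>")
```

So the fix is simply to make sure the name you pass as `template` has a matching registration.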
Solution
Run in the terminal:
vim /usr/local/lib/python3.10/site-packages/llamafactory-0.7.2.dev0-py3.10.egg/llamafactory/data/template.py
Then add the template at the end of the file:
_register_template(
    name="model_name",  # replace with the template name your model expects
    format_user=StringFormatter(slots=["<用户>{{content}}<AI>"]),
    format_system=StringFormatter(slots=[{"bos_token"}, "{{content}}"]),
    force_system=True,
)
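In the slots above, a plain string is emitted literally with {{content}} replaced by the message text, while a set slot such as {"bos_token"} refers to the tokenizer's special token. A minimal sketch of the string-slot substitution (render_slot is a hypothetical helper, not part of LLaMA-Factory):

```python
# Sketch of how a plain-string slot like "<用户>{{content}}<AI>" is rendered:
# the {{content}} placeholder is swapped for the actual message.
# Set slots such as {"bos_token"} instead resolve to tokenizer special
# tokens and are not handled by this simplified helper.
def render_slot(slot: str, content: str) -> str:
    return slot.replace("{{content}}", content)

print(render_slot("<用户>{{content}}<AI>", "你好"))  # → <用户>你好<AI>
```

After registering the template, rerun training with the same name in the `template` argument (e.g. --template model_name) so that data_args.template matches the registered entry.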