I want to add the LoRA-related parameters to the config.yaml file, i.e. add:
train:
  lora:
    rank: 16
    alpha: 32
    dropout: 0.05
    bias: "none"
    task_type: "CAUSAL_LM"
    # use_dora:
    # init_lora_weights:
Problem 1
Problem description:
TypeError: __init__() got an unexpected keyword argument 'lora'
Analysis:
The train config dataclass does not declare a `lora` field, so the keyword loaded from the YAML is rejected by `__init__`.
Solution:
Add `lora: dict` at the end of the train parameter definitions, i.e.:
pipeline: str # One of the pipelines in framework.pipeline
orchestrator: str # One of the orchestrators
logging_dir: str = "logs"
checkpoint_dir: str = "ckpts"
project_name: str = "trlx"
seed: int = 1000
eval_only: bool = False
mixin: bool = False
max_grad_norm: float = -1
lora: dict
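The error in Problem 1 can be reproduced in isolation. The sketch below uses a hypothetical `MiniTrainConfig` (not the real trlx class) to show that a dataclass rejects keyword arguments that are not declared as fields:

```python
from dataclasses import dataclass

# Hypothetical minimal config, standing in for the trlx train config.
@dataclass
class MiniTrainConfig:
    seed: int = 1000

# Passing an undeclared field reproduces the error.
try:
    MiniTrainConfig(seed=1, lora={"rank": 16})
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'lora'
```

Declaring `lora: dict` as a field, as in the fix above, makes the generated `__init__` accept the keyword.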
Problem 2
Problem description:
TypeError: non-default argument 'lora' follows default argument
Analysis:
A parameter without a default value (`lora: dict`) was placed after parameters that have default values, which dataclasses do not allow.
Solution:
Reorder the parameters so that `lora` comes before the fields with defaults, i.e.:
pipeline: str # One of the pipelines in framework.pipeline
orchestrator: str # One of the orchestrators
lora: dict
logging_dir: str = "logs"
checkpoint_dir: str = "ckpts"
project_name: str = "trlx"
seed: int = 1000
eval_only: bool = False
mixin: bool = False
max_grad_norm: float = -1
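This ordering rule can also be checked standalone. A minimal sketch (hypothetical field names, not the real trlx config) showing that `@dataclass` raises as soon as a field without a default follows one with a default:

```python
from dataclasses import dataclass

# Wrong order: non-default field after a default field raises
# at class-creation time, inside the @dataclass decorator.
try:
    @dataclass
    class BadConfig:
        seed: int = 1000
        lora: dict  # non-default after default -> TypeError
except TypeError as e:
    print(e)  # non-default argument 'lora' follows default argument

# Correct order: non-default fields first.
@dataclass
class FixedConfig:
    lora: dict
    seed: int = 1000

cfg = FixedConfig(lora={"rank": 16})
print(cfg.seed)  # 1000
```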
Problem 3
Problem description:
AttributeError: 'dict' object has no attribute 'rank'
Analysis:
The YAML is parsed into a plain Python dict, which supports key indexing but not attribute access.
Solution:
Change the code from:
gpt_lora_config = LoraConfig(
    r=self.config.train.lora.rank,
    lora_alpha=self.config.train.lora.alpha,
    lora_dropout=self.config.train.lora.dropout,
    bias=self.config.train.lora.bias,
    task_type=self.config.train.lora.task_type,
)
to:
gpt_lora_config = LoraConfig(
    r=self.config.train.lora["rank"],
    lora_alpha=self.config.train.lora["alpha"],
    lora_dropout=self.config.train.lora["dropout"],
    bias=self.config.train.lora["bias"],
    task_type=self.config.train.lora["task_type"],
)
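An alternative to key indexing, sketched below with a hypothetical `LoraSettings` dataclass (not part of trlx): wrapping the parsed dict in a dataclass restores attribute access, so the original `self.config.train.lora.rank` style would also work:

```python
from dataclasses import dataclass

# Hypothetical wrapper for the lora section of the YAML config.
@dataclass
class LoraSettings:
    rank: int = 16
    alpha: int = 32
    dropout: float = 0.05
    bias: str = "none"
    task_type: str = "CAUSAL_LM"

# As loaded from config.yaml, the lora section arrives as a plain dict.
raw = {"rank": 16, "alpha": 32, "dropout": 0.05,
       "bias": "none", "task_type": "CAUSAL_LM"}

lora = LoraSettings(**raw)
print(lora.rank)    # 16  (attribute access now works)
print(raw["rank"])  # 16  (dict access, as in the fix above)
```

This keeps the config-reading code uniform if the rest of the codebase accesses settings as attributes.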