Environment:
pytorch_lightning==1.9.4
Problem:
Passing auto_lr_find=True to the Trainer constructor, or calling
Trainer.fit(model, auto_lr_find=True), either has no effect or raises an error.
Solution:
Import LearningRateFinder, subclass it, and pass an instance to the Trainer's
callbacks list. The following example is taken from the official documentation:
# Customize LearningRateFinder callback to run at different epochs.
# This feature is useful while fine-tuning models.
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import LearningRateFinder


class FineTuneLearningRateFinder(LearningRateFinder):
    def __init__(self, milestones, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.milestones = milestones

    def on_fit_start(self, *args, **kwargs):
        # Skip the default LR search at the very start of fit();
        # the search is triggered per-epoch below instead.
        return

    def on_train_epoch_start(self, trainer, pl_module):
        if trainer.current_epoch in self.milestones or trainer.current_epoch == 0:
            self.lr_find(trainer, pl_module)


trainer = Trainer(callbacks=[FineTuneLearningRateFinder(milestones=(5, 10))])
trainer.fit(...)
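The trigger condition in on_train_epoch_start can be sketched in plain Python, without pytorch_lightning installed: with milestones=(5, 10), the LR search runs at epoch 0 and again at epochs 5 and 10. The helper name below is hypothetical, used only to illustrate the condition.

```python
# Standalone sketch of the trigger condition used in on_train_epoch_start:
# the LR search fires at epoch 0 and at every epoch listed in `milestones`.
def should_run_lr_find(current_epoch, milestones=(5, 10)):
    return current_epoch in milestones or current_epoch == 0

# Which of the first 12 epochs would trigger a search:
triggered = [e for e in range(12) if should_run_lr_find(e)]
print(triggered)  # [0, 5, 10]
```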
If you already use other callbacks, simply add them to the same callbacks list
passed to the Trainer:

trainer = Trainer(
    callbacks=[FineTuneLearningRateFinder(milestones=(5, 10)), checkpoint_callback],
)

Then run trainer.fit() as usual, and the automatic learning-rate search will
run at the configured epochs.
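For completeness, checkpoint_callback above is assumed to be a ModelCheckpoint instance. A minimal sketch of such a configuration follows; the monitored metric name "val_loss" is an assumption, so substitute whatever metric your LightningModule actually logs.

```python
from pytorch_lightning.callbacks import ModelCheckpoint

# Hypothetical checkpoint configuration: keeps the single best checkpoint
# ranked by the logged "val_loss" metric (assumed name; adjust to your module).
checkpoint_callback = ModelCheckpoint(
    monitor="val_loss",
    mode="min",
    save_top_k=1,
)
```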