The LambdaLR class: torch.optim.lr_scheduler
It lets you define a custom learning-rate curve. The official code is as follows:
class LambdaLR(_LRScheduler):
    """Sets the learning rate of each parameter group to the initial lr
    times a given function. When last_epoch=-1, sets initial lr as lr.

    Args:
        optimizer (Optimizer): Wrapped optimizer.
        lr_lambda (function or list): A function which computes a multiplicative
            factor given an integer parameter epoch, or a list of such
            functions, one for each group in optimizer.param_groups.
        last_epoch (int): The index of last epoch. Default: -1.

    Example:
        >>> # Assuming optimizer has two groups.
        >>> lambda1 = lambda epoch: epoch // 30
        >>> lambda2 = lambda epoch: 0.95 ** epoch
        >>> scheduler = LambdaLR(optimizer, lr_lambda=[lambda1, lambda2])
        >>> for epoch in range(100):
        >>>     train(...)
        >>>     validate(...)
        >>>     scheduler.step()
    """
    def __init__(self, optimizer, lr_lambda, last_epoch=-1):
        ...
Here, optimizer is an optimizer instance (e.g. an SGD instance), and lr_lambda is a function, or a list of functions (one per parameter group), that takes the current epoch and returns a multiplicative factor applied to the initial lr. last_epoch is the index of the last completed epoch: for example, if training was interrupted after epoch 2 and is then resumed, pass last_epoch=2, and the scheduler will increment it internally.
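To make this concrete, here is a minimal runnable sketch using the two lambdas from the docstring above. The dummy parameters and the base lr of 0.1 are illustrative choices, not values from the source:

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import LambdaLR

# Two parameter groups sharing the same base lr; each lambda scales its group.
p1 = torch.nn.Parameter(torch.zeros(1))
p2 = torch.nn.Parameter(torch.zeros(1))
optimizer = SGD([{"params": [p1]}, {"params": [p2]}], lr=0.1)

# Group 0: step decay every 30 epochs; group 1: exponential decay per epoch.
scheduler = LambdaLR(optimizer, lr_lambda=[lambda e: e // 30,
                                           lambda e: 0.95 ** e])

for epoch in range(3):
    optimizer.step()      # params have no grads here, so this is a no-op
    scheduler.step()

# After 3 epochs: group 0 lr = 0.1 * (3 // 30) = 0.0,
#                 group 1 lr = 0.1 * 0.95 ** 3
print([g["lr"] for g in optimizer.param_groups])
```

Note that each call to scheduler.step() advances the internal epoch counter and recomputes every group's lr as initial_lr * lr_lambda(epoch).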
import logging
import math
from torch.optim.lr_scheduler import LambdaLR
logger = logging.getLogger(__name__)
class ConstantLRSchedule(LambdaLR):  # constant learning rate
    """ Constant learning rate schedule.
    """
    def __init__(self, optimizer, last_epoch=-1):
        super(ConstantLRSchedule, self).__init__(optimizer, lambda _: 1.0, last_epoch=last_epoch)