Learning-rate scheduling strategies
1. StepLR: decay the learning rate at a fixed epoch interval
2. MultiStepLR: decay the learning rate at specified milestone epochs
3. ExponentialLR: decay the learning rate exponentially every epoch
4. CosineAnnealingLR: adjust the learning rate on a cosine schedule
import torch
import matplotlib.pyplot as plt
import torch.optim as optim
model=torch.nn.Linear(3,4)
optimizer=optim.Adam(model.parameters())
### StepLR: decay the learning rate at a fixed interval -- here every 50 epochs, lr = 0.1 * lr
# scheduler_lr = optim.lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.1)
### MultiStepLR: decay the learning rate at the given milestone epochs
# milestones = [50, 125, 150]
# scheduler_lr = optim.lr_scheduler.MultiStepLR(optimizer, milestones=milestones, gamma=0.1)
### ExponentialLR: decay the learning rate exponentially (lr = lr * gamma every epoch)
# gamma = 0.95
# scheduler_lr = optim.lr_scheduler.ExponentialLR(optimizer, gamma=gamma)
### CosineAnnealingLR: anneal the learning rate along a cosine curve over T_max epochs, down to eta_min
# t_max = 50
# scheduler_lr = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=t_max, eta_min=0.)
### LambdaLR: custom schedule via a user-supplied function of the epoch index
# Halve the learning rate every 20 epochs (the interval is assumed; the original line was truncated)
lambda1 = lambda epoch: 0.5 ** (epoch // 20)
scheduler_lr = optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda1)
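None of the schedulers above changes the learning rate until `scheduler.step()` is called once per epoch, after `optimizer.step()`. A minimal sketch of that driving loop, using StepLR with an assumed initial learning rate of 0.1 so the decay steps are easy to see, and recording the rate each epoch so the curve can be plotted with matplotlib:

```python
import torch
import torch.optim as optim

model = torch.nn.Linear(3, 4)
# An initial lr of 0.1 is assumed here for illustration.
optimizer = optim.Adam(model.parameters(), lr=0.1)
scheduler_lr = optim.lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.1)

lr_history = []
for epoch in range(200):
    optimizer.step()       # the actual training step would go here
    scheduler_lr.step()    # advance the schedule once per epoch
    lr_history.append(scheduler_lr.get_last_lr()[0])

print(lr_history[0], lr_history[49], lr_history[149])
```

Passing `lr_history` to `plt.plot` then shows the staircase decay: the rate stays at 0.1 for the first 50 epochs, then drops by a factor of 10 every 50 epochs.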