Four learning rate decay strategies in PyTorch
MultiStepLR()
torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones, gamma=0.1, last_epoch=-1)
drop_after_epoch = [3, 5, 7]
scheduler = optim.lr_scheduler.MultiStepLR(optimizer, milestones=drop_after_epoch, gamma=0.333, last_epoch=-1)
Notes:
1) milestones is a list of increasing epoch indices; each time the epoch counter reaches one of them, the learning rate of every parameter group is multiplied by gamma.
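The decay behavior can be checked with a minimal, self-contained sketch (the linear model and initial learning rate of 0.1 are placeholders, not from the original post); it records the learning rate after each epoch and shows it dropping by a factor of gamma at epochs 3, 5, and 7:

```python
import torch
from torch import nn, optim

# Tiny placeholder model so the optimizer has parameters to manage.
model = nn.Linear(4, 2)
optimizer = optim.SGD(model.parameters(), lr=0.1)

# LR is multiplied by gamma each time the epoch count hits a milestone.
drop_after_epoch = [3, 5, 7]
scheduler = optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=drop_after_epoch, gamma=0.333)

lrs = []
for epoch in range(9):
    # ... a real training loop would compute loss and call backward() here ...
    optimizer.step()      # update parameters first
    scheduler.step()      # then advance the schedule by one epoch
    lrs.append(scheduler.get_last_lr()[0])

print(lrs)  # 0.1 until epoch 3, then *0.333 at epochs 3, 5, and 7
```

Note the ordering: since PyTorch 1.1, `scheduler.step()` should be called after `optimizer.step()`, or the first scheduled decay is effectively skipped.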
Original post · 2021-10-28