Reference link: How to adjust learning rate
Experiment example: class torch.optim.lr_scheduler.LambdaLR
Experiment example: class torch.optim.lr_scheduler.MultiplicativeLR
Experiment example: class torch.optim.lr_scheduler.StepLR
Experiment example: class torch.optim.lr_scheduler.MultiStepLR
Experiment example: class torch.optim.lr_scheduler.ExponentialLR
Experiment example: class torch.optim.lr_scheduler.CosineAnnealingLR
Experiment example: class torch.optim.lr_scheduler.ReduceLROnPlateau
Experiment example: class torch.optim.lr_scheduler.CyclicLR
Experiment example: class torch.optim.lr_scheduler.OneCycleLR
Experiment example: class torch.optim.lr_scheduler.CosineAnnealingWarmRestarts
Companion code download link: 测试学习率调度器.zip
How to adjust learning rate
torch.optim.lr_scheduler provides several methods to adjust the learning rate
based on the number of epochs. torch.optim.lr_scheduler.ReduceLROnPlateau
allows dynamic learning rate reducing based on some validation measurements.
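For a concrete picture of ReduceLROnPlateau, here is a minimal runnable sketch (the toy parameter, the constant stand-in validation loss, and the factor/patience values are illustrative assumptions, not part of the quoted documentation). The scheduler multiplies the learning rate by factor once the monitored metric has failed to improve for more than patience epochs:

import torch

param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
# mode='min': the monitored metric is expected to decrease.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.1, patience=2)

for epoch in range(10):
    # train(...) would run here and call optimizer.step() internally.
    val_loss = 1.0  # stand-in for validate(...) (assumption)
    scheduler.step(val_loss)  # pass the monitored metric to step()
    print(epoch, optimizer.param_groups[0]['lr'])

Because the fake loss never improves, the learning rate is cut by factor=0.1 each time patience is exceeded: 0.1, then 0.01, then 0.001, and so on.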
Learning rate scheduling should be applied after optimizer’s update;
e.g., you should write your code this way:
>>> scheduler = ...
>>> for epoch in range(100):
>>>     train(...)
>>>     validate(...)
>>>     scheduler.step()
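The skeleton above elides the training details; below is one way to flesh it out into a complete runnable loop (the toy linear model, random data, and StepLR settings are assumptions added for illustration, not part of the quoted documentation):

import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Decay the learning rate by gamma every step_size epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.5)

inputs = torch.randn(8, 10)
targets = torch.randn(8, 1)
loss_fn = torch.nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()   # update the parameters first...
    scheduler.step()   # ...then advance the schedule, once per epoch

Note the ordering inside the loop: optimizer.step() runs before scheduler.step(), exactly as the documentation requires.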
Warning
Prior to PyTorch 1.1.0, the learning rate scheduler was expected to be called
before the optimizer’s update; 1.1.0 changed this behavior in a BC-breaking
way. If you use the learning rate scheduler (calling scheduler.step()) before
the optimizer’s update (calling optimizer.step()), this will skip the first
value of the learning rate schedule. If you are unable to reproduce results
after upgrading to PyTorch 1.1.0, please check if you are calling
scheduler.step() at the wrong time.
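One way to verify the call order is to print the learning rate actually in effect before each parameter update (this check is my own illustration, not from the documentation). With the correct ordering, the first update uses the initial learning rate; if scheduler.step() preceded optimizer.step(), the first value of the schedule would be skipped:

import torch

param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.1)

for epoch in range(3):
    # lr in effect for this epoch's update:
    # 0.1, then 0.01, then 0.001 (up to float rounding)
    print(epoch, optimizer.param_groups[0]['lr'])
    optimizer.step()   # correct order: optimizer update first
    scheduler.step()   # then the scheduler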