According to the docstring, mode='min' means the scheduler checks whether the val_loss passed to scheduler.step(val_loss) is still decreasing, and lowers the lr when it stops improving.
Here is what I wrote:
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.5, patience=2, verbose=True,
    threshold=0.0001, threshold_mode='rel', cooldown=0, min_lr=0, eps=1e-08)
for epoch in range(1, 32):
    train(model, device, train_loader, optimizer, epoch)
    test_loss = test(model, device, test_loader)
    scheduler.step(test_loss)  # pass the monitored metric, NOT the epoch number
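To see the mode='min' behavior in isolation, here is a minimal self-contained sketch (the model and the validation losses are made up for illustration): with patience=2, the scheduler halves the lr once the metric fed to step() has failed to improve for more than 2 consecutive epochs.

```python
import torch

# Dummy model/optimizer just so the scheduler has param groups to act on.
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.5, patience=2)

# Simulated validation losses that stop improving after epoch 3.
fake_val_losses = [1.0, 0.8, 0.6, 0.6, 0.6, 0.6]
for epoch, val_loss in enumerate(fake_val_losses, start=1):
    scheduler.step(val_loss)  # the scheduler only sees this metric
    print(epoch, optimizer.param_groups[0]['lr'])
# The lr stays at 0.1 while the loss improves, then drops to 0.05
# after the loss has plateaued for more than patience=2 epochs.
```

If you pass the epoch number instead, the "metric" increases every step, so with mode='min' the scheduler sees a never-improving value and keeps cutting the lr every patience+1 epochs, regardless of how training is actually going.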