I have wanted to implement this while tuning hyperparameters for a while: for example, train for 300 epochs, decay the learning rate at epoch 100, and then again at epoch 150. Most existing implementations only expose an "every" interval, i.e. the learning rate is multiplied by 0.1 every fixed number of epochs. Here I use Python's deque (queue) structure instead, so that any number of decay epochs can be specified and the learning rate is decayed whenever one of those points is reached.
from collections import deque

queue = deque([1, 4, 6])
lr = 1e-1
for epoch in range(10):
    if len(queue) >= 1:
        if epoch == queue[0]:
            lr *= 0.1
            queue.popleft()
    '''training one epoch'''
    print(epoch, lr)

'''
0 0.1
1 0.010000000000000002
2 0.010000000000000002
3 0.010000000000000002
4 0.0010000000000000002
5 0.0010000000000000002
6 0.00010000000000000003
7 0.00010000000000000003
8 0.00010000000000000003
9 0.00010000000000000003
'''
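To use this in the actual training script, the deque only needs to be built once before the training loop starts. A minimal sketch, assuming a hypothetical command-line flag --lr-decay-points that lists the decay epochs (the flag name is my own and not part of the original code):

import argparse
from collections import deque

parser = argparse.ArgumentParser()
# hypothetical flag: any number of decay epochs can be passed, e.g. --lr-decay-points 100 150
parser.add_argument('--lr-decay-points', nargs='+', type=int, default=[100, 150])
args = parser.parse_args()

# deque consumed by the training loop below; sorted so the points are hit in order
decay_point = deque(sorted(args.lr_decay_points))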
for epoch in range(args.start_epoch, args.max_epochs + 1):
    # setting to train mode
    fasterRCNN.train()
    loss_temp = 0
    start = time.time()

    # decay the learning rate whenever the current epoch reaches the next decay point
    if len(decay_point) >= 1:
        if epoch == decay_point[0]:
            adjust_learning_rate(optimizer, args.lr_decay_gamma)
            lr *= args.lr_decay_gamma
            decay_point.popleft()
    # if epoch % (args.lr_decay_step + 1) == 0:  # original fixed-interval decay, replaced by the deque logic above

    data_iter = iter(dataloader)
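adjust_learning_rate is called above but not defined here; in the faster-rcnn.pytorch code it is a small helper that rescales the learning rate of every parameter group. A sketch of such a helper, which may differ slightly from the exact definition in your copy of the repository:

def adjust_learning_rate(optimizer, decay=0.1):
    # multiply the learning rate of every parameter group by the decay factor
    for param_group in optimizer.param_groups:
        param_group['lr'] = decay * param_group['lr']

As an aside, PyTorch's built-in torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[100, 150], gamma=0.1) implements the same milestone-style decay without a hand-rolled queue.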