Besides using TensorFlow's built-in learning_rate schedules, we can also define a custom learning_rate schedule. Thanks to eager execution in TensorFlow 2.0, this change is very simple:
import tensorflow as tf

def lr_fn(epoch, base_lr=1e-4):
    # Change the learning rate based on your own strategy;
    # as a placeholder example, decay by 10x every 10 epochs.
    return base_lr * 0.1 ** (epoch // 10)

for epoch in range(epochs):
    for step, (img, heatmap, paf) in enumerate(dataset):
        with tf.GradientTape() as tape:
            outputs = model(img, training=True)
            loss = compute_loss(outputs, heatmap, paf)  # compute_loss stands in for your own loss
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.learning_rate = lr_fn(epoch)  # update the lr before applying gradients
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
All we need to do is update the optimizer's learning_rate before each apply_gradients call. Simple and effective.
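For comparison, the built-in route mentioned at the start works by subclassing tf.keras.optimizers.schedules.LearningRateSchedule and passing an instance to the optimizer, which then computes the learning rate automatically at every step. Below is a minimal sketch; StepDecay and its decay rule are illustrative assumptions, not the document's code:

import tensorflow as tf

class StepDecay(tf.keras.optimizers.schedules.LearningRateSchedule):
    # Illustrative schedule: decay the learning rate by 10x every decay_steps steps.
    def __init__(self, base_lr=1e-4, decay_steps=10000):
        self.base_lr = base_lr
        self.decay_steps = decay_steps

    def __call__(self, step):
        # step is the optimizer's current iteration count (an int tensor)
        exponent = tf.cast(step // self.decay_steps, tf.float32)
        return self.base_lr * tf.pow(0.1, exponent)

optimizer = tf.keras.optimizers.Adam(learning_rate=StepDecay())

Note that a LearningRateSchedule is called with the step (iteration) count rather than the epoch, so the manual approach above is more convenient when your strategy is naturally epoch-based.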