learning_rate = get_learning_rate(batch)
First, let's look at the concrete implementation:
def get_learning_rate(batch):
    learning_rate = tf.train.exponential_decay(
        BASE_LEARNING_RATE,   # Base learning rate.
        batch * BATCH_SIZE,   # Current index into the dataset.
        DECAY_STEP,           # Decay step.
        DECAY_RATE,           # Decay rate.
        staircase=True)
    learning_rate = tf.maximum(learning_rate, 0.00001)  # CLIP THE LEARNING RATE!
    return learning_rate
The input argument is batch, which in this project represents the current training step.
Like get_bn_decay (which starts at 0.5 and changes slowly according to an exponential rule), this also uses an exponential schedule, but here the learning rate decays from BASE_LEARNING_RATE as training proceeds; tf.maximum then clips it to a floor of 0.00001 so it never drops below that value.
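To make the behaviour concrete, here is a minimal pure-Python sketch of the same staircase schedule. The hyperparameter values below are placeholders chosen for illustration; the project's actual BASE_LEARNING_RATE, BATCH_SIZE, DECAY_STEP and DECAY_RATE are defined elsewhere in the training script and may differ.

# Minimal sketch of the staircase schedule computed by tf.train.exponential_decay,
# with assumed (hypothetical) hyperparameter values for illustration only.
BASE_LEARNING_RATE = 0.001   # assumed base learning rate
BATCH_SIZE = 32              # assumed batch size
DECAY_STEP = 200000          # assumed: decay once per 200k samples seen
DECAY_RATE = 0.7             # assumed: multiply the rate by 0.7 at each boundary
MIN_LEARNING_RATE = 0.00001  # floor applied by tf.maximum in get_learning_rate

def learning_rate_at(batch):
    """Pure-Python equivalent of the schedule for a given train step."""
    samples_seen = batch * BATCH_SIZE
    # staircase=True means the exponent is an integer (floor division),
    # so the rate stays constant between decay boundaries.
    lr = BASE_LEARNING_RATE * DECAY_RATE ** (samples_seen // DECAY_STEP)
    return max(lr, MIN_LEARNING_RATE)

for step in (0, 10000, 50000, 100000, 500000):
    print(step, learning_rate_at(step))

With these assumed values, the printed rates decay from 0.001 and eventually hit the 0.00001 floor, which is exactly what the tf.maximum clip guarantees: the learning rate keeps shrinking but never reaches zero.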