A Summary of Ways to Set the learning_rate in TensorFlow

1. Setting a piecewise-constant learning rate

The flag definitions and the learning-rate construction are as follows:

import tensorflow as tf

tf.app.flags.DEFINE_float('learning_rate', 1e-3, 'Initial learning rate.')
tf.app.flags.DEFINE_float('end_learning_rate', 0.000001,
                          'The minimal end learning rate used by a polynomial decay learning rate.')
# For piecewise_constant learning rate decay.
tf.app.flags.DEFINE_string('decay_boundaries', '2000, 80000, 100000',
                           'Learning rate decay boundaries by global_step (comma-separated list).')
tf.app.flags.DEFINE_string('lr_decay_factors', '0.1, 1, 0.1, 0.01',
                           'The learning_rate decay factor for each segment between boundaries '
                           '(comma-separated list, one more entry than decay_boundaries).')

global_step = tf.train.get_or_create_global_step()

# One learning-rate value per segment: each factor scales the initial learning_rate.
lr_values = [params['learning_rate'] * decay for decay in params['lr_decay_factors']]
learning_rate = tf.train.piecewise_constant(tf.cast(global_step, tf.int32),
                                            [int(_) for _ in params['decay_boundaries']],
                                            lr_values)
# Never let the learning rate fall below end_learning_rate.
truncated_learning_rate = tf.maximum(learning_rate,
                                     tf.constant(params['end_learning_rate'], dtype=learning_rate.dtype),
                                     name='learning_rate')
# Create a tensor named learning_rate for logging purposes.
tf.summary.scalar('learning_rate', truncated_learning_rate)
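
The flags above store the boundaries and decay factors as comma-separated strings, while the snippet indexes params with ready-made lists. A minimal sketch of the glue code is shown below; the parse_comma_list helper and the params dict construction are assumptions for illustration, not part of the original:

FLAGS = tf.app.flags.FLAGS

def parse_comma_list(value, cast):
    # Split a comma-separated flag string and cast each element,
    # tolerating the stray spaces seen in the default values.
    return [cast(v.strip()) for v in value.split(',')]

params = {
    'learning_rate': FLAGS.learning_rate,
    'end_learning_rate': FLAGS.end_learning_rate,
    'decay_boundaries': parse_comma_list(FLAGS.decay_boundaries, int),
    'lr_decay_factors': parse_comma_list(FLAGS.lr_decay_factors, float),
}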

Note (the example below is from the tf.train.piecewise_constant docstring):

Example: use a learning rate that's 1.0 for the first 100001 steps, 0.5 for the next 10000 steps, and 0.1 for any additional steps.

global_step = tf.Variable(0, trainable=False)
boundaries = [100000, 110000]
values = [1.0, 0.5, 0.1]
learning_rate = tf.train.piecewise_constant(global_step, boundaries, values)
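
Whichever schedule is used, the resulting learning-rate tensor is handed to the optimizer in the same way. A minimal sketch, assuming a loss tensor named loss and a MomentumOptimizer (both placeholders, not taken from the original):

# `loss` is assumed to be the model's training loss defined elsewhere.
optimizer = tf.train.MomentumOptimizer(learning_rate=truncated_learning_rate,
                                       momentum=0.9)
train_op = optimizer.minimize(loss, global_step=global_step)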

2. Exponential decay
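
The body of this section is missing from the source. As a minimal sketch, the exponential approach in TF1 typically uses tf.train.exponential_decay, which computes learning_rate * decay_rate ^ (global_step / decay_steps); the concrete numbers below are illustrative, not from the original:

import tensorflow as tf

global_step = tf.train.get_or_create_global_step()

# Multiply the initial rate by decay_rate once every decay_steps steps
# (staircase=True keeps the rate constant within each interval).
learning_rate = tf.train.exponential_decay(learning_rate=1e-3,
                                           global_step=global_step,
                                           decay_steps=10000,
                                           decay_rate=0.96,
                                           staircase=True)
tf.summary.scalar('learning_rate', learning_rate)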
