ADAPTIVE GRADIENT METHODS WITH DYNAMIC BOUND OF LEARNING RATE


Abstract:

Vocabulary: 1. element-wise scaling term 2. gradual and smooth transition 3. prototypes 4. non-adaptive counterparts 5. portion 6. plateaus

 

Phrase: in spite of its simplicity

1 INTRODUCTION

Vocabulary: 7. state-of-the-art 8. wherein 9. instances 10. dominant 11. scales the gradient uniformly 12. sparse 13. empirical 14. abate 15. elucidate 16. scale-down term 17. constant 18. prototypes
 

Methods: named AdaBound and AMSBound

We employ dynamic bounds on learning rates in these adaptive methods, where the lower and upper bounds are initialized as zero and infinity respectively, and both smoothly converge to a constant final step size. The new variants can be regarded as adaptive methods at the beginning of training, and they gradually and smoothly transform into SGD (or SGD with momentum) as the time step increases.
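The idea above can be sketched in a few lines of numpy: compute the usual Adam-style per-element step size, then clip it into a band that narrows toward a fixed final learning rate. This is a minimal sketch, not the paper's exact pseudocode; the bound schedule below (controlled by `gamma` and `final_lr`) follows the commonly used form, and all names are assumptions for illustration.

```python
import numpy as np

def adabound_step(param, grad, m, v, t, alpha=1e-3, betas=(0.9, 0.999),
                  final_lr=0.1, gamma=1e-3, eps=1e-8):
    """One AdaBound-style update (illustrative sketch).

    t is the 1-based time step; gamma controls how quickly the bounds
    converge to final_lr. Parameter names are assumed, not official.
    """
    b1, b2 = betas
    # Adam-style first and second moment estimates
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    # Bias-corrected base step size, as in Adam
    step = alpha * np.sqrt(1 - b2 ** t) / (1 - b1 ** t)
    # Dynamic bounds: start near [0, inf) and both converge to final_lr
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    # Element-wise learning rate, clipped into [lower, upper]
    lr = np.clip(step / (np.sqrt(v) + eps), lower, upper)
    param = param - lr * m
    return param, m, v
```

Early in training the band `[lower, upper]` is so wide that the update is essentially Adam's; as `t` grows, both bounds squeeze toward `final_lr`, so every coordinate ends up taking the same constant step size, i.e. the method behaves like SGD.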
 

2 NOTATIONS AND PRELIMINARIES (brief notes)

Vocabulary: 1. coordinate 2. elementwise

Experiments on CNNs: DenseNet-121 and ResNet-34 on the CIFAR-10 dataset
