Several Gradient Methods

BGD (Batch Gradient Descent)

SGD (Stochastic Gradient Descent)

MBGD (Mini-batch Gradient Descent)
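
The first three variants share the same update rule w ← w − η·∇L(w) and differ only in how much data forms each gradient estimate: the full dataset (BGD), a single sample (SGD), or a small random subset (MBGD). A minimal sketch on a toy 1-D least-squares fit — the data, learning rate, and step counts below are illustrative assumptions:

```python
import random

# Toy data for a 1-D linear fit y = w * x; the true weight is w = 2.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

def grad(w, batch):
    # Gradient of the mean squared error (1/n) * sum((w*x - y)^2) over the batch.
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def descend(batch_size, lr=0.02, steps=200, seed=0):
    rng = random.Random(seed)
    data = list(zip(xs, ys))
    w = 0.0
    for _ in range(steps):
        # BGD: the whole dataset; SGD: one sample; MBGD: a small random subset.
        batch = data if batch_size >= len(data) else rng.sample(data, batch_size)
        w -= lr * grad(w, batch)
    return w

w_bgd = descend(batch_size=4)   # batch gradient descent
w_sgd = descend(batch_size=1)   # stochastic gradient descent
w_mbgd = descend(batch_size=2)  # mini-batch gradient descent
```

All three recover w ≈ 2 here; in practice SGD and MBGD trade noisier steps for far cheaper iterations on large datasets.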

MGD (Momentum Gradient Descent)
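
Momentum replaces the raw gradient step with a decaying accumulation of past gradients, which damps oscillation and speeds travel along consistent descent directions. A minimal sketch on an assumed toy quadratic f(w) = (w − 3)²; the hyperparameters are illustrative, not prescriptive:

```python
def momentum_descent(lr=0.1, beta=0.9, steps=200):
    # Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
    w, v = 0.0, 0.0
    for _ in range(steps):
        g = 2 * (w - 3)
        v = beta * v + lr * g  # velocity: decaying accumulation of gradients
        w -= v                 # step along the accumulated velocity
    return w
```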

NAG (Nesterov Accelerated Gradient)
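
NAG refines momentum by evaluating the gradient at the look-ahead point w − βv rather than at w, so the velocity is corrected before it overshoots. Another sketch on an assumed toy quadratic f(w) = (w − 3)², with illustrative hyperparameters:

```python
def nag_descent(lr=0.1, beta=0.9, steps=200):
    # Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
    w, v = 0.0, 0.0
    for _ in range(steps):
        g = 2 * ((w - beta * v) - 3)  # gradient at the look-ahead point w - beta*v
        v = beta * v + lr * g
        w -= v
    return w
```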

AdaGrad (Adaptive Gradient Descent)
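
AdaGrad divides each step by the root of all accumulated squared gradients, so frequently updated directions automatically slow down. A sketch on an assumed toy quadratic f(w) = (w − 3)², hyperparameters illustrative:

```python
import math

def adagrad_descent(lr=0.5, eps=1e-8, steps=500):
    # Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
    w, G = 0.0, 0.0
    for _ in range(steps):
        g = 2 * (w - 3)
        G += g * g                          # accumulate all squared gradients
        w -= lr * g / (math.sqrt(G) + eps)  # effective rate shrinks as G grows
    return w
```

Because G only ever grows, the effective learning rate decays monotonically — the weakness that Adadelta and RMSprop were designed to fix.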

Adadelta (Adaptive Delta Gradient Descent)
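
Adadelta removes AdaGrad's global learning rate by scaling each step with the ratio of two running RMS averages: past parameter updates over past gradients. A sketch on an assumed toy quadratic f(w) = (w − 3)²; ρ, ε, and the step count are illustrative, and the larger step count reflects Adadelta's slow start from zero-initialized accumulators:

```python
import math

def adadelta_descent(rho=0.9, eps=1e-6, steps=3000):
    # Minimize f(w) = (w - 3)^2 with no global learning rate at all.
    w = 0.0
    eg, ed = 0.0, 0.0  # running averages of squared gradients and squared updates
    for _ in range(steps):
        g = 2 * (w - 3)
        eg = rho * eg + (1 - rho) * g * g
        delta = -(math.sqrt(ed + eps) / math.sqrt(eg + eps)) * g
        ed = rho * ed + (1 - rho) * delta * delta
        w += delta
    return w
```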

RMSprop (Root Mean Square Propagation Gradient Descent)
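
RMSprop fixes AdaGrad's ever-shrinking step by replacing the full sum of squared gradients with an exponentially decaying average. A sketch on an assumed toy quadratic f(w) = (w − 3)², hyperparameters illustrative:

```python
import math

def rmsprop_descent(lr=0.01, rho=0.9, eps=1e-8, steps=800):
    # Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
    w, s = 0.0, 0.0
    for _ in range(steps):
        g = 2 * (w - 3)
        s = rho * s + (1 - rho) * g * g    # decaying average of squared gradients
        w -= lr * g / (math.sqrt(s) + eps)
    return w
```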

Adam (Adaptive Moment Estimation Gradient Descent)
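
Adam combines a momentum-style first-moment average with an RMSprop-style second-moment average, and bias-corrects both for their zero initialization. A sketch on an assumed toy quadratic f(w) = (w − 3)², hyperparameters illustrative:

```python
import math

def adam_descent(lr=0.05, b1=0.9, b2=0.999, eps=1e-8, steps=1000):
    # Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
    w, m, v = 0.0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = 2 * (w - 3)
        m = b1 * m + (1 - b1) * g        # first moment: momentum-style average
        v = b2 * v + (1 - b2) * g * g    # second moment: RMSprop-style average
        m_hat = m / (1 - b1 ** t)        # bias correction for zero initialization
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w
```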

Related Links
[1] https://drivingc.com/p/5bdfa51fd249870dca3afe22
[2] https://www.cnblogs.com/guoyaohua/p/8542554.html
[3] https://blog.csdn.net/wfei101/article/details/79938305
[4] https://www.cnblogs.com/tabatabaye/articles/1112475.html
[5] https://blog.csdn.net/u012328159/article/details/80311892
[6] http://www.atyun.com/2257.html
[7] https://blog.csdn.net/u011497262/article/details/88787905
[8] https://www.cnblogs.com/yifdu25/p/8183587.html
