Understanding Gradient Boosted Decision Trees (GBDT)

Gradient Boosted Decision Trees (GBDT)

The key idea is to use the negative gradient of the loss function, evaluated at the current model, as an approximation to the residual in the regression boosting-tree algorithm, and to fit a regression tree to these pseudo-residuals:

$$-\left[ \frac{\partial L(y_i, f(x_i))}{\partial f(x_i)} \right]$$
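The tables below are consistent with squared-error loss (an assumption; the original does not state the loss explicitly). In that case the negative gradient is exactly the ordinary residual:

$$L(y_i, f(x_i)) = \frac{1}{2}\bigl(y_i - f(x_i)\bigr)^2 \;\Rightarrow\; -\frac{\partial L(y_i, f(x_i))}{\partial f(x_i)} = y_i - f(x_i)$$

so each `res` column is simply the label minus the current prediction.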
A good blog post that explains GBDT

The worked example below helps build intuition. With a learning rate of 0.1, the update for the first sample is:

$$\text{learn rate} = 0.1 \Rightarrow 1.475 - 0.1 \times 0.375 = 1.4375$$

| index | id | age | weight | label | f_0 | res_1 | f_1 | res_2 | f_2 | res_3 | f_3 | res_4 | f_4 | res_5 | f_5 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 5 | 20 | 1.1 | 1.475 | -0.375 | 1.4375 | -0.3375 | 1.40375 | -0.30375 | 1.373375 | -0.273375 | 1.346037 | -0.246037 | 1.321434 |
| 1 | 2 | 7 | 30 | 1.3 | 1.475 | -0.175 | 1.4575 | -0.1575 | 1.44175 | -0.14175 | 1.427575 | -0.127575 | 1.414818 | -0.114818 | 1.403336 |
| 2 | 3 | 21 | 70 | 1.7 | 1.475 | 0.225 | 1.4975 | 0.2025 | 1.51775 | 0.18225 | 1.535975 | 0.164025 | 1.552377 | 0.147622 | 1.567140 |
| 3 | 4 | 30 | 60 | 1.8 | 1.475 | 0.325 | 1.5075 | 0.2925 | 1.53675 | 0.26325 | 1.563075 | 0.236925 | 1.586768 | 0.213232 | 1.608091 |

$$\text{learn rate} = 0.2 \Rightarrow 1.475 - 0.2 \times 0.375 = 1.4$$

| index | id | age | weight | label | f_0 | res_1 | f_1 | res_2 | f_2 | res_3 | f_3 | res_4 | f_4 | res_5 | f_5 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 5 | 20 | 1.1 | 1.475 | -0.375 | 1.40 | -0.30 | 1.340 | -0.240 | 1.2920 | -0.1920 | 1.25360 | -0.15360 | 1.222880 |
| 1 | 2 | 7 | 30 | 1.3 | 1.475 | -0.175 | 1.44 | -0.14 | 1.412 | -0.112 | 1.3896 | -0.0896 | 1.37168 | -0.07168 | 1.357344 |
| 2 | 3 | 21 | 70 | 1.7 | 1.475 | 0.225 | 1.52 | 0.18 | 1.556 | 0.144 | 1.5848 | 0.1152 | 1.60784 | 0.09216 | 1.626272 |
| 3 | 4 | 30 | 60 | 1.8 | 1.475 | 0.325 | 1.54 | 0.26 | 1.592 | 0.208 | 1.6336 | 0.1664 | 1.66688 | 0.13312 | 1.693504 |
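The tables above can be reproduced with a short script. Note that in this toy example each sample effectively gets its own leaf, so the fitted "tree" in every round returns the residual itself; this is a sketch of the update rule only, not a full GBDT with tree learning:

```python
def boost(labels, lr, rounds=5):
    """Run `rounds` gradient-boosting updates with squared-error loss.

    f_0 is the mean label; each round the model moves toward the
    residual (the negative gradient) scaled by the learning rate.
    """
    f = [sum(labels) / len(labels)] * len(labels)  # f_0 = 1.475 for this data
    for _ in range(rounds):
        res = [y - fi for y, fi in zip(labels, f)]    # res_k = label - f_{k-1}
        f = [fi + lr * ri for fi, ri in zip(f, res)]  # f_k = f_{k-1} + lr * res_k
    return f

labels = [1.1, 1.3, 1.7, 1.8]
print([round(v, 6) for v in boost(labels, 0.1)])  # f_5 column of the first table
print([round(v, 6) for v in boost(labels, 0.2)])  # f_5 column of the second table
```

Because each prediction decays geometrically toward its label, sample 0 under learning rate 0.1 satisfies $1.1 + 0.375 \times 0.9^5 = 1.321434$, matching the `f_5` column; the larger learning rate 0.2 converges faster ($1.1 + 0.375 \times 0.8^5 = 1.222880$).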