A Comparison of Bagging, Boosting, and Randomization


Original paper: An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization (Thomas G. Dietterich, Machine Learning, 2000)


This paper compares the effectiveness of randomization, bagging, and boosting for improving the performance of the decision-tree algorithm C4.5. The experiments show that in situations with little or no classification noise, randomization is competitive with (and perhaps slightly superior to) bagging but not as accurate as boosting. In situations with substantial classification noise, bagging is much better than boosting, and sometimes better than randomization.

With little or no classification noise: randomization is competitive with (and perhaps slightly better than) bagging, but not as accurate as boosting.

With substantial classification noise: bagging is much better than boosting, and sometimes better than randomization.

Since this is the paper that proposed randomization, it naturally has good things to say about randomization...
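As a minimal stdlib-only sketch (not the paper's actual C4.5 modifications; all function names here are mine), the three methods differ mainly in where they inject randomness into tree construction: bagging resamples the training set, boosting reweights it, and Dietterich's randomization chooses uniformly among the 20 best candidate tests at each internal node:

```python
import math
import random

random.seed(0)

def bootstrap_sample(data):
    # Bagging: draw |data| examples with replacement; on average about
    # 63.2% of the originals appear at least once, the rest are left out.
    return [random.choice(data) for _ in data]

def reweight(weights, correct, alpha):
    # Boosting by weighting: misclassified examples grow by e^alpha,
    # correctly classified ones shrink by e^-alpha, then renormalize,
    # so the next classifier focuses on the hard cases.
    w = [wi * math.exp(-alpha if ok else alpha)
         for wi, ok in zip(weights, correct)]
    total = sum(w)
    return [wi / total for wi in w]

def randomized_split(candidate_splits, k=20):
    # Randomization: instead of always taking the single best test at a
    # node, choose uniformly among the k (= 20 in the paper) best ones.
    # candidate_splits is a list of (test, gain) pairs.
    best = sorted(candidate_splits, key=lambda s: s[1], reverse=True)[:k]
    return random.choice(best)
```

Each tree in the ensemble then sees either a different bootstrap replicate, a different weighting of the data, or different random split choices, which is what makes the ensemble members disagree.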



The goal of ensemble learning methods is to construct a collection (an ensemble) of individual classifiers that are "diverse" and yet "accurate".
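To see why both properties matter: if the members' errors were independent (a strong idealization), a majority vote would be wrong far less often than any single member. A quick stdlib calculation:

```python
from math import comb

def majority_error(p, n):
    # Probability that a majority vote of n independent classifiers,
    # each wrong with probability p, is wrong (n odd): the vote fails
    # exactly when more than half of the members err.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Five classifiers, each only 70% accurate but erring independently,
# vote their way to ~83.7% accuracy.
print(round(majority_error(0.3, 5), 5))  # → 0.16308
```

If the members all made the same mistakes (no diversity), the vote would be no better than a single classifier, which is why accuracy alone is not enough.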





AdaBoost.M1 (boosting by weighting). The .M1 suffix does not stand for "method 1": in Freund and Schapire's papers, AdaBoost.M1 and AdaBoost.M2 are the two multiclass extensions of AdaBoost, and there is no AdaBoost.M0. "Boosting by weighting" versus "boosting by resampling" is a separate distinction: the former passes the example weights directly to C4.5, while the latter draws a fresh training sample according to those weights.
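A toy sketch of AdaBoost.M1 by weighting, using 1-D decision stumps instead of C4.5 as the weak learner (names and data are mine, and the update is written in the equivalent exponential form rather than the original beta = err/(1-err) form):

```python
import math

def fit_adaboost(xs, ys, rounds=3):
    # AdaBoost.M1 by weighting: each round fits the weak learner (here a
    # 1-D stump: predict p if x > t else -p) against the current example
    # weights, then up-weights the examples it misclassified.
    n = len(xs)
    thresholds = [x - 0.5 for x in sorted(set(xs))] + [max(xs) + 0.5]
    w = [1.0 / n] * n
    ensemble = []  # (alpha, threshold, polarity) triples
    for _ in range(rounds):
        best = None
        for t in thresholds:
            for p in (+1, -1):
                # Weighted error of this stump (rounded to break float ties
                # deterministically in favor of the first stump found).
                err = round(sum(wi for xi, yi, wi in zip(xs, ys, w)
                                if (p if xi > t else -p) != yi), 12)
                if best is None or err < best[0]:
                    best = (err, t, p)
        err, t, p = best
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))
        # Boosting by weighting: wrong examples grow by e^alpha, correct
        # ones shrink by e^-alpha, then normalize back to a distribution.
        w = [wi * math.exp(-alpha * yi * (p if xi > t else -p))
             for xi, yi, wi in zip(xs, ys, w)]
        total = sum(w)
        w = [wi / total for wi in w]
        ensemble.append((alpha, t, p))
    return ensemble

def predict(ensemble, x):
    # Weighted vote of all stumps.
    score = sum(a * (p if x > t else -p) for a, t, p in ensemble)
    return 1 if score >= 0 else -1

# An interval concept no single stump can represent; three boosted
# stumps classify it perfectly.
xs = [0, 1, 2, 3, 4, 5]
ys = [-1, -1, 1, 1, -1, -1]
model = fit_adaboost(xs, ys)
print([predict(model, x) for x in xs])  # → [-1, -1, 1, 1, -1, -1]
```

The point of the example is that boosting turns a weak learner that can only draw one threshold into an ensemble that represents an interval, precisely by concentrating weight on whichever examples the previous stumps got wrong.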
