Lecture Notes and Source Code for AdaBoost

 

AdaBoost really is both simple and effective, which is why it is often regarded as one of the best ensemble methods.

There are many lecture notes available; I have read two of them:

AdaBoost – Lecturer: Jan Šochman

A Short Introduction to Boosting

Both are very well written.

There is also a Boosting Demo that walks through the ensemble-building process in detail, with a variety of base classifiers.

 

This demo gives a clear visual presentation of what happens during the AdaBoost algorithm. It shows how the decision boundary, example weights, training error and base learner weights change during training.

A selection of base learning algorithms is included: Linear Regression, Naive Bayes, Decision Stump, CART (requires the stats toolbox), Neural Network (requires netlab) and SVM (requires libsvm). There are also 3 dataset generators (2-Gaussians, circle and rotated checkerboard). There is documentation to assist with adding custom base learner algorithms or dataset generators.
The demo allows the choice of base learner and dataset. It is then possible to add one base learner at a time, according to the AdaBoost algorithm.

After any number of base learners, the decision boundary and margins are shown on the plot. It is also possible to view two graphs: error rates (showing how AdaBoost affects training and generalisation errors as more base learners are added), and margin distributions (showing the cumulative distribution of margins for the current ensemble).
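The add-one-learner-at-a-time loop that the demo visualizes can be sketched in code. Below is a minimal discrete AdaBoost with decision stumps, written in Python rather than the demo's Matlab; the dataset, thresholds and function names are all illustrative, not taken from the demo itself:

```python
import math

def stump_predict(threshold, sign, x):
    # Decision stump on a single feature: +/-1 depending on the threshold side.
    return sign if x > threshold else -sign

def train_adaboost(X, y, thresholds, n_rounds):
    """Discrete AdaBoost: after each round, up-weight the examples the
    chosen stump misclassified so the next stump focuses on them."""
    n = len(X)
    w = [1.0 / n] * n                       # start with uniform example weights
    ensemble = []                           # list of (alpha, threshold, sign)
    for _ in range(n_rounds):
        # Pick the stump with the lowest weighted training error.
        best = None
        for t in thresholds:
            for s in (+1, -1):
                err = sum(wi for wi, xi, yi in zip(w, X, y)
                          if stump_predict(t, s, xi) != yi)
                if best is None or err < best[0]:
                    best = (err, t, s)
        err, t, s = best
        err = max(err, 1e-10)               # guard against a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)   # base-learner weight
        ensemble.append((alpha, t, s))
        # Multiplicative reweighting, then renormalize to a distribution.
        w = [wi * math.exp(-alpha * yi * stump_predict(t, s, xi))
             for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(t, s, x) for a, t, s in ensemble)
    return 1 if score >= 0 else -1

# Tiny 1-D toy set that no single stump classifies perfectly;
# three boosting rounds are enough for zero training error here.
X = [1, 2, 3, 4, 5, 6]
y = [+1, -1, +1, -1, -1, -1]
model = train_adaboost(X, y, thresholds=[1.5, 2.5, 3.5, 4.5, 5.5], n_rounds=3)
print([predict(model, xi) for xi in X])    # matches y on every example
```

The quantities the demo plots map directly onto this loop: `w` is the example weights, `alpha` is the base learner weight shown on each learner's scroll bar, and the weighted score in `predict` determines the decision boundary and margins.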

Base learners appear in a list at the left of the window. Each entry includes a checkbox which disables/enables that learner, and a scroll bar that adjusts its weight. This makes it possible to see the consequences of changing the weights assigned by AdaBoost.

The Reset button enables all the base learners and sets their weights according to AdaBoost. The checkboxes can be right-clicked to disable all other learners and view the impact of only the selected base learner.

----

GML AdaBoost Matlab Toolbox

This project is devoted to creating an easy and convenient Matlab-based toolbox for investigating AdaBoost-based machine learning algorithms.

Download

Download GML AdaBoost Matlab Toolbox 0.3

Download GML AdaBoost Matlab Toolbox 0.2

GML AdaBoost Matlab Toolbox is a set of Matlab functions and classes implementing a family of classification algorithms known as boosting.

Implemented algorithms

So far we have implemented 3 different boosting schemes: Real AdaBoost, Gentle AdaBoost and Modest AdaBoost.

  • Real AdaBoost (see [2] for a full description) is the generalization of the basic AdaBoost algorithm first introduced by Freund and Schapire [1]. Real AdaBoost should be treated as the basic "hardcore" boosting algorithm.
  • Gentle AdaBoost is a more robust and stable version of Real AdaBoost (see [3] for a full description). So far, it has been the most practically efficient boosting algorithm, used, for example, in the Viola-Jones object detector [4]. Our experiments show that Gentle AdaBoost performs slightly better than Real AdaBoost on regular data, is considerably better on noisy data, and is much more resistant to outliers.
  • Modest AdaBoost (see [5] for a full description) is a regularized variant of AdaBoost, aimed mainly at better generalization capability and resistance to overfitting. Our experiments show that, in terms of test error and overfitting, this algorithm outperforms both Real and Gentle AdaBoost.
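The stability of Gentle AdaBoost is easiest to see in code. The sketch below (Python, with illustrative names, not the toolbox's API) fits each round's base learner as a regression stump by weighted least squares, as Gentle AdaBoost prescribes: each side of the stump outputs a weighted mean of the labels, so the per-round contribution stays bounded in [-1, 1], whereas Real AdaBoost's half log-odds output can grow without bound on hard examples and outliers:

```python
import math

def gentle_stump(X, y, w, thresholds):
    """Fit one Gentle AdaBoost step: a regression stump chosen to minimize
    weighted squared error. Each side predicts the weighted mean of y, which
    lies in [-1, 1] -- unlike Real AdaBoost's unbounded half log-odds."""
    def wmean(side):
        sw = sum(wi for wi, _ in side)
        return sum(wi * yi for wi, yi in side) / sw if sw > 0 else 0.0

    best = None
    for t in thresholds:
        left = [(wi, yi) for wi, xi, yi in zip(w, X, y) if xi <= t]
        right = [(wi, yi) for wi, xi, yi in zip(w, X, y) if xi > t]
        cl, cr = wmean(left), wmean(right)
        sse = sum(wi * (yi - (cl if xi <= t else cr)) ** 2
                  for wi, xi, yi in zip(w, X, y))
        if best is None or sse < best[0]:
            best = (sse, t, cl, cr)
    _, t, cl, cr = best
    return t, cl, cr

def train_gentle(X, y, thresholds, n_rounds):
    n = len(X)
    w = [1.0 / n] * n
    F = []                                   # list of (threshold, c_left, c_right)
    for _ in range(n_rounds):
        t, cl, cr = gentle_stump(X, y, w, thresholds)
        F.append((t, cl, cr))
        # Same multiplicative reweighting as Real AdaBoost, but f(x) is a
        # bounded real value, so no single example's weight can explode.
        w = [wi * math.exp(-yi * (cl if xi <= t else cr))
             for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return F

def predict_gentle(F, x):
    return 1 if sum(cl if x <= t else cr for t, cl, cr in F) >= 0 else -1

# Illustrative toy data, not from the toolbox.
X = [1, 2, 3, 4, 5, 6]
y = [+1, +1, +1, -1, -1, -1]
F = train_gentle(X, y, thresholds=[1.5, 2.5, 3.5, 4.5, 5.5], n_rounds=3)
print([predict_gentle(F, xi) for xi in X])
```

The bounded per-round output is the whole difference: swapping `gentle_stump` for a stump that returns `alpha * sign(...)` with `alpha = 0.5 * log((1 - err) / err)` recovers the Real AdaBoost behaviour described above.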
Available weak learners

We have implemented a classification tree as a weak learner.

Of course, there is plenty of other code available as well; because the algorithm is so simple, many implementations are practical and easy to use.
