Classic AdaBoost Classifier
This is a classic AdaBoost implementation in one single file, with easy-to-understand code.
The function consists of two parts: a simple weak classifier and a boosting part.
The weak classifier tries to find the best threshold in one of the data dimensions to separate the data into the two classes, -1 and 1.
The boosting part calls the weak classifier iteratively; after every classification step it increases the weights of the misclassified examples. This creates a cascade of "weak classifiers" which behaves like a "strong classifier".
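The two parts above can be sketched as follows. This is a minimal NumPy re-implementation written for illustration, not the original MATLAB code: `train_stump` is the threshold-searching weak classifier, and `adaboost_train` / `adaboost_apply` mirror the train/apply modes of the function (all names here are assumptions of this sketch).

```python
import numpy as np

def train_stump(X, y, w):
    """Weak classifier: find the (dimension, threshold, polarity)
    decision stump with the lowest weighted error."""
    n, d = X.shape
    best = (np.inf, 0, 0.0, 1)                 # (error, dim, threshold, polarity)
    for dim in range(d):
        for thr in np.unique(X[:, dim]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, dim] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, dim, thr, pol)
    return best

def stump_predict(X, dim, thr, pol):
    return np.where(pol * (X[:, dim] - thr) >= 0, 1, -1)

def adaboost_train(X, y, itt):
    """Boosting part: returns the model as a list of weighted stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # start with uniform sample weights
    model = []
    for _ in range(itt):
        err, dim, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-10)                  # guard against log(0) / division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak classifier
        pred = stump_predict(X, dim, thr, pol)
        w *= np.exp(-alpha * y * pred)         # raise weights of misclassified examples
        w /= w.sum()                           # renormalize to a distribution
        model.append((alpha, dim, thr, pol))
    return model

def adaboost_apply(X, model):
    """Strong classifier: sign of the weighted vote of all weak classifiers."""
    score = sum(a * stump_predict(X, d, t, p) for a, d, t, p in model)
    return np.where(score >= 0, 1, -1)
```

For example, on two well-separated clusters:

```python
X = np.array([[1., 1.], [2., 2.], [8., 8.], [9., 9.]])
y = np.array([-1, -1, 1, 1])
model = adaboost_train(X, y, 5)
adaboost_apply(X, model)       # recovers the labels -1, -1, 1, 1
```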
Training mode:
[estimateclass,model]=adaboost('train',datafeatures,dataclass,itt)
Apply mode:
estimateclass=adaboost('apply',datafeatures,model)
Inputs/outputs:
datafeatures : An Array with size number_samples x number_features
dataclass : An array with the class of each example; the class can be -1 or 1
itt : The number of training iterations
model : A struct with the cascade of weak-classifiers
estimateclass : The class labels estimated by the AdaBoost model
Please leave a comment if you like the code, find a bug, or have a suggestion.