Bagging (Bootstrap Aggregation)
Decision stump classification is one of the simplest classification algorithms: a stump is a single if-then rule that thresholds one feature. An individual stump is a weak classifier, but by aggregating many stumps we can build a powerful one.

The bagging procedure is simple, though not trivial:

- For $j = 1, \dots, b$:
  1. Draw $m$ samples with replacement from the training set $\{(x_i, y_i)\}_{i=1}^{n}$ of $n$ samples, obtaining a bootstrap sample.
  2. Train a classifier $\psi_j$ on the bootstrap sample.
- Average all $b$ classifiers $\{\psi_j\}_{j=1}^{b}$ to obtain the final classifier $f$:

$$f(x) \leftarrow \frac{1}{b} \sum_{j=1}^{b} \psi_j(x)$$
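The procedure above can be sketched in Python. This is a minimal illustration, not the original demo code; `train_stump` and `bag` are hypothetical helper names, and the base learner is assumed to be a decision stump (a single if-then rule on one feature):

```python
import random

def train_stump(samples):
    """Fit a depth-1 'if-then' rule: choose the (feature, threshold,
    orientation) triple with the fewest training errors."""
    best, best_err = None, float("inf")
    n_features = len(samples[0][0])
    for d in range(n_features):
        for xc, _ in samples:          # candidate thresholds: observed values
            for s in (-1, 1):          # orientation of the rule
                err = sum(1 for xpt, ypt in samples
                          if (s if xpt[d] > xc[d] else -s) != ypt)
                if err < best_err:
                    best_err, best = err, (d, xc[d], s)
    d, c, s = best
    return lambda xpt: s if xpt[d] > c else -s

def bag(samples, b, m, seed=0):
    """Train b stumps on bootstrap samples of size m; return their average."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(b):
        boot = [rng.choice(samples) for _ in range(m)]  # draw with replacement
        stumps.append(train_stump(boot))
    # f(x) = (1/b) * sum_j psi_j(x)
    return lambda xpt: sum(psi(xpt) for psi in stumps) / b
```

`bag` returns a real-valued average in $[-1, 1]$; the predicted label is its sign.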
The following MATLAB/Octave snippet sets up the demo data and the evaluation grid:

```matlab
% Data: n points in 2-D; labels y in {-1,+1} given by the sign of x1 - x2.
n=50; x=randn(n,2);
y=2*(x(:,1)>x(:,2))-1;
b=5000;                 % number of bagging rounds
a=50; Y=zeros(a,a);     % accumulator for the averaged decision values
X0=linspace(-3,3,a);    % one axis of the evaluation grid
[X(:,:,1),X(:,:,2)]=meshgrid(X0);   % full 2-D evaluation grid
```
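The snippet above only covers the setup; the bootstrap loop that fits one stump per round and accumulates the averaged decision values over the grid is truncated in the source. Below is a self-contained NumPy sketch of how such a loop can look, under the assumption that each round picks a random feature and finds the best split by the cumulative-sum trick; it mirrors the MATLAB variable names but is not the original code:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.standard_normal((n, 2))
y = np.where(x[:, 0] > x[:, 1], 1, -1)   # labels in {-1, +1}

b = 500                                  # fewer rounds than the MATLAB demo, for speed
a = 50
X0 = np.linspace(-3, 3, a)
G1, G2 = np.meshgrid(X0, X0)             # 2-D evaluation grid
Y = np.zeros((a, a))                     # averaged decision values

for _ in range(b):
    d = rng.integers(2)                  # pick a feature at random
    r = rng.integers(n, size=n)          # bootstrap indices (with replacement)
    xb, yb = x[r, d], y[r]
    order = np.argsort(xb)
    xs, ys = xb[order], yb[order]
    # For split i, e[i] = (sum of labels above the split) - (sum below);
    # the stump s * sign(x_d - c) with the largest |e| fits the sample best.
    el = np.cumsum(ys)[:-1]
    eu = np.cumsum(ys[::-1])[-2::-1]
    e = eu - el
    i = np.argmax(np.abs(e))
    c = (xs[i] + xs[i + 1]) / 2          # threshold between the two sorted points
    s = np.sign(e[i])
    grid = G1 if d == 0 else G2
    Y += s * np.sign(grid - c) / b       # accumulate the stump's vote
```

After the loop, `np.sign(Y)` gives the bagged decision over the grid, which the MATLAB demo visualizes with `contourf`.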