Wiki ref: http://en.wikipedia.org/wiki/Bootstrap_aggregating
Bootstrap aggregating (bagging) is a meta-algorithm that improves the stability and accuracy of classification and regression models. It also reduces variance and helps to avoid overfitting. Although the method is usually applied to decision tree models, it can be used with any type of model; bagging is a special case of the model-averaging approach.
Given a standard training set D of size N, we generate L new training sets Di, each of size N' (N' ≤ N), by sampling examples from D uniformly and with replacement. Because the sampling is with replacement, some examples will almost certainly be repeated within each Di. If N' = N, then for large N each Di is expected to contain about 63.2% of the distinct examples of D, the rest being duplicates: the probability that any particular example is never drawn in N draws is (1 - 1/N)^N ≈ e^(-1) ≈ 0.368. A sample of this kind is known as a bootstrap sample. The L models are fitted on these L bootstrap samples and combined by averaging their outputs (for regression) or by voting (for classification). One particularly interesting point about bagging is that, because it merely averages several predictors, it is not useful for improving linear models: the average of linear predictors is itself a linear predictor, so the ensemble never leaves the original model class.
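A minimal sketch of this procedure in Python, assuming numpy and scikit-learn are available; the toy dataset, the choice of L = 25, and the decision-tree base learner are illustrative assumptions, not part of the algorithm itself:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Toy data standing in for the training set D of size N.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

L = 25          # number of bootstrap samples / models
N = len(X)      # size of D; here we take N' = N
rng = np.random.default_rng(0)

models = []
for _ in range(L):
    # Draw N indices uniformly from D, with replacement: a bootstrap sample Di.
    idx = rng.integers(0, N, size=N)
    models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# Classification: combine the L predictions by majority vote.
all_preds = np.stack([m.predict(X) for m in models])   # shape (L, N)
bagged_pred = np.apply_along_axis(
    lambda votes: np.bincount(votes).argmax(), axis=0, arr=all_preds
)
print("Training accuracy of the bagged ensemble:", (bagged_pred == y).mean())
```

Each tree is fitted on a different bootstrap sample, so their individual errors are partly decorrelated, and the majority vote has lower variance than any single tree; for regression the vote would simply be replaced by the mean of the L predictions.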