In Chapter 8 ("Ensemble Learning") of the watermelon book (Zhou Zhihua's *Machine Learning*), two statements appear:
- Boosting mainly focuses on reducing bias
- Bagging mainly focuses on reducing variance
What exactly do these two statements mean?
First, Wikipedia gives the following passage:
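Before turning to the formal definitions, the bias-variance distinction can be sketched numerically. The snippet below is a minimal illustration (not from the book): it repeatedly refits two estimators, an underfitting linear model and a flexible degree-9 polynomial, on freshly sampled training sets, and estimates each one's squared bias and variance at a single test point. The target function `true_f`, the noise level, and the polynomial degrees are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # The (assumed) ground-truth function we are trying to learn.
    return np.sin(x)

x_test = 1.0          # fixed test point at which bias/variance are measured
n_trials, n_train = 200, 30

def bias_variance_at_point(degree):
    """Estimate squared bias and variance of a polynomial fit of the
    given degree at x_test, by refitting on n_trials resampled sets."""
    preds = []
    for _ in range(n_trials):
        x = rng.uniform(0, 2 * np.pi, n_train)
        y = true_f(x) + rng.normal(0, 0.3, n_train)  # noisy training labels
        coefs = np.polyfit(x, y, degree)
        preds.append(np.polyval(coefs, x_test))
    preds = np.array(preds)
    bias_sq = (preds.mean() - true_f(x_test)) ** 2   # (E[f_hat] - f)^2
    variance = preds.var()                           # E[(f_hat - E[f_hat])^2]
    return bias_sq, variance

for degree in (1, 9):
    b2, v = bias_variance_at_point(degree)
    print(f"degree {degree}: bias^2 = {b2:.4f}, variance = {v:.4f}")
```

The rigid degree-1 model shows larger squared bias, while the flexible degree-9 model shows larger variance; this is exactly the tradeoff that Bagging (variance reduction by averaging) and Boosting (bias reduction by sequential fitting) each attack from one side.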
In statistics and machine learning, the bias–variance tradeoff (or dilemma) is the problem of simultaneously minimizing two sources of error that prevent supervised learning algorithms from generalizing beyond their training set: