Regularization Algorithms
Understanding the use of Regularization algorithms like LASSO, Ridge, and Elastic-Net regression.
Pre-requisite
Before jumping into this article, make sure you know the maths behind the Linear Regression algorithm. If you don't, follow that article first!
Table of Contents
- What is Regularization?
- What are the different Regularization algorithms?
- How LASSO, Ridge, and Elastic-Net Regression work
- What does Regularization achieve?
What is Regularization?
Regularization is a technique used in regression to reduce the complexity of the model and to shrink the coefficients of the independent features.
“Everything should be made as simple as possible, but no simpler.” -Albert Einstein
In simple words, this technique converts a complex model into a simpler one: it shrinks the coefficients, which reduces the risk of overfitting and lowers the computational cost.
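The shrinkage effect can be seen in a minimal NumPy sketch of ridge regression, one of the algorithms covered below. The data and the penalty strength `alpha` here are illustrative, not from the article; the point is only that a larger penalty pulls the coefficient vector closer to zero than plain least squares.

```python
import numpy as np

# Illustrative synthetic regression data (not from the article)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = np.array([3.0, -2.0, 0.5, 0.0, 0.0])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def ridge_fit(X, y, alpha):
    """Closed-form ridge solution: w = (X^T X + alpha * I)^(-1) X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

w_ols = ridge_fit(X, y, alpha=0.0)     # alpha = 0 recovers ordinary least squares
w_ridge = ridge_fit(X, y, alpha=50.0)  # a heavy (illustrative) penalty

# The penalty shrinks the coefficients toward zero
print("OLS norm:  ", np.linalg.norm(w_ols))
print("Ridge norm:", np.linalg.norm(w_ridge))
```

Running this shows the ridge coefficient norm is strictly smaller than the unpenalized one, which is exactly the "shrinking the coefficients" the paragraph above describes. LASSO and Elastic-Net have no closed form like this, but produce the same qualitative effect.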