1 Ridge Regression (also known as Tikhonov regularization)
2 Lasso Regression
3 Elastic Net
4 Early Stopping [overview only]
Early Stopping is another way to regularize iterative learning.
The idea: stop training as soon as the validation error reaches its minimum.
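scikit-learn's SGDRegressor supports this directly via its early_stopping parameter, which holds out a validation split and stops when the validation score stops improving. A minimal sketch (the dataset here is synthetic, generated with make_regression, just for illustration):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import SGDRegressor

# synthetic regression data, purely for demonstration
X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

# early_stopping=True holds out validation_fraction of the training data and
# stops once the validation score fails to improve for n_iter_no_change epochs
est = SGDRegressor(max_iter=1000, early_stopping=True,
                   validation_fraction=0.1, n_iter_no_change=5,
                   random_state=0)
est.fit(X, y)
print("epochs actually run:", est.n_iter_)  # usually far fewer than max_iter
```

Because training halts early, n_iter_ is typically much smaller than max_iter.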
5.线性回归的改进-岭回归
5.1 API
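The relevant estimator is sklearn.linear_model.Ridge. The key parameter is alpha, the regularization strength; a quick sketch of its constructor:

```python
from sklearn.linear_model import Ridge

# alpha: regularization strength (larger alpha = stronger penalty)
# solver="auto" lets scikit-learn pick a solver based on the data
model = Ridge(alpha=1.0, fit_intercept=True, solver="auto")
```

RidgeCV, used later in this section, wraps Ridge with built-in cross-validation over a set of candidate alphas.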
5.2 How does the regularization strength affect the result?
- The stronger the regularization, the smaller the weight coefficients
- The weaker the regularization, the larger the weight coefficients
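This shrinkage effect is easy to verify: fit Ridge with increasing alpha and watch the norm of the coefficient vector fall. A small sketch on synthetic data (make_regression is used here only for illustration):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

X, y = make_regression(n_samples=100, n_features=5, noise=5.0, random_state=0)

norms = []
for alpha in (0.1, 10.0, 1000.0):
    r = Ridge(alpha=alpha).fit(X, y)
    # L2 norm of the learned weights for this regularization strength
    norms.append(np.linalg.norm(r.coef_))

# stronger regularization -> smaller coefficient norm
print(norms)
```

The norms decrease monotonically as alpha grows, matching the rule above.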
5.3 Boston Housing Price Prediction
# note: load_boston was removed in scikit-learn 1.2; this example requires an older version
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression, SGDRegressor, RidgeCV, Ridge
from sklearn.metrics import mean_squared_error
import warnings
warnings.filterwarnings("ignore")

def linear_model3():
    """
    Linear regression: ridge regression
    :return:
    """
    # 1. Load the data
    boston = load_boston()
    # 2. Basic data handling
    # 2.1 Split the data
    x_train, x_test, y_train, y_test = train_test_split(
        boston.data, boston.target, random_state=22, test_size=0.2)
    # 3. Feature engineering: standardization
    transfer = StandardScaler()
    x_train = transfer.fit_transform(x_train)
    # reuse the mean/std fitted on the training set; do not refit on the test set
    x_test = transfer.transform(x_test)
    # 4. Machine learning: ridge regression
    # 4.1 Train the model
    # estimator = SGDRegressor(max_iter=1000, learning_rate="constant", eta0=0.001)
    # estimator = Ridge(alpha=1.0)
    estimator = RidgeCV(alphas=(0.001, 0.01, 0.1, 1, 10, 100))
    estimator.fit(x_train, y_train)
    print("Model intercept:\n", estimator.intercept_)
    print("Model coefficients:\n", estimator.coef_)
    # 5. Model evaluation
    # 5.1 Predictions
    y_pre = estimator.predict(x_test)
    # print("Predictions:\n", y_pre)
    # 5.2 Mean squared error
    ret = mean_squared_error(y_test, y_pre)
    print("Mean squared error:\n", ret)

if __name__ == '__main__':
    linear_model3()
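Since load_boston was removed in scikit-learn 1.2 (for ethical concerns about the dataset), the same pipeline can be run unchanged on a dataset that still ships with scikit-learn. A sketch using the bundled diabetes dataset as a stand-in:

```python
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import RidgeCV
from sklearn.metrics import mean_squared_error

# bundled regression dataset, used here in place of the removed load_boston
data = load_diabetes()
x_train, x_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=22, test_size=0.2)

transfer = StandardScaler()
x_train = transfer.fit_transform(x_train)
x_test = transfer.transform(x_test)  # reuse training-set statistics

estimator = RidgeCV(alphas=(0.001, 0.01, 0.1, 1, 10, 100))
estimator.fit(x_train, y_train)

# alpha_ holds the regularization strength chosen by cross-validation
print("chosen alpha:", estimator.alpha_)
mse = mean_squared_error(y_test, estimator.predict(x_test))
print("MSE:", mse)
```

RidgeCV exposes the cross-validated choice through its alpha_ attribute, so there is no need to loop over candidate alphas by hand.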