Python Machine Learning: Linear Regression 009 - Solving Linear Regression Problems with scikit-learn

import numpy as np
from sklearn import datasets

boston = datasets.load_boston()  # Boston housing dataset
X = boston.data
y = boston.target

# drop the samples whose price is capped at 50
X = X[y < 50]
y = y[y < 50]

# from sklearn.model_selection import train_test_split
from Simple_linear_Regression.model_selection import train_test_split  # the split implemented earlier in this series
X_train, X_test, y_train, y_test = train_test_split(X, y, seed=666)
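
Note: load_boston was deprecated in scikit-learn 1.0 and removed in 1.2, and the train_test_split imported above is the author's own implementation from earlier in this series (hence the seed keyword). Below is a minimal sketch of the same preparation with current scikit-learn, loading the Boston data from the original CMU source (the recipe given in the deprecation notice) and using the built-in split; the test_size=0.2 ratio is an assumption.

import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

# Boston housing data from the original source (load_boston is gone in scikit-learn >= 1.2)
data_url = "http://lib.stat.cmu.edu/datasets/boston"
raw_df = pd.read_csv(data_url, sep=r"\s+", skiprows=22, header=None)
X = np.hstack([raw_df.values[::2, :], raw_df.values[1::2, :2]])
y = raw_df.values[1::2, 2]

X = X[y < 50]  # drop the capped samples, as above
y = y[y < 50]

# the built-in split takes random_state instead of seed; test_size=0.2 is an assumption
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=666)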

Linear Regression in scikit-learn

from sklearn.linear_model import LinearRegression
lin_reg = LinearRegression()
lin_reg.fit(X_train, y_train)

Coefficients and intercept

print(lin_reg.coef_)
print(lin_reg.intercept_)
[-1.20354261e-01  3.64423279e-02 -3.61493155e-02  5.12978140e-02
 -1.15775825e+01  3.42740062e+00 -2.32311760e-02 -1.19487594e+00
  2.60101728e-01 -1.40219119e-02 -8.35430488e-01  7.80472852e-03
 -3.80923751e-01]
34.11739972322946
lin_reg.score(X_test, y_test)  # R^2 on the test set
0.8129794056212813
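
For regression, score returns the R² (coefficient of determination) on the given data. A minimal sketch of computing the same value by hand from the predictions, using the lin_reg, X_test and y_test defined above:

from sklearn.metrics import mean_squared_error, r2_score

y_predict = lin_reg.predict(X_test)           # predictions on the test set
print(mean_squared_error(y_test, y_predict))  # mean squared error
print(r2_score(y_test, y_predict))            # same value as lin_reg.score(X_test, y_test)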

KNN Regressor

from sklearn.neighbors import KNeighborsRegressor
knn_reg = KNeighborsRegressor()     # create the estimator
knn_reg.fit(X_train, y_train)       # fit
knn_reg.score(X_test, y_test)       # score with the default hyperparameters

# hyperparameter grid search
from sklearn.model_selection import GridSearchCV
param_grid = [
    {'weights': ['uniform'],
     'n_neighbors': [i for i in range(1, 11)]},
    {'weights': ['distance'],
     'n_neighbors': [i for i in range(1, 11)],
     'p': [i for i in range(1, 6)]}
]
knn_reg = KNeighborsRegressor()
grid_search = GridSearchCV(knn_reg, param_grid)
grid_search.fit(X_train, y_train)
print(grid_search.best_params_)
print(grid_search.best_score_)
{'n_neighbors': 7, 'p': 1, 'weights': 'distance'}
0.652216494152461
# R^2 of the best model on the test set
print(grid_search.best_estimator_.score(X_test, y_test))
0.7160666820548707
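
Note that best_score_ (about 0.652) is the mean cross-validation R² on the training folds, so it is not directly comparable to the test-set scores. A minimal sketch comparing the two fitted models on the same hold-out test set, reusing the lin_reg and grid_search objects from above:

# R^2 on the same hold-out test set, higher is better
print("LinearRegression R^2:", lin_reg.score(X_test, y_test))               # ~0.813
print("Tuned KNN R^2:", grid_search.best_estimator_.score(X_test, y_test))  # ~0.716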