Machine Learning - Supervised Learning - Linear Regression - sklearn

import numpy as np
import matplotlib.pyplot as plt
# Import the linear regression estimator
from sklearn.linear_model import LinearRegression

# Loss function: sum of squared residuals (y - w * x - b) ** 2
def cost(w, b, points):
    sum_cost = 0
    M = len(points)
    for i in range(M):
        x = points[i, 0]
        y = points[i, 1]
        sum_cost += (y - w * x - b) ** 2
    return sum_cost
if __name__ == '__main__':
    """
    Ordinary least squares linear regression.
    LinearRegression fits a linear model with coefficients w = (w1, ..., wp)
    to minimize the residual sum of squares between the observed targets in
    the dataset and the targets predicted by the linear approximation.
    """
    lr = LinearRegression()
    # Read the data (use a raw string so the backslashes in the Windows
    # path are not interpreted as escape sequences)
    points = np.genfromtxt(r"D:\projects\PythonProjects\PythonStudy\data.csv", delimiter=",")
    x = points[:, 0]
    y = points[:, 1]
    plt.scatter(x, y)
    # Reshape x and y from 1-D arrays of length n into (n, 1) column
    # vectors, since sklearn expects a 2-D feature matrix
    new_x = x.reshape(-1, 1)
    new_y = y.reshape(-1, 1)
    # Fit the model on the training data
    lr.fit(new_x, new_y)
    w = lr.coef_[0][0]
    b = lr.intercept_[0]
    # Total squared error of the fitted line over all data points
    cost_list = cost(w, b, points)
    print("w is :", w)
    print("b is :", b)
    print("cost_list is :", cost_list)

    end_y = w * x + b
    plt.plot(x, end_y, c="r")
    plt.show()
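As a sanity check (not part of the original post), the same slope and intercept can be recovered from the closed-form degree-1 least-squares solution, e.g. via `np.polyfit`. The synthetic data below is illustrative, since `data.csv` is not included with the post:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative synthetic data (the original script reads data.csv instead);
# the true line is roughly y = 1.3 * x + 8 plus Gaussian noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 100, size=100)
y = 1.3 * x + 8.0 + rng.normal(0, 5, size=100)

# sklearn fit on (n, 1)-shaped inputs, as in the script above
lr = LinearRegression().fit(x.reshape(-1, 1), y.reshape(-1, 1))
w_sklearn, b_sklearn = lr.coef_[0][0], lr.intercept_[0]

# Closed-form degree-1 least squares via np.polyfit
w_poly, b_poly = np.polyfit(x, y, deg=1)

# Both solve the same ordinary-least-squares problem,
# so the coefficients should agree to numerical precision
print(w_sklearn, w_poly)
print(b_sklearn, b_poly)
```

Both calls minimize the same residual sum of squares, so any disagreement beyond floating-point noise would indicate a data-shape or indexing mistake.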
D:\Python\python.exe D:/projects/PythonProjects/PythonStudy/python-1/com/python/stuay/SkLearnLinearRegression.py
w is : 1.3224310227553597
b is : 7.991020982270399
cost_list is : 11025.738346621318
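Note that the printed cost is the unnormalized sum of squared residuals; dividing it by the number of points M gives the mean squared error, which is what `sklearn.metrics.mean_squared_error` reports. A minimal sketch of that relationship, using illustrative points since `data.csv` is not included:

```python
import numpy as np
from sklearn.metrics import mean_squared_error

# Illustrative (x, y) rows; the post's cost() sums (y - w*x - b)**2 over all rows
points = np.array([[1.0, 9.5], [2.0, 10.4], [3.0, 12.1], [4.0, 13.0]])
w, b = 1.3, 8.0

# Sum-of-squares cost, as computed by the cost() function above
residual_sq_sum = sum((x_i * 0 + (y_i - w * x_i - b) ** 2) for x_i, y_i in points)

# sklearn's mean squared error over the same predictions
mse = mean_squared_error(points[:, 1], w * points[:, 0] + b)

# The two differ only by the factor M = len(points)
print(residual_sq_sum, len(points) * mse)
```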

 
