Linear regression: solving with the normal equation and with gradient descent

For the theory, see: http://blog.csdn.net/sadfasdgaaaasdfa/article/details/46850803

Gradient formula (shown as an image in the original post):
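Since the image is not available, here is a reconstruction of the batch gradient descent update that the code below implements; the notation (m samples, learning rate α, hypothesis h_θ(x) = θᵀx) is mine, not taken from the original image:

\theta_j := \theta_j - \frac{\alpha}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}, \qquad h_\theta(x) = \theta^{T} x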

__author__ = 'Chen'

from numpy import *

def linearRegresion(x, y, type=True, alpha=0.01):
    # x: list of feature rows, y: list of target values (column); alpha: learning rate
    xrow = shape(x)[0]
    xcol = shape(x)[1]
    x = matrix(x)
    Y = matrix(y)
    # prepend a column of ones for the intercept term
    xone = ones((xrow, 1))
    X = matrix(hstack((xone, x)))
    if type:
        # normal equation: theta = (X^T X)^-1 X^T Y
        theta = (X.T * X).I * X.T * Y
        return theta
    else:
        # batch gradient descent, starting from a random theta (row vector)
        theta = matrix(random.random(xcol + 1))
        for iteration in range(1, 10000):
            sums = 0
            # accumulate the gradient over all samples
            for i in range(xrow):
                sums += (theta * X[i, :].T - Y[i, :]) * X[i, :]
            # update theta with the averaged gradient
            theta -= alpha * sums / xrow
        return theta



x = [[0, 1, 0], [0, 0, 1], [0, 1, 1], [1, 1, 1]]
y = [[1], [2], [3], [4]]

# solve by the normal equation
theta1 = linearRegresion(x, y)
print theta1

# solve by gradient descent
theta2 = linearRegresion(x, y, False)
print theta2
Sample output:

C:\Python27\python.exe C:/Users/Chen/PycharmProjects/mypython/courseraML/LinearRegression.py
[[ 0.]
 [ 1.]
 [ 1.]
 [ 2.]]
[[ 0.00709026  1.00390811  0.99480722  1.99480246]]

Process finished with exit code 0
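As a quick sanity check (not part of the original post), the closed-form result above can be reproduced with NumPy's built-in least-squares solver. The script below is a sketch; it assumes NumPy >= 1.14 for the rcond=None argument, and the variable names are my own.

import numpy as np

x = np.array([[0, 1, 0], [0, 0, 1], [0, 1, 1], [1, 1, 1]], dtype=float)
y = np.array([1.0, 2.0, 3.0, 4.0])

# prepend a column of ones for the intercept, mirroring linearRegresion
X = np.hstack((np.ones((x.shape[0], 1)), x))

# solve the least-squares problem X * theta = y directly
theta, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(theta)  # expected: [ 0.  1.  1.  2.], matching the normal-equation output above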