Andrew Ng Machine Learning: Code and Key Concepts Summary -- ex1 (2. Multivariate Linear Regression)

Multivariate Linear Regression
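The code below runs batch gradient descent on the housing data (ex1data2.txt: house size, number of bedrooms, price). For reference, with a bias feature $x_0 = 1$, the model and the cost function the code minimizes are

$$h_\theta(x) = \theta_0 x_0 + \theta_1 x_1 + \theta_2 x_2, \qquad J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2$$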

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

def computeCost(X, y, theta):
    # squared-error cost: J(theta) = sum((X*theta.T - y)^2) / (2m)
    inner = np.power(((X * theta.T) - y), 2)
    return np.sum(inner) / (2 * len(X))
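As a quick sanity check (a toy example of mine, not from the original notes): on data that a line fits exactly, the cost should be zero.

# Toy check: y = x is fit exactly by theta = [0, 1], so J should be 0.0
X_toy = np.matrix([[1, 1], [1, 2], [1, 3]])   # bias column plus one feature
y_toy = np.matrix([[1], [2], [3]])
computeCost(X_toy, y_toy, np.matrix([0, 1]))  # -> 0.0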
def gradientDescent(X, y, theta, alpha, iters):
    temp = np.matrix(np.zeros(theta.shape))
    parameters = int(theta.ravel().shape[1])
    cost = np.zeros(iters)
    for i in range(iters):
        error = (X * theta.T) - y
        for j in range(parameters):
            # partial derivative of J with respect to theta_j
            term = np.multiply(error, X[:, j])
            temp[0, j] = theta[0, j] - ((alpha / len(X)) * np.sum(term))
        # update all parameters simultaneously, then record the cost;
        # these two lines belong outside the inner loop, otherwise theta
        # changes after every single parameter and the cost is recomputed
        # once per parameter instead of once per iteration
        theta = temp
        cost[i] = computeCost(X, y, theta)
    return theta, cost
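The inner loop over parameters can be collapsed into a single matrix operation. A minimal sketch of an equivalent fully vectorized update, assuming the same np.matrix conventions as above (this variant is my addition, not part of the original exercise code):

def gradientDescentVectorized(X, y, theta, alpha, iters):
    cost = np.zeros(iters)
    for i in range(iters):
        error = (X * theta.T) - y                          # m x 1 residuals
        theta = theta - (alpha / len(X)) * (error.T * X)   # update all theta_j at once
        cost[i] = computeCost(X, y, theta)
    return theta, cost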
data2 = pd.read_csv("code/ex1-linear regression/ex1data2.txt", names=['Size', 'Bedrooms', 'Price'])
# feature normalization: scale each column to zero mean and unit variance
# so that gradient descent converges at a uniform rate across features
data2 = (data2 - data2.mean()) / data2.std()
data2.head(5)
data2.insert(0, "ones", 1)   # bias column x0 = 1
data2.head()
cols2 = data2.shape[1]
X2 = data2.iloc[:, :cols2-1]        # all columns except the last are features
Y2 = data2.iloc[:, cols2-1:cols2]   # the last column (Price) is the target
X2 = np.matrix(X2.values)
Y2 = np.matrix(Y2.values)
theta2 = np.matrix(np.array([0, 0, 0]))
alpha = 0.01   # learning rate (undefined in the original snippet; standard choice for this exercise)
iters = 1000   # number of gradient-descent iterations
g2, cost2 = gradientDescent(X2, Y2, theta2, alpha, iters)
computeCost(X2, Y2, g2)   # final cost with the learned parameters, not the initial theta2
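As a cross-check, the closed-form normal equation (covered in the optional part of ex1 as an alternative to gradient descent) should land close to the gradient-descent result; a sketch:

# Normal equation: theta = (X^T X)^(-1) X^T y, no learning rate or iterations needed
theta_ne = np.linalg.inv(X2.T * X2) * X2.T * Y2   # 3 x 1 column vector
computeCost(X2, Y2, theta_ne.T)                   # compare with cost2[-1]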
fig2, ax = plt.subplots(figsize=(12,8))
ax.plot(np.arange(iters), cost2, 'r')
ax.set_xlabel('Iterations')
ax.set_ylabel('Cost')
ax.set_title('Error vs. Training Epoch')
plt.show()

[Figure: "Error vs. Training Epoch" — training cost decreasing over iterations]
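One caveat: data2 was normalized in place, so the original means and standard deviations were overwritten. To predict a price in real units for, say, a 1650 sq-ft, 3-bedroom house, keep the training statistics first (a sketch; the reload and variable names are mine):

raw = pd.read_csv("code/ex1-linear regression/ex1data2.txt", names=['Size', 'Bedrooms', 'Price'])
mu, sigma = raw.mean(), raw.std()   # training statistics, saved before normalizing
x_new = np.matrix([1.0,
                   (1650 - mu['Size']) / sigma['Size'],
                   (3 - mu['Bedrooms']) / sigma['Bedrooms']])
price = float(x_new * g2.T) * sigma['Price'] + mu['Price']   # undo target normalization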
