How to do multiple regression in Python — Multiple Linear Regression in Python

Here is a small workaround that I created. I checked it against R and it works correctly.

import numpy as np
import statsmodels.api as sm

y = [1,2,3,4,3,4,5,4,5,5,4,5,4,5,4,5,6,5,4,5,4,3,4]
x = [
    [4,2,3,4,5,4,5,6,7,4,8,9,8,8,6,6,5,5,5,5,5,5,5],
    [4,1,2,3,4,5,6,7,5,8,7,8,7,8,7,8,7,7,7,7,7,6,5],
    [4,1,2,5,6,7,8,9,7,8,7,8,7,7,7,7,7,7,6,6,4,4,4]
]

def reg_m(y, x):
    # Start with the first predictor plus a column of ones for the intercept.
    ones = np.ones(len(x[0]))
    X = sm.add_constant(np.column_stack((x[0], ones)))
    # Prepend each remaining predictor as a new column of the design matrix.
    # add_constant is a no-op here since a constant column already exists.
    for ele in x[1:]:
        X = sm.add_constant(np.column_stack((ele, X)))
    results = sm.OLS(y, X).fit()
    return results

Result:

print(reg_m(y, x).summary())

Output:

                            OLS Regression Results
==============================================================================
Dep. Variable:                      y   R-squared:                       0.535
Model:                            OLS   Adj. R-squared:                  0.461
Method:                 Least Squares   F-statistic:                     7.281
Date:                Tue, 19 Feb 2013   Prob (F-statistic):            0.00191
Time:                        21:51:28   Log-Likelihood:                -26.025
No. Observations:                  23   AIC:                             60.05
Df Residuals:                      19   BIC:                             64.59
Df Model:                           3
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
x1             0.2424      0.139      1.739      0.098        -0.049     0.534
x2             0.2360      0.149      1.587      0.129        -0.075     0.547
x3            -0.0618      0.145     -0.427      0.674        -0.365     0.241
const          1.5704      0.633      2.481      0.023         0.245     2.895
==============================================================================
Omnibus:                        6.904   Durbin-Watson:                   1.905
Prob(Omnibus):                  0.032   Jarque-Bera (JB):                4.708
Skew:                          -0.849   Prob(JB):                       0.0950
Kurtosis:                       4.426   Cond. No.                         38.6
==============================================================================
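The loop in reg_m can also be written more directly: stack all predictor lists as columns in one call and add the intercept once. A minimal sketch (note that the column order differs from reg_m, so the x1/x2/x3 labels map to different predictors, but the fit itself is identical):

import numpy as np
import statsmodels.api as sm

y = [1,2,3,4,3,4,5,4,5,5,4,5,4,5,4,5,6,5,4,5,4,3,4]
x = [
    [4,2,3,4,5,4,5,6,7,4,8,9,8,8,6,6,5,5,5,5,5,5,5],
    [4,1,2,3,4,5,6,7,5,8,7,8,7,8,7,8,7,7,7,7,7,6,5],
    [4,1,2,5,6,7,8,9,7,8,7,8,7,7,7,7,7,7,6,6,4,4,4],
]

# Each list in x becomes one column; add_constant prepends the intercept.
X = sm.add_constant(np.column_stack(x))
results = sm.OLS(y, X).fit()
print(results.params)  # intercept first, then one slope per predictor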

pandas provides a convenient way to run OLS, as shown in the answer below:
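(The old pandas.ols interface has since been removed from pandas; a sketch of the equivalent route today, using statsmodels' R-style formula API on a pandas DataFrame with the same data:)

import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "y":  [1,2,3,4,3,4,5,4,5,5,4,5,4,5,4,5,6,5,4,5,4,3,4],
    "x1": [4,2,3,4,5,4,5,6,7,4,8,9,8,8,6,6,5,5,5,5,5,5,5],
    "x2": [4,1,2,3,4,5,6,7,5,8,7,8,7,8,7,8,7,7,7,7,7,6,5],
    "x3": [4,1,2,5,6,7,8,9,7,8,7,8,7,7,7,7,7,7,6,6,4,4,4],
})

# The formula names the response and predictors; the intercept is implicit.
model = smf.ols("y ~ x1 + x2 + x3", data=df).fit()
print(model.summary())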
