6.1 Regression Analysis

import numpy as np
from scipy import optimize
import statsmodels.api as sm
# Load the three example data sets used in this section
fire = np.loadtxt('./fire.csv', delimiter=',')
blood = np.loadtxt('./blood.csv', delimiter=',')
study = np.loadtxt('./study.csv', delimiter=',')
# Code 6-1
# Method 1: least-squares fit with scipy.optimize.least_squares
x = fire[:, 0]
y = fire[:, 1]
def regula(p):
    a, b = p
    return y - a - b*x  # residuals of the linear model y = a + b*x
result = optimize.least_squares(regula, [0, 0])
print('Estimated regression parameters:', result.x)
Estimated regression parameters: [10.2779285   4.91933074]
# Method 2: ordinary least squares with statsmodels
X = sm.add_constant(fire[:, 0])  # design matrix with an intercept column
model = sm.OLS(fire[:, 1], X)
results = model.fit()
print('Estimated regression parameters:', results.params)

Estimated regression parameters: [10.27792855 4.91933073]
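
With the fitted line in hand, new responses can be predicted from the estimated parameters. A minimal sketch, assuming the `results` object from Method 2 above; the x values below are made up for illustration:

# Prediction with the fitted simple linear regression (illustrative x values only)
x_new = np.array([2.0, 3.5, 5.0])
X_new = sm.add_constant(x_new)  # same design-matrix layout as in the fit
y_hat = results.predict(X_new)  # roughly 10.278 + 4.919 * x_new
print('Predicted values:', y_hat)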

# Code 6-2
print('Test results:\n', results.summary())
Test results:
                             OLS Regression Results                            
==============================================================================
Dep. Variable:                      y   R-squared:                       0.923
Model:                            OLS   Adj. R-squared:                  0.918
Method:                 Least Squares   F-statistic:                     156.9
Date:                Fri, 10 Dec 2021   Prob (F-statistic):           1.25e-08
Time:                        08:28:38   Log-Likelihood:                -32.811
No. Observations:                  15   AIC:                             69.62
Df Residuals:                      13   BIC:                             71.04
Df Model:                           1                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [0.025      0.975]
------------------------------------------------------------------------------
const         10.2779      1.420      7.237      0.000       7.210      13.346
x1             4.9193      0.393     12.525      0.000       4.071       5.768
==============================================================================
Omnibus:                        2.551   Durbin-Watson:                   1.318
Prob(Omnibus):                  0.279   Jarque-Bera (JB):                1.047
Skew:                          -0.003   Prob(JB):                        0.592
Kurtosis:                       1.706   Cond. No.                         9.13
==============================================================================
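
Each statistic shown in the summary is also available as an attribute of the fitted results object, which is convenient when the values are needed programmatically. A minimal sketch using the standard statsmodels attribute names:

# Key test statistics pulled directly from the fitted OLS results of Code 6-2
print('R-squared:', results.rsquared)
print('F-statistic:', results.fvalue)
print('Prob (F-statistic):', results.f_pvalue)
print('Coefficient p-values:', results.pvalues)
print('95% confidence intervals:\n', results.conf_int())
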
# Code 6-3
# Method 1: compute the estimates from the least-squares estimator of the linear regression parameters
X = blood[:, 0:2]
X = np.c_[np.ones(13), X]  # design matrix X with a leading column of ones (intercept)
Y = blood[:, 2]  # response vector Y
B1 = np.linalg.inv(np.dot(X.T, X))  # (X'X)^(-1)
B2 = np.dot(B1, X.T)                # (X'X)^(-1) X'
print('Estimated regression parameters:\n', np.dot(B2, Y))  # (X'X)^(-1) X'Y

Estimated regression parameters:
 [-62.96335911 2.13655814 0.40021615]

# Method 2: ordinary least squares with statsmodels
X = sm.add_constant(blood[:, 0:2])  # rebuild the design matrix from the raw columns; clearer than reusing the X from Method 1, which already contains a constant column
model = sm.OLS(Y, X)
results = model.fit()
print('Estimated regression parameters:\n', results.params)

Estimated regression parameters:
 [-62.96335911 2.13655814 0.40021615]
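
Method 1 evaluates the normal-equation estimator B = (X'X)^(-1) X'Y by forming the inverse explicitly, which is fine for a small, well-conditioned problem; a least-squares solver gives the same estimates without the explicit inverse. A minimal sketch, rebuilding the same design matrix as in Method 1:

# Alternative to the explicit inverse: solve the least-squares problem directly
X_mat = np.c_[np.ones(13), blood[:, 0:2]]  # same design matrix as Method 1
Y_vec = blood[:, 2]
beta, *_ = np.linalg.lstsq(X_mat, Y_vec, rcond=None)
print('Estimated regression parameters:\n', beta)  # matches the estimates above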

# Code 6-4
print('Test results:\n', results.summary())

Test results:
                             OLS Regression Results                            
==============================================================================
Dep. Variable:                      y   R-squared:                       0.946
Model:                            OLS   Adj. R-squared:                  0.935
Method:                 Least Squares   F-statistic:                     87.84
Date:                Fri, 10 Dec 2021   Prob (F-statistic):           4.53e-07
Time:                        08:29:07   Log-Likelihood:                -30.372
No. Observations:                  13   AIC:                             66.74
Df Residuals:                      10   BIC:                             68.44
Df Model:                           2                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [0.025      0.975]
------------------------------------------------------------------------------
const        -62.9634     17.000     -3.704      0.004    -100.841     -25.086
x1             2.1366      0.175     12.185      0.000       1.746       2.527
x2             0.4002      0.083      4.810      0.001       0.215       0.586
==============================================================================
Omnibus:                        0.167   Durbin-Watson:                   2.263
Prob(Omnibus):                  0.920   Jarque-Bera (JB):                0.370
Skew:                           0.089   Prob(JB):                        0.831
Kurtosis:                       2.193   Cond. No.                     1.97e+03
==============================================================================
# Code 6-5
X = sm.add_constant(study[:, 0])  # design matrix X with an intercept column
model = sm.Logit(study[:, 1], X)  # logistic regression model; study[:, 1] is the binary response Y
result = model.fit()  # fitted by maximum likelihood estimation
print('Model results:\n', result.summary())

Optimization terminated successfully.
         Current function value: 0.401494
         Iterations 7
Model results:
                            Logit Regression Results                           
==============================================================================
Dep. Variable:                      y   No. Observations:                   20
Model:                          Logit   Df Residuals:                       18
Method:                           MLE   Df Model:                            1
Date:                Fri, 10 Dec 2021   Pseudo R-squ.:                  0.4208
Time:                        08:29:40   Log-Likelihood:                -8.0299
converged:                       True   LL-Null:                       -13.863
                                        LLR p-value:                 0.0006365
==============================================================================
                 coef    std err          z      P>|z|      [0.025      0.975]
------------------------------------------------------------------------------
const         -4.0777      1.761     -2.316      0.021      -7.529      -0.626
x1             1.5046      0.629      2.393      0.017       0.272       2.737
==============================================================================
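
The fitted coefficients define the logistic model P(y = 1 | x) = 1 / (1 + exp(-(-4.0777 + 1.5046 x))), and predicted probabilities can be obtained from the results object. A minimal sketch; the x values below are illustrative only:

# Predicted probabilities from the fitted logit model of Code 6-5
x_new = np.array([1.0, 2.5, 4.0])  # illustrative values only
X_new = sm.add_constant(x_new)
p_hat = result.predict(X_new)  # applies the logistic function to const + coef * x
print('Predicted probabilities:', p_hat)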