机器学习之多元线性回归

零、模型

0.1、模型介绍

y = b0 + b1X1+B2X2+BnXn

0.2、限定条件

1.线性、2.同方差性、3.多元正太分布、4.误差独立、5.无多重共线性

0.3 模型的建立方法

1.全部选取 :反向淘汰的第一步、必须全部选取的时候、先验知识

2.反向淘汰 :自变量对于P值的影响, 计算每个自变量的P值,进行与自定义SL值比较。

3.顺向选择 :每个变量是否能够进入模型,

4.双向淘汰 : 选择两个显著性值,同时进行反向淘汰与顺向选择

5.信息量比较:对所有可能的模型进行打分

一、导入标准库

In [8]:
# Importing the libraries 导入库
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
# 使图像能够调整
%matplotlib notebook 
#中文字体显示  
plt.rc('font', family='SimHei', size=8)

二、导入数据

In [38]:
# 根据各项开支预测利润
dataset = pd.read_csv('./50_Startups.csv')
X = dataset.iloc[:, :-1].values  # 选取自变量
y = dataset.iloc[:, 4].values    # 选取因变量
dataset
Out[38]:
 R&D SpendAdministrationMarketing SpendStateProfit
0165349.20136897.80471784.10New York192261.83
1162597.70151377.59443898.53California191792.06
2153441.51101145.55407934.54Florida191050.39
3144372.41118671.85383199.62New York182901.99
4142107.3491391.77366168.42Florida166187.94
5131876.9099814.71362861.36New York156991.12
6134615.46147198.87127716.82California156122.51
7130298.13145530.06323876.68Florida155752.60
8120542.52148718.95311613.29New York152211.77
9123334.88108679.17304981.62California149759.96
10101913.08110594.11229160.95Florida146121.95
11100671.9691790.61249744.55California144259.40
1293863.75127320.38249839.44Florida141585.52
1391992.39135495.07252664.93California134307.35
14119943.24156547.42256512.92Florida132602.65
15114523.61122616.84261776.23New York129917.04
1678013.11121597.55264346.06California126992.93
1794657.16145077.58282574.31New York125370.37
1891749.16114175.79294919.57Florida124266.90
1986419.70153514.110.00New York122776.86
2076253.86113867.30298664.47California118474.03
2178389.47153773.43299737.29New York111313.02
2273994.56122782.75303319.26Florida110352.25
2367532.53105751.03304768.73Florida108733.99
2477044.0199281.34140574.81New York108552.04
2564664.71139553.16137962.62California107404.34
2675328.87144135.98134050.07Florida105733.54
2772107.60127864.55353183.81New York105008.31
2866051.52182645.56118148.20Florida103282.38
2965605.48153032.06107138.38New York101004.64
3061994.48115641.2891131.24Florida99937.59
3161136.38152701.9288218.23New York97483.56
3263408.86129219.6146085.25California97427.84
3355493.95103057.49214634.81Florida96778.92
3446426.07157693.92210797.67California96712.80
3546014.0285047.44205517.64New York96479.51
3628663.76127056.21201126.82Florida90708.19
3744069.9551283.14197029.42California89949.14
3820229.5965947.93185265.10New York81229.06
3938558.5182982.09174999.30California81005.76
4028754.33118546.05172795.67California78239.91
4127892.9284710.77164470.71Florida77798.83
4223640.9396189.63148001.11California71498.49
4315505.73127382.3035534.17New York69758.98
4422177.74154806.1428334.72California65200.33
451000.23124153.041903.93New York64926.08
461315.46115816.21297114.46Florida49490.75
470.00135426.920.00California42559.73
48542.0551743.150.00New York35673.41
490.00116983.8045173.06California14681.40

三、虚拟变量的处理

In [39]:
from sklearn.preprocessing import LabelEncoder, OneHotEncoder
labelencoder_X = LabelEncoder()
X[:, 3] = labelencoder_X.fit_transform(X[:, 3]) # 将地区变为数字
onehotencoder = OneHotEncoder(categorical_features = [3])
X = onehotencoder.fit_transform(X).toarray()    # 将地区变为虚拟变量
X
Out[39]:
array([[  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          1.65349200e+05,   1.36897800e+05,   4.71784100e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          1.62597700e+05,   1.51377590e+05,   4.43898530e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          1.53441510e+05,   1.01145550e+05,   4.07934540e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          1.44372410e+05,   1.18671850e+05,   3.83199620e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          1.42107340e+05,   9.13917700e+04,   3.66168420e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          1.31876900e+05,   9.98147100e+04,   3.62861360e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          1.34615460e+05,   1.47198870e+05,   1.27716820e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          1.30298130e+05,   1.45530060e+05,   3.23876680e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          1.20542520e+05,   1.48718950e+05,   3.11613290e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          1.23334880e+05,   1.08679170e+05,   3.04981620e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          1.01913080e+05,   1.10594110e+05,   2.29160950e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          1.00671960e+05,   9.17906100e+04,   2.49744550e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          9.38637500e+04,   1.27320380e+05,   2.49839440e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          9.19923900e+04,   1.35495070e+05,   2.52664930e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          1.19943240e+05,   1.56547420e+05,   2.56512920e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          1.14523610e+05,   1.22616840e+05,   2.61776230e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          7.80131100e+04,   1.21597550e+05,   2.64346060e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          9.46571600e+04,   1.45077580e+05,   2.82574310e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          9.17491600e+04,   1.14175790e+05,   2.94919570e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          8.64197000e+04,   1.53514110e+05,   0.00000000e+00],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          7.62538600e+04,   1.13867300e+05,   2.98664470e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          7.83894700e+04,   1.53773430e+05,   2.99737290e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          7.39945600e+04,   1.22782750e+05,   3.03319260e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          6.75325300e+04,   1.05751030e+05,   3.04768730e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          7.70440100e+04,   9.92813400e+04,   1.40574810e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          6.46647100e+04,   1.39553160e+05,   1.37962620e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          7.53288700e+04,   1.44135980e+05,   1.34050070e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          7.21076000e+04,   1.27864550e+05,   3.53183810e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          6.60515200e+04,   1.82645560e+05,   1.18148200e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          6.56054800e+04,   1.53032060e+05,   1.07138380e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          6.19944800e+04,   1.15641280e+05,   9.11312400e+04],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          6.11363800e+04,   1.52701920e+05,   8.82182300e+04],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          6.34088600e+04,   1.29219610e+05,   4.60852500e+04],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          5.54939500e+04,   1.03057490e+05,   2.14634810e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          4.64260700e+04,   1.57693920e+05,   2.10797670e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          4.60140200e+04,   8.50474400e+04,   2.05517640e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          2.86637600e+04,   1.27056210e+05,   2.01126820e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          4.40699500e+04,   5.12831400e+04,   1.97029420e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          2.02295900e+04,   6.59479300e+04,   1.85265100e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          3.85585100e+04,   8.29820900e+04,   1.74999300e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          2.87543300e+04,   1.18546050e+05,   1.72795670e+05],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          2.78929200e+04,   8.47107700e+04,   1.64470710e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          2.36409300e+04,   9.61896300e+04,   1.48001110e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          1.55057300e+04,   1.27382300e+05,   3.55341700e+04],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          2.21777400e+04,   1.54806140e+05,   2.83347200e+04],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          1.00023000e+03,   1.24153040e+05,   1.90393000e+03],
       [  0.00000000e+00,   1.00000000e+00,   0.00000000e+00,
          1.31546000e+03,   1.15816210e+05,   2.97114460e+05],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          0.00000000e+00,   1.35426920e+05,   0.00000000e+00],
       [  0.00000000e+00,   0.00000000e+00,   1.00000000e+00,
          5.42050000e+02,   5.17431500e+04,   0.00000000e+00],
       [  1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          0.00000000e+00,   1.16983800e+05,   4.51730600e+04]])

虚拟变量只需要N-1个变量即可拟合,所以去掉其中一个变量

In [40]:
X = X[:,1:]
X
Out[40]:
array([[  0.00000000e+00,   1.00000000e+00,   1.65349200e+05,
          1.36897800e+05,   4.71784100e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.62597700e+05,
          1.51377590e+05,   4.43898530e+05],
       [  1.00000000e+00,   0.00000000e+00,   1.53441510e+05,
          1.01145550e+05,   4.07934540e+05],
       [  0.00000000e+00,   1.00000000e+00,   1.44372410e+05,
          1.18671850e+05,   3.83199620e+05],
       [  1.00000000e+00,   0.00000000e+00,   1.42107340e+05,
          9.13917700e+04,   3.66168420e+05],
       [  0.00000000e+00,   1.00000000e+00,   1.31876900e+05,
          9.98147100e+04,   3.62861360e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.34615460e+05,
          1.47198870e+05,   1.27716820e+05],
       [  1.00000000e+00,   0.00000000e+00,   1.30298130e+05,
          1.45530060e+05,   3.23876680e+05],
       [  0.00000000e+00,   1.00000000e+00,   1.20542520e+05,
          1.48718950e+05,   3.11613290e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.23334880e+05,
          1.08679170e+05,   3.04981620e+05],
       [  1.00000000e+00,   0.00000000e+00,   1.01913080e+05,
          1.10594110e+05,   2.29160950e+05],
       [  0.00000000e+00,   0.00000000e+00,   1.00671960e+05,
          9.17906100e+04,   2.49744550e+05],
       [  1.00000000e+00,   0.00000000e+00,   9.38637500e+04,
          1.27320380e+05,   2.49839440e+05],
       [  0.00000000e+00,   0.00000000e+00,   9.19923900e+04,
          1.35495070e+05,   2.52664930e+05],
       [  1.00000000e+00,   0.00000000e+00,   1.19943240e+05,
          1.56547420e+05,   2.56512920e+05],
       [  0.00000000e+00,   1.00000000e+00,   1.14523610e+05,
          1.22616840e+05,   2.61776230e+05],
       [  0.00000000e+00,   0.00000000e+00,   7.80131100e+04,
          1.21597550e+05,   2.64346060e+05],
       [  0.00000000e+00,   1.00000000e+00,   9.46571600e+04,
          1.45077580e+05,   2.82574310e+05],
       [  1.00000000e+00,   0.00000000e+00,   9.17491600e+04,
          1.14175790e+05,   2.94919570e+05],
       [  0.00000000e+00,   1.00000000e+00,   8.64197000e+04,
          1.53514110e+05,   0.00000000e+00],
       [  0.00000000e+00,   0.00000000e+00,   7.62538600e+04,
          1.13867300e+05,   2.98664470e+05],
       [  0.00000000e+00,   1.00000000e+00,   7.83894700e+04,
          1.53773430e+05,   2.99737290e+05],
       [  1.00000000e+00,   0.00000000e+00,   7.39945600e+04,
          1.22782750e+05,   3.03319260e+05],
       [  1.00000000e+00,   0.00000000e+00,   6.75325300e+04,
          1.05751030e+05,   3.04768730e+05],
       [  0.00000000e+00,   1.00000000e+00,   7.70440100e+04,
          9.92813400e+04,   1.40574810e+05],
       [  0.00000000e+00,   0.00000000e+00,   6.46647100e+04,
          1.39553160e+05,   1.37962620e+05],
       [  1.00000000e+00,   0.00000000e+00,   7.53288700e+04,
          1.44135980e+05,   1.34050070e+05],
       [  0.00000000e+00,   1.00000000e+00,   7.21076000e+04,
          1.27864550e+05,   3.53183810e+05],
       [  1.00000000e+00,   0.00000000e+00,   6.60515200e+04,
          1.82645560e+05,   1.18148200e+05],
       [  0.00000000e+00,   1.00000000e+00,   6.56054800e+04,
          1.53032060e+05,   1.07138380e+05],
       [  1.00000000e+00,   0.00000000e+00,   6.19944800e+04,
          1.15641280e+05,   9.11312400e+04],
       [  0.00000000e+00,   1.00000000e+00,   6.11363800e+04,
          1.52701920e+05,   8.82182300e+04],
       [  0.00000000e+00,   0.00000000e+00,   6.34088600e+04,
          1.29219610e+05,   4.60852500e+04],
       [  1.00000000e+00,   0.00000000e+00,   5.54939500e+04,
          1.03057490e+05,   2.14634810e+05],
       [  0.00000000e+00,   0.00000000e+00,   4.64260700e+04,
          1.57693920e+05,   2.10797670e+05],
       [  0.00000000e+00,   1.00000000e+00,   4.60140200e+04,
          8.50474400e+04,   2.05517640e+05],
       [  1.00000000e+00,   0.00000000e+00,   2.86637600e+04,
          1.27056210e+05,   2.01126820e+05],
       [  0.00000000e+00,   0.00000000e+00,   4.40699500e+04,
          5.12831400e+04,   1.97029420e+05],
       [  0.00000000e+00,   1.00000000e+00,   2.02295900e+04,
          6.59479300e+04,   1.85265100e+05],
       [  0.00000000e+00,   0.00000000e+00,   3.85585100e+04,
          8.29820900e+04,   1.74999300e+05],
       [  0.00000000e+00,   0.00000000e+00,   2.87543300e+04,
          1.18546050e+05,   1.72795670e+05],
       [  1.00000000e+00,   0.00000000e+00,   2.78929200e+04,
          8.47107700e+04,   1.64470710e+05],
       [  0.00000000e+00,   0.00000000e+00,   2.36409300e+04,
          9.61896300e+04,   1.48001110e+05],
       [  0.00000000e+00,   1.00000000e+00,   1.55057300e+04,
          1.27382300e+05,   3.55341700e+04],
       [  0.00000000e+00,   0.00000000e+00,   2.21777400e+04,
          1.54806140e+05,   2.83347200e+04],
       [  0.00000000e+00,   1.00000000e+00,   1.00023000e+03,
          1.24153040e+05,   1.90393000e+03],
       [  1.00000000e+00,   0.00000000e+00,   1.31546000e+03,
          1.15816210e+05,   2.97114460e+05],
       [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          1.35426920e+05,   0.00000000e+00],
       [  0.00000000e+00,   1.00000000e+00,   5.42050000e+02,
          5.17431500e+04,   0.00000000e+00],
       [  0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
          1.16983800e+05,   4.51730600e+04]])

四、区分训练集和测试集

In [41]:
from sklearn.model_selection import train_test_split
X_train,X_test,y_train,y_test = train_test_split(X,y,test_size=0.2,random_state = 0)

五、用线性回归训练

In [42]:
from sklearn.linear_model import LinearRegression
regressor = LinearRegression()
regressor.fit(X_train,y_train)
Out[42]:
LinearRegression(copy_X=True, fit_intercept=True, n_jobs=1, normalize=False)
In [43]:
y_pred = regressor.predict(X_test)

六、反向淘汰选择模型变量

设定当 P>|t|大于0.05即被淘汰

In [44]:
import statsmodels.formula.api as sm
X_train = np.append(arr = np.ones((40,1)),values = X_train,axis = 1) # 增加新的一列
X_opt = X_train[:,[0,1,2,3,4,5]]
regressor_OLS = sm.OLS(endog = y_train, exog = X_opt).fit()
regressor_OLS.summary()
Out[44]:
OLS Regression Results
Dep. Variable:yR-squared:0.950
Model:OLSAdj. R-squared:0.943
Method:Least SquaresF-statistic:129.7
Date:Sat, 14 Apr 2018Prob (F-statistic):3.91e-21
Time:23:08:24Log-Likelihood:-421.10
No. Observations:40AIC:854.2
Df Residuals:34BIC:864.3
Df Model:5  
Covariance Type:nonrobust  
 coefstd errtP>|t|[0.0250.975]
const4.255e+048358.5385.0910.0002.56e+045.95e+04
x1-959.28424038.108-0.2380.814-9165.7067247.138
x2699.36913661.5630.1910.850-6741.8228140.560
x30.77350.05514.0250.0000.6610.886
x40.03290.0660.4950.624-0.1020.168
x50.03660.0191.8840.068-0.0030.076
Omnibus:15.823Durbin-Watson:2.468
Prob(Omnibus):0.000Jarque-Bera (JB):23.231
Skew:-1.094Prob(JB):9.03e-06
Kurtosis:6.025Cond. No.1.49e+06
In [45]:
 X_opt = X_train [:, [0, 1, 3, 4, 5]]
regressor_OLS = sm.OLS(endog = y_train, exog = X_opt).fit()
regressor_OLS.summary()
Out[45]:
OLS Regression Results
Dep. Variable:yR-squared:0.950
Model:OLSAdj. R-squared:0.944
Method:Least SquaresF-statistic:166.7
Date:Sat, 14 Apr 2018Prob (F-statistic):2.87e-22
Time:23:08:48Log-Likelihood:-421.12
No. Observations:40AIC:852.2
Df Residuals:35BIC:860.7
Df Model:4  
Covariance Type:nonrobust  
 coefstd errtP>|t|[0.0250.975]
const4.292e+048020.3975.3520.0002.66e+045.92e+04
x1-1272.16083639.780-0.3500.729-8661.3086116.986
x20.77540.05314.4980.0000.6670.884
x30.03190.0650.4880.629-0.1010.165
x40.03630.0191.9020.065-0.0020.075
Omnibus:16.074Durbin-Watson:2.467
Prob(Omnibus):0.000Jarque-Bera (JB):24.553
Skew:-1.086Prob(JB):4.66e-06
Kurtosis:6.164Cond. No.1.43e+06
In [46]:
X_opt = X_train [:, [0, 3, 4, 5]]
regressor_OLS = sm.OLS(endog = y_train, exog = X_opt).fit()
regressor_OLS.summary()
Out[46]:
OLS Regression Results
Dep. Variable:yR-squared:0.950
Model:OLSAdj. R-squared:0.946
Method:Least SquaresF-statistic:227.8
Date:Sat, 14 Apr 2018Prob (F-statistic):1.85e-23
Time:23:08:50Log-Likelihood:-421.19
No. Observations:40AIC:850.4
Df Residuals:36BIC:857.1
Df Model:3  
Covariance Type:nonrobust  
 coefstd errtP>|t|[0.0250.975]
const4.299e+047919.7735.4280.0002.69e+045.91e+04
x10.77880.05215.0030.0000.6740.884
x20.02940.0640.4580.650-0.1010.160
x30.03470.0181.8960.066-0.0020.072
Omnibus:15.557Durbin-Watson:2.481
Prob(Omnibus):0.000Jarque-Bera (JB):22.539
Skew:-1.081Prob(JB):1.28e-05
Kurtosis:5.974Cond. No.1.43e+06
In [47]:
X_opt = X_train [:, [0, 3, 5]]
regressor_OLS = sm.OLS(endog = y_train, exog = X_opt).fit()
regressor_OLS.summary()
Out[47]:
OLS Regression Results
Dep. Variable:yR-squared:0.950
Model:OLSAdj. R-squared:0.947
Method:Least SquaresF-statistic:349.0
Date:Sat, 14 Apr 2018Prob (F-statistic):9.65e-25
Time:23:08:53Log-Likelihood:-421.30
No. Observations:40AIC:848.6
Df Residuals:37BIC:853.7
Df Model:2  
Covariance Type:nonrobust  
 coefstd errtP>|t|[0.0250.975]
const4.635e+042971.23615.5980.0004.03e+045.24e+04
x10.78860.04716.8460.0000.6940.883
x20.03260.0181.8600.071-0.0030.068
Omnibus:14.666Durbin-Watson:2.518
Prob(Omnibus):0.001Jarque-Bera (JB):20.582
Skew:-1.030Prob(JB):3.39e-05
Kurtosis:5.847Cond. No.4.97e+05
In [48]:
X_opt = X_train [:, [0, 3]]
regressor_OLS = sm.OLS(endog = y_train, exog = X_opt).fit()
regressor_OLS.summary()
Out[48]:
OLS Regression Results
Dep. Variable:yR-squared:0.945
Model:OLSAdj. R-squared:0.944
Method:Least SquaresF-statistic:652.4
Date:Sat, 14 Apr 2018Prob (F-statistic):1.56e-25
Time:23:08:55Log-Likelihood:-423.09
No. Observations:40AIC:850.2
Df Residuals:38BIC:853.6
Df Model:1  
Covariance Type:nonrobust  
 coefstd errtP>|t|[0.0250.975]
const4.842e+042842.71717.0320.0004.27e+045.42e+04
x10.85160.03325.5420.0000.7840.919
Omnibus:13.132Durbin-Watson:2.325
Prob(Omnibus):0.001Jarque-Bera (JB):16.254
Skew:-0.991Prob(JB):0.000295
Kurtosis:5.413Cond. No.1.57e+05

七、项目地址

阅读更多
想对作者说点什么? 我来说一句

没有更多推荐了,返回首页