Fitting Multivariate Functions in Python

Function fitting with LinearRegression and PolynomialFeatures from the sklearn library:

The task is to fit a function of three variables, f(a, b, c) = y. Given a set of values for the independent variables and the corresponding values of y, I want to work out how the independent variables relate to y; in plain terms, I want to recover the actual formula of the function. At degree 1, the formula takes the form y = coef1*a + coef2*b + coef3*c + constant.

The table below shows a few rows of the sample data used for the fit:

| a    | b     | c     | y     |
|------|-------|-------|-------|
| 30.2 | 73.96 | 82.32 | 32.41 |
| 29.3 | 93.69 | 92.3  | 34.93 |
| 29.1 | 87.22 | 86.9  | 33    |
| 28.4 | 76.8  | 84.37 | 30.69 |
| 29.3 | 82.54 | 84.58 | 33.04 |
| 29.4 | 90.72 | 90.7  | 33.83 |
| 29.3 | 91.29 | 90.63 | 35.32 |
| 29.2 | 89.56 | 88.97 | 34.65 |
| 28   | 95.34 | 87.27 | 33.74 |
| 25.4 | 99.85 | 78.38 | 31.45 |

To state the conclusion first: at degree 1, the fitted formula is y = 0.31838939*c - 0.09243509*a - 0.07032063*b + 15.77246562.
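As a quick sanity check, the formula can be evaluated on the first row of the table; a minimal sketch (added here for illustration, not part of the original post):

```python
# First sample from the table: a=30.2, b=73.96, c=82.32, observed y=32.41
a, b, c = 30.2, 73.96, 82.32
y_hat = 0.31838939 * c - 0.09243509 * a - 0.07032063 * b + 15.77246562
print(round(y_hat, 2))  # ~33.99; the gap to 32.41 is consistent with R^2 ~ 0.95
```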

Here is the code; the loop below runs the fit for every degree from 1 to 4:

```python
from sklearn.linear_model import LinearRegression  # linear regression model
from sklearn.preprocessing import PolynomialFeatures  # polynomial feature expansion

x = [[30.2, 73.96, 82.32], [29.3, 93.69, 92.3], [29.1, 87.22, 86.9], [28.4, 76.8, 84.37], [29.3, 82.54, 84.58], [29.4, 90.72, 90.7], [29.3, 91.29, 90.63], [29.2, 89.56, 88.97], [28, 95.34, 87.27], [25.4, 99.85, 78.38], [27.4, 92.17, 82.58], [28.7, 87.47, 85.29], [28.4, 80.21, 79.23], [26.5, 93.75, 79.64], [26.4, 86.72, 75.16], [25.7, 96.58, 77.86], [26.6, 99.72, 83.58], [28.2, 96.79, 89.14], [29, 95.9, 92.37], [30.1, 89.13, 92.89], [30.3, 90.83, 95.1], [29.9, 93.19, 94.86], [28.7, 98.61, 92.75], [28.7, 98.06, 92.38], [29.4, 86.56, 87.8], [29.4, 86.2, 87.55], [30.2, 79.56, 86.38], [30.80 , 71.17, 82.72], [29.3, 87.9, 88.28], [29.3, 94.06, 92.55], [29.4, 93.08, 92.35], [29, 95.93, 92.39], [29, 96.73, 92.94], [29.1, 93.85, 91.45], [30.2, 85.42, 90.65], [30.6, 85.5, 92.58], [30.6, 84.5, 91.83], [30.6, 84.95, 92.16], [30.9, 85.56, 94.04], [30.7, 84.95, 92.63], [29.9, 90.37, 92.83], [30.7, 78.12, 87.5], [31.3, 73.17, 86.33], [31.5, 74.78, 88.45], [31.4, 78.86, 91.2], [31.1, 80.12, 90.81], [30.8, 82.85, 91.51], [28.7, 95.94, 90.95], [28.9, 94.24, 90.76], [29.9, 90.2, 92.7], [29.8, 91.39, 93.08], [29.9, 87.62, 90.85], [30, 85.88, 90.06], [30.6, 80.12, 88.55], [31.5, 67.12, 82.46], [31.5, 63.9, 79.96], [31.5, 67, 82.37], [27.4, 93.9, 83.65], [26.5, 99.85, 83.21], [28, 95.18, 87.17], [28.2, 94.77, 87.83], [28.8, 90.18, 87.55], [29.8, 83.99, 87.79], [31.7, 80.96, 94.25], [33, 79.73, 99.49], [32.6, 79.84, 97.63], [32.4, 83.41, 99.65], [30.5, 91.96, 96.93], [29, 96.41, 92.72], [29.9, 90.63, 93.01], [28.1, 94.02, 86.88], [28, 97.27, 88.51], [27.4, 98.54, 86.52], [27.8, 99, 88.42], [29.4, 92.43, 91.9], [29.4, 93.22, 92.45], [29.2, 92.86, 91.24], [28.2, 93.63, 87.09], [28.9, 90.24, 88.05], [27.6, 95.74, 85.69], [25.5, 99.42, 78.57], [27.9, 95.46, 86.89], [28.8, 95.09, 90.86], [28.3, 97.03, 89.77], [29.3, 90.1, 89.8], [28.8, 96.08, 96.45], [28.8, 91.34, 88.33], [28.4, 94.64, 88.67], [28.9, 94.02, 90.61], [29.8, 90.65, 92.55], [31.1, 72.89, 85.27], [28.9, 81.8, 82.36], [26.3, 99.2, 81.93], [28.4, 88.82, 84.85], [30.5, 73.31, 83.08], [27.3, 87.92, 79.55], [26.2, 99.67, 81.76], [25.7, 99.77, 79.63], [27.8, 87.17, 87.19], [27.2, 79.64, 74.12], [27.2, 81.41, 75.19], [27.9, 75.66, 74.36], [28.2, 85, 81.51], [28.2, 86.52, 82.49], [28.6, 81.79, 81.09], [28.7, 83.16, 82.42], [28.8, 82.63, 82.49], [28.1, 86.64, 82.13], [29.2, 86.41, 86.8], [29.6, 83.62, 86.64], [29.6, 83.63, 86.65], [29.6, 90.16, 91.25], [29.8, 85.48, 88.86], [30.7, 76.55, 86.33], [30.3, 70.49, 80.21], [30.5, 69.82, 80.51], [30.8, 69.06, 81.14], [30.3, 69.82, 79.22], [30.2, 66.48, 76.92], [29.9, 69.23, 77.74], [30.4, 70.15, 80.36], [30.6, 68.49, 79.93], [29.7, 70.85, 78.1], [28.6, 59.93, 66.81], [29.2, 59.85, 68.75], [29.5, 66.62, 74.41], [27.8, 61.96, 65.47], [25.3, 67.92, 60.82], [25, 81.51, 66.99], [24.74, 87.11, 68.81], [24.8, 87.86, 69.57], [25.1, 77.1, 65.02], [24.7, 78.7, 67.3], [25.2, 63.34, 58.09], [25.7, 65.64, 60.86], [25.1, 77.98, 65.48], [25.1, 71.77, 62.21], [25.3, 54.46, 53.71], [24.7, 51.88, 50.78], [25, 64.78, 58.24], [25.2, 84.03, 69.05], [26.3, 83.2, 72.74], [23, 64.9, 52.43], [23.7, 60.29, 52.23], [25.3, 61.13, 57.23], [27.5, 72.46, 70.89], [26.4, 77.55, 69.89], [26.2, 73.7, 66.99], [23.9, 81.92, 63.32], [23.2, 69.01, 54.91], [24.3, 62.63, 55.22], [25, 51.25, 51.24], [25, 56.625, 54.01], [25, 57.62, 54.53], [25, 71.24, 61.61], [24.6, 85.22, 67.46], [25.1, 84.23, 68.79], [25.7, 86.55, 72.31], [26.9, 79, 72.59], [24.3, 62.97, 55.24], [23.8, 59.92, 52.33], [24.7, 79.73, 65], [25.2, 87.2, 70.89], [22.4, 70.87, 53.27], [22.6, 64.33, 
51.05], [20, 56.84, 41.34], [19.6, 55.97, 40.08], [21.4, 63.86, 47.6], [21.3, 56.45, 44.3], [18.6, 34.4, 30.43], [18.7, 32.37, 29.91], [19, 45.77, 35.12], [20.1, 51.61, 39.59], [21.3, 54.37, 43.45], [21.7, 71.69, 51.71], [21.9, 75.45, 53.9], [21.5, 47.76, 41.19], [20.1, 53.6, 40.34], [21.2, 60.74, 45.8], [21, 70.17, 49.1], [21.6, 56.01, 44.85], [21.3, 62.13, 46.63], [21.6, 67.64, 49.72], [19.1, 60.1, 40.41], [17.9, 53.22, 35.37], [19.3, 62.72, 41.83], [18.3, 93.7, 49.87], [19.3, 88.01, 51.01], [19.2, 61.13, 41.01], [19.7, 48.08, 37.4], [19.2, 60.72, 41.11], [20.5, 61.55, 44.35], [22, 64.28, 49.39]]
y = [32.41, 34.93, 33, 30.69, 33.04, 33.83, 35.32, 34.65, 33.74, 31.45, 32.13, 32.21, 30.57, 30.48, 28.98, 30.06, 32.34, 38.66, 35.83, 36.58, 37.12, 35.36, 35.99, 35.97, 32.69, 33.05, 33.65, 33.25, 34.66, 35.97, 35.4952, 35.35, 35.7454, 34.9792, 34.3755, 35.809, 35.9238, 35.8182, 36.2126, 36.3648, 36.6559, 35.6048, 35.2835, 36.2347, 36.1388, 36.4947, 37.6265, 37.4104, 36.5614, 36.5002, 37.7603, 37.2834, 36.4108, 36.0211, 34.8442, 34.5706, 33.6497, 28.6073, 32.3691, 33.8309, 34.7854, 34.7854, 34.4016, 35.3682, 37.5852, 39.5721, 39.6482, 37.064, 36.2043, 35.9454, 34.7678, 34.3286, 34.5021, 34.9351, 36.5426, 36.4286, 36.602, 35.4223, 34.8491, 34.9475, 32.5374, 34.4008, 36.2489, 36.2118, 35.9979, 36.1951, 35.2621, 34.7222, 36.42758, 36.62781, 34.92295, 34.20356, 33.12341, 33.41995, 34.31366, 33.88733, 34.99557, 34.08932, 33.45712, 31.52066, 31.11321, 31.01933, 33.43293, 33.62164, 33.10863, 33.37417, 33.46632, 32.7838, 34.63174, 35.09647, 35.00071, 36.53666, 35.88498, 35.37119, 34.37705, 33.88807, 34.90083, 34.59718, 34.14731, 33.97156, 34.66636, 34.44469, 33.08487, 30.35254, 31.1625, 32.92542, 30.65265, 28.73976, 30.29222, 30.27743, 30.41306, 29.1695, 27.96457, 27.53616, 28.27566, 28.77334, 27.8593, 24.33573, 25.61012, 27.53281, 29.57958, 29.71556, 25.09585, 25.92506, 28.03225, 29.53976, 29.97617, 29.55075, 28.36101, 25.84153, 26.12548, 25.64531, 26.89404, 26.53279, 28.20822, 28.9811, 29.23923, 29.99869, 30.80204, 26.61131, 26.25268, 28.84218, 30.27573, 25.82562, 25.11705, 23.73354, 23.53936, 24.42389, 24.21438, 22.80012, 22.82548, 22.7121, 22.99725, 23.70374, 24.25557, 24.08455, 23.258, 22.7148, 23.49153, 23.66313, 23.636, 23.77006, 23.95503, 23.15424, 22.24213, 22.60156, 23.21879, 23.83092, 23.27122, 22.94566, 23.00742, 23.39508, 24.40417]
# [30.2, 73.96, 82.32] corresponds to the first y value, 32.41; x and y have the same length
for degree in range(1, 5):
    # Expand the 3 raw features into all polynomial terms up to `degree`
    poly_reg = PolynomialFeatures(degree=degree)
    X_poly = poly_reg.fit_transform(x)

    # Ordinary least squares on the expanded feature matrix
    regr = LinearRegression(fit_intercept=True)
    regr.fit(X_poly, y)

    print("degree =", degree)
    print("coefficients =", regr.coef_)
    print("intercept =", regr.intercept_)
    print("R^2 =", regr.score(X_poly, y))
    print()
```

Output:

```
degree = 1
coefficients = [ 0.         -0.09243509 -0.07032063  0.31838939]
intercept = 15.772465625245767
R^2 = 0.9528875969548224

degree = 2
coefficients = [ 0.00000000e+00 -1.52507511e+01 -1.82011542e+00  3.43811400e+00
  4.70491199e-01  1.14542322e-01 -2.30460753e-01  6.02909527e-03
 -2.61756659e-02  3.09742973e-02]
intercept = 150.57741878715817
R^2 = 0.9621763866367594

degree = 3
coefficients = [ 0.00000000e+00  5.17461566e+02  7.78097480e+01 -1.39781965e+02
 -3.16044449e+01 -9.93323848e+00  1.79092101e+01 -7.22557189e-01
  2.83641108e+00 -2.67313933e+00  6.46687046e-01  3.17954381e-01
 -5.70397616e-01  4.63496209e-02 -1.83637923e-01  1.72517121e-01
  2.23932422e-03 -1.35083511e-02  2.71238318e-02 -1.74995129e-02]
intercept = -2843.0392759785514
R^2 = 0.9649641979451274

degree = 4
coefficients = [-5.65646882e+06 -4.75174376e+03 -1.51101468e+03  1.35773860e+03
  3.08738915e+02  2.44176652e+02 -2.18055377e+02  2.83342332e+01
 -7.39548730e+01  4.19875929e+01 -8.29884689e+00 -1.30025795e+01
  1.10569580e+01 -3.36536023e+00  8.49862070e+00 -4.62116129e+00
 -2.13926206e-01  1.06424752e+00 -1.51948650e+00  6.39448832e-01
  4.41226170e-02  2.28315169e-01 -1.39984265e-01  9.54236680e-02
 -2.35869163e-01  1.02174188e-01  1.17909032e-02 -5.66655340e-02
  7.95922894e-02 -2.77677658e-02  5.29802012e-04 -3.64901444e-03
  8.60646429e-03 -8.34487268e-03  2.33528744e-03]
intercept = 5683688.619945288
R^2 = 0.8103905412113526
```

Explanation of the output fields:

degree: the polynomial degree, i.e. the highest power to which each independent variable is raised.

coefficients: the coefficients, one per term of the final formula. The leading 0 appears because PolynomialFeatures prepends a bias column of ones, whose weight is absorbed by the separately fitted intercept; the mapping from coefficients to terms is shown in the sketch after this list.

intercept: the constant added once the formula has been assembled.

R^2: the coefficient of determination; the closer it is to 1, the better the fit.
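To see which polynomial term each printed coefficient belongs to, PolynomialFeatures can name its output columns. A minimal sketch, assuming scikit-learn >= 1.0 (which provides get_feature_names_out):

```python
from sklearn.preprocessing import PolynomialFeatures

# Any 3-column sample is enough for fit() to record the feature count
poly = PolynomialFeatures(degree=2)
poly.fit([[1.0, 2.0, 3.0]])

# Prints ['1' 'a' 'b' 'c' 'a^2' 'a b' 'a c' 'b^2' 'b c' 'c^2'],
# matching the 10 coefficients printed for degree 2 above
print(poly.get_feature_names_out(['a', 'b', 'c']))
```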

The results show that once the degree reaches 4, R^2 drops to about 0.81, so that fit is useless; only the results at degree 3 and below are reliable. (Strictly speaking, adding higher-degree terms can never lower R^2 on the training data for an exact least-squares solve, so the drop here, together with the huge intercept of about 5.7e6, points to the degree-4 design matrix being numerically ill-conditioned.)
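If the higher-degree fits are still wanted, one way to tame the instability is to standardize the expanded features before the regression. A sketch using a scikit-learn Pipeline (an assumption of mine, not part of the original post):

```python
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Scaling the polynomial terms keeps the degree-4 design matrix better
# conditioned; x and y are the arrays defined above.
model = make_pipeline(
    PolynomialFeatures(degree=4, include_bias=False),  # no constant column
    StandardScaler(),
    LinearRegression(),  # fits the intercept itself
)
model.fit(x, y)
print("R^2 =", model.score(x, y))
```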

