Sales Forecasting 02 (Initial Data Processing: a Prophet Baseline)

We start by trying a separate sales forecast for each store.

Import the libraries

import pandas as pd 
import numpy as np 
import matplotlib.pyplot as plt 
import seaborn as sns 
plt.style.use("fivethirtyeight")
plt.rcParams["font.sans-serif"] = ["Microsoft YaHei"]
plt.rcParams["axes.unicode_minus"] = False
import warnings
warnings.filterwarnings("ignore")
from sklearn.preprocessing import OneHotEncoder

from fbprophet import Prophet

Load the data

We first explore with the uncompressed data. Per the earlier exploration, the row ID (行ID) and the raw 日期 column can both be dropped once 日期 has been converted into the datetime column 运营日期 (operating date).

path_train = "../preocess_data/train_data_o.csv"
path_test = "../data/test_data.csv"
data  = pd.read_csv(path_train)
data_test = pd.read_csv(path_test)
data["运营日期"] = pd.to_datetime(data["运营日期"] )
data_test["运营日期"] = pd.to_datetime(data_test["日期"])
data.drop(["行ID","日期"],axis=1,inplace=True) 
data_test.drop(["行ID","日期"],axis=1,inplace=True)
data
| | 商店ID | 商店类型 | 位置 | 地区 | 节假日 | 折扣 | 销量 | 运营日期 |
|---|---|---|---|---|---|---|---|---|
| 0 | 1 | S1 | L3 | R1 | 1 | Yes | 7011.84 | 2018-01-01 |
| 1 | 253 | S4 | L2 | R1 | 1 | Yes | 51789.12 | 2018-01-01 |
| 2 | 252 | S3 | L2 | R1 | 1 | Yes | 36868.20 | 2018-01-01 |
| 3 | 251 | S2 | L3 | R1 | 1 | Yes | 19715.16 | 2018-01-01 |
| 4 | 250 | S2 | L3 | R4 | 1 | Yes | 45614.52 | 2018-01-01 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 188335 | 149 | S2 | L3 | R2 | 1 | Yes | 37272.00 | 2019-05-31 |
| 188336 | 153 | S4 | L2 | R1 | 1 | No | 54572.64 | 2019-05-31 |
| 188337 | 154 | S1 | L3 | R2 | 1 | No | 31624.56 | 2019-05-31 |
| 188338 | 155 | S3 | L1 | R2 | 1 | Yes | 49162.41 | 2019-05-31 |
| 188339 | 152 | S2 | L1 | R1 | 1 | No | 37977.00 | 2019-05-31 |

188340 rows × 8 columns

Feature encoding

商店类型 (store type), 位置 (location), 地区 (region) and 折扣 (discount) are the four candidates for encoding, but following the earlier analysis only 折扣 actually needs it (one-hot).

Encoding the discount flag

enc = OneHotEncoder(drop="if_binary")
enc.fit(data["折扣"].values.reshape(-1,1))
enc.transform(data["折扣"].values.reshape(-1,1)).toarray()
enc.transform(data_test["折扣"].values.reshape(-1,1)).toarray()
array([[0.],
       [0.],
       [0.],
       ...,
       [1.],
       [0.],
       [0.]])
data["折扣"] = enc.transform(data["折扣"].values.reshape(-1,1)).toarray()
data_test["折扣"]  = enc.transform(data_test["折扣"].values.reshape(-1,1)).toarray()
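As a quick illustration of what drop="if_binary" does (a toy example, not part of the original pipeline): a two-category feature collapses to a single 0/1 column instead of two complementary ones, which is why the encoded 折扣 column can be written straight back into the DataFrame.

import numpy as np
from sklearn.preprocessing import OneHotEncoder

demo = np.array([["Yes"], ["No"], ["No"]])
# categories are sorted ("No", "Yes"); drop="if_binary" drops the first
# category of a binary feature, leaving one indicator column for "Yes"
print(OneHotEncoder(drop="if_binary").fit_transform(demo).toarray())
# [[1.]
#  [0.]
#  [0.]]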
data_train_1 = data[data["商店ID"]==1]
data_train_1
| | 商店ID | 商店类型 | 位置 | 地区 | 节假日 | 折扣 | 销量 | 运营日期 | year | month | day | quarter | weekofyear | dayofweek | weekend |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | S1 | L3 | R1 | 1 | 1.0 | 7011.84 | 2018-01-01 | 2018 | 1 | 1 | 1 | 1 | 1 | 0 |
| 607 | 1 | S1 | L3 | R1 | 0 | 1.0 | 42369.00 | 2018-01-02 | 2018 | 1 | 2 | 1 | 1 | 2 | 0 |
| 1046 | 1 | S1 | L3 | R1 | 0 | 1.0 | 50037.00 | 2018-01-03 | 2018 | 1 | 3 | 1 | 1 | 3 | 0 |
| 1207 | 1 | S1 | L3 | R1 | 0 | 1.0 | 44397.00 | 2018-01-04 | 2018 | 1 | 4 | 1 | 1 | 4 | 0 |
| 1752 | 1 | S1 | L3 | R1 | 0 | 1.0 | 47604.00 | 2018-01-05 | 2018 | 1 | 5 | 1 | 1 | 5 | 0 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 186569 | 1 | S1 | L3 | R1 | 0 | 1.0 | 33075.00 | 2019-05-27 | 2019 | 5 | 27 | 2 | 22 | 1 | 0 |
| 187165 | 1 | S1 | L3 | R1 | 0 | 1.0 | 37317.00 | 2019-05-28 | 2019 | 5 | 28 | 2 | 22 | 2 | 0 |
| 187391 | 1 | S1 | L3 | R1 | 0 | 1.0 | 44652.00 | 2019-05-29 | 2019 | 5 | 29 | 2 | 22 | 3 | 0 |
| 187962 | 1 | S1 | L3 | R1 | 0 | 1.0 | 42387.00 | 2019-05-30 | 2019 | 5 | 30 | 2 | 22 | 4 | 0 |
| 188113 | 1 | S1 | L3 | R1 | 1 | 1.0 | 39843.78 | 2019-05-31 | 2019 | 5 | 31 | 2 | 22 | 5 | 0 |

516 rows × 15 columns

Encoding month and quarter

enc_ = OneHotEncoder(drop="if_binary")
enc_.fit(data_train_1[["month","quarter"]])
OneHotEncoder(drop='if_binary')
data_onehot_train = pd.DataFrame(enc_.transform(data_train_1[["month","quarter"]]).toarray())
data_onehot_list_train = [f"month_{i}" for i in range(1,13)]
data_onehot_list_train.extend([f"quarter_{i}" for i in range(1,5)])
data_onehot_train.columns = data_onehot_list_train

data_onehot_test= pd.DataFrame(enc_.transform(data_test_1[["month","quarter"]]).toarray())
data_onehot_list_test = [f"month_{i}" for i in range(1,13)]
data_onehot_list_test.extend([f"quarter_{i}" for i in range(1,5)])
data_onehot_test.columns = data_onehot_list_test
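Note: the cells above run ahead of themselves. They use the derived month/quarter columns and data_test_1, both of which are only created in the sections below. A sketch of the intended execution order, using the same names that appear later:

# intended order for the month/quarter encoding above
data_train = time_derivation(data)          # time_derivation is defined in the next section
data_test_ = time_derivation(data_test)
data_train_1 = data_train[data_train["商店ID"] == 1]
data_test_1  = data_test_[data_test_["商店ID"] == 1]
# only then fit/transform enc_ on data_train_1[["month","quarter"]]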

Feature derivation

def time_derivation(t, col="运营日期"):
    # derive calendar features from the operating-date column
    t["year"] = t[col].dt.year
    t["month"] = t[col].dt.month
    t["day"] = t[col].dt.day
    t["quarter"] = t[col].dt.quarter
    t["weekofyear"] = t[col].dt.weekofyear  # deprecated in newer pandas; t[col].dt.isocalendar().week is the replacement
    t["dayofweek"] = t[col].dt.dayofweek + 1  # 1 = Monday, ..., 7 = Sunday
    t["weekend"] = (t["dayofweek"] > 5).astype(int)
    return t

data_train  = time_derivation(data)
data_test_  = time_derivation(data_test)

Exploring store by store

Trying store 1 first

For store 1, the test set is exactly the 61 days (roughly two months) that follow the training set, and every other store has the same layout. 销量 (sales) is the target y and the remaining columns act as external regressors, so Prophet is a natural first attempt.

# training set
data_train_1 = data_train[data_train["商店ID"] ==1]
data_train_1 
(data_train_1 is the same 516 × 15 DataFrame already shown in the encoding section above.)

# test set
data_test_1 = data_test_[data_test_["商店ID"] ==1]
plt.figure(figsize=(16,8))
plt.plot(data_train_1["运营日期"],data_train_1["销量"])
plt.xlabel("日期",fontsize= 20)
plt.ylabel("销量",fontsize= 20)
plt.title("1号店的销量",fontsize=20)
Text(0.5, 1.0, '1号店的销量')


[Figure: 1号店的销量 — daily sales of store 1 over the full training period]

plot_data = data_train_1[ (data_train_1['运营日期']>='2018-01-01') & 
              (data_train_1['运营日期'] <'2018-02-28' )]

plt.plot( plot_data["运营日期"], plot_data["销量"], '.-' )
plt.xticks(rotation = 90)
plt.grid( axis='y' )


[Figure: store 1 daily sales, 2018-01-01 through 2018-02-27]

data_train_1[["节假日","折扣"]]
| | 节假日 | 折扣 |
|---|---|---|
| 0 | 1 | 1.0 |
| 607 | 0 | 1.0 |
| 1046 | 0 | 1.0 |
| 1207 | 0 | 1.0 |
| 1752 | 0 | 1.0 |
| ... | ... | ... |
| 186569 | 0 | 1.0 |
| 187165 | 0 | 1.0 |
| 187391 | 0 | 1.0 |
| 187962 | 0 | 1.0 |
| 188113 | 1 | 1.0 |

516 rows × 2 columns

data_train_1["折扣"].value_counts()
0.0    280
1.0    236
Name: 折扣, dtype: int64
data_train_1[data_train_1["节假日"]==1]["折扣"].value_counts()
1.0    34
0.0    34
Name: 折扣, dtype: int64

A first pass with Prophet

The effect of holiday variables

# mark which dates are holidays
holidays_= pd.DataFrame({
  'holiday': 'holiday_',
  'ds':data_train_1[data_train_1["节假日"]==1]["运营日期"].values,
  'lower_window': 0,
  'upper_window': 1,
})

# mark discount days
discounts = pd.DataFrame({
  'holiday': 'discount',
  'ds':data_train_1[data_train_1["折扣"]==1]["运营日期"].values,
  'lower_window': -1,
  'upper_window': 0,
})

holidays = pd.concat([holidays_,discounts])

m = Prophet(holidays = holidays,
            changepoint_prior_scale=0.005,
            seasonality_prior_scale=10,
            yearly_seasonality=True,
            weekly_seasonality=True,
#             changepoint_prior_scale = 35,
#             seasonality_prior_scale = 0.5,
#             changepoint_range  = 0.3 ,
            seasonality_mode = "additive",
            holidays_prior_scale = 10,
           )
m.add_country_holidays(country_name='CHN')
<fbprophet.forecaster.Prophet at 0x18016b483c8>

Adding other external regressors

# m.add_seasonality(name='monthly', period=30.5, fourier_order=5,condition_name="month")
# m.add_seasonality(name='quarterly', period=91.25, fourier_order=4,condition_name='quarter')
# m.add_seasonality(name='weekly', period=7, fourier_order=3, condition_name='dayofweek')
m.add_regressor(name="weekend",mode='additive')
<fbprophet.forecaster.Prophet at 0x18016b483c8>
data_train_prophet =  data_train_1[["运营日期","销量","dayofweek","weekend","折扣"]].copy()
data_train_prophet.columns= ["ds","y","dayofweek","weekend","折扣"]
m.fit(data_train_prophet)
INFO:fbprophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.





<fbprophet.forecaster.Prophet at 0x18016b483c8>
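As a quick sanity check on the built-in Chinese holiday calendar, Prophet records the holiday names it actually saw in the training window (attribute per the Prophet docs; worth verifying on your installed fbprophet version):

m.train_holiday_names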

Forecasting the next 61 days

horizon = 61
future = m.make_future_dataframe(periods= horizon)
future
| | ds |
|---|---|
| 0 | 2018-01-01 |
| 1 | 2018-01-02 |
| 2 | 2018-01-03 |
| 3 | 2018-01-04 |
| 4 | 2018-01-05 |
| ... | ... |
| 572 | 2019-07-27 |
| 573 | 2019-07-28 |
| 574 | 2019-07-29 |
| 575 | 2019-07-30 |
| 576 | 2019-07-31 |

577 rows × 1 columns

a = list(data_train_1["折扣"].values)
a.extend(data_test_1["折扣"].values)
len(a)
577
future["year"] = future["ds"].dt.year
future["month"] = future["ds"].dt.month
future["quarter"] = future["ds"].dt.quarter
future["dayofweek"] = future["ds"].dt.dayofweek+1
future["weekend"] = (future["dayofweek"]>5).astype(int)
future["折扣"] = np.array(a)
forecast = m.predict(future)
forecast[['ds', 'yhat', 'yhat_lower', 'yhat_upper',
         'trend','trend_lower','trend_upper',
         'weekly']].head(5)
| | ds | yhat | yhat_lower | yhat_upper | trend | trend_lower | trend_upper | weekly |
|---|---|---|---|---|---|---|---|---|
| 0 | 2018-01-01 | 30775.770424 | 20889.757685 | 40936.492598 | 26748.747097 | 26748.747097 | 26748.747097 | 1189.775785 |
| 1 | 2018-01-02 | 39995.262978 | 30214.814101 | 49360.974394 | 26756.473204 | 26756.473204 | 26756.473204 | -1664.402306 |
| 2 | 2018-01-03 | 44372.074808 | 34877.677359 | 54502.685711 | 26764.199312 | 26764.199312 | 26764.199312 | 1178.083305 |
| 3 | 2018-01-04 | 42600.412597 | 33153.613066 | 52712.333605 | 26771.925420 | 26771.925420 | 26771.925420 | -617.245171 |
| 4 | 2018-01-05 | 49308.412161 | 39745.454258 | 58877.905254 | 26779.651528 | 26779.651528 | 26779.651528 | -1129.635885 |
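One caveat on filling the regressor columns of future: extending a plain Python list works here only because both frames are sorted by date with exactly one row per day. A slightly safer sketch (same result under those assumptions) aligns the discount flag by date instead of by row position:

# align the discount regressor by date rather than by position
discount_by_date = pd.concat([
    data_train_1.set_index("运营日期")["折扣"],
    data_test_1.set_index("运营日期")["折扣"],
])
future["折扣"] = future["ds"].map(discount_by_date).values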
plt.figure(figsize=(12,7))
plt.plot(data_train_1[["运营日期"]],data_train_1[["销量"]],label = "真实值")
plt.plot(forecast.loc[(forecast.index<517)&(forecast.index>455),["ds"]],
         forecast.loc[(forecast.index<517)&(forecast.index>455),["yhat"]],label = "预测值")
plt.xlabel("ds")
plt.ylabel("销量")
plt.title("prophet的验证")
plt.legend()
<matplotlib.legend.Legend at 0x1802269bb48>


[Figure: prophet的验证 — actual sales vs. the 61-day prediction for store 1]

np.array(forecast.loc[(forecast.index<517)&(forecast.index>455),["yhat"]])
array([[25617.87079196],
       [28494.61718356],
       [26754.69257591],
       [18749.93589485],
       [37333.08107403],
       [45267.56610754],
       [40559.29950886],
       [37876.496955  ],
       [48141.90279336],
       [27787.36958936],
       [27512.35859756],
       [27134.7722622 ],
       [28060.79911766],
       [22017.4621406 ],
       [26628.40357149],
       [16879.79807243],
       [39869.34910728],
       [34012.96673227],
       [46561.0685432 ],
       [43356.350005  ],
       [50279.07500986],
       [30475.47854009],
       [33662.95616483],
       [32207.92923537],
       [32029.12790642],
       [38878.69776961],
       [41322.45509755],
       [35289.42433161],
       [32722.54429931],
       [40463.1729284 ],
       [27064.96799   ],
       [45561.0861559 ],
       [52296.89175555],
       [54613.93978621],
       [48441.39667443],
       [52950.98586258],
       [37124.29614574],
       [28245.81736758],
       [33458.53713267],
       [41473.1328489 ],
       [43579.87026573],
       [37187.3847213 ],
       [27011.29818529],
       [48500.29246951],
       [46540.19772446],
       [45827.8909742 ],
       [44959.19819402],
       [59731.23505757],
       [35801.89591473],
       [32621.89790062],
       [35113.30049962],
       [32945.15253098],
       [32042.23570321],
       [38163.17905462],
       [32652.05219235],
       [44677.94503143],
       [41411.4922227 ],
       [43850.23619555],
       [41665.70544608],
       [40854.0543816 ],
       [33941.84453036]])
def symmetric_mean_absolute_percentage_error(y_true, y_pred):
    y_true, y_pred = np.array(y_true), np.array(y_pred)
    return np.sum(np.abs(y_true - y_pred) * 2) / np.sum(np.abs(y_true) + np.abs(y_pred))

def prophet_smape(y_true, y_pred):
    smape_val = symmetric_mean_absolute_percentage_error(y_true, y_pred)
    return 'SMAPE', smape_val, False
prophet_smape(np.array(data_train_1.tail(61)["销量"]).reshape(-1,1),
             np.array(forecast.loc[(forecast.index<517)&(forecast.index>455),["yhat"]]))
('SMAPE', 0.2754594181133183, False)
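For reference, the metric implemented above is the aggregate (sum-over-sum) form of SMAPE rather than the more common mean of per-point ratios:

$$\mathrm{SMAPE} = \frac{\sum_i 2\,\lvert y_i - \hat{y}_i\rvert}{\sum_i \left(\lvert y_i\rvert + \lvert \hat{y}_i\rvert\right)}$$

Both variants lie in [0, 2]; the sum-based form simply weights high-volume days more heavily.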
from sklearn.metrics import mean_squared_error,r2_score
mean_squared_error(np.array(data_train_1.tail(61)["销量"]).reshape(-1,1),
             np.array(forecast.loc[(forecast.index<517)&(forecast.index>455),["yhat"]]))**0.5
13748.092397062905
plt.figure(figsize=(12,8))  # m.plot creates its own figure, hence the empty "<Figure ...>" output below
fig1 = m.plot(forecast)
plt.title("Prophet未来两个月的预测销售")
Text(0.5, 1.0, 'Prophet未来两个月的预测销售')




<Figure size 864x576 with 0 Axes>

[Figure: Prophet未来两个月的预测销售 — full-history forecast with the 61-day extension]


forecast_prophet = forecast[forecast["ds"]>"2019-05-31"]
forecast_prophet.to_csv("../preocess_data/prophet_pre.csv")
forecast_prophet
| | ds | trend | yhat_lower | yhat_upper | weekly | yearly | yhat |
|---|---|---|---|---|---|---|---|
| 516 | 2019-06-01 | 30717.593081 | 23978.988018 | 43887.743153 | -543.034260 | -692.226218 | 33941.844530 |
| 517 | 2019-06-02 | 30725.274734 | 27820.204589 | 46776.099986 | 1586.458531 | -1009.473895 | 37240.329938 |
| 518 | 2019-06-03 | 30732.956388 | 20579.302294 | 40967.934382 | 1189.775785 | -1288.709842 | 30634.022331 |
| 519 | 2019-06-04 | 30740.638041 | 16933.731881 | 37228.678796 | -1664.402306 | -1524.746417 | 27551.489318 |
| 520 | 2019-06-05 | 30748.319695 | 19840.260773 | 39788.224886 | 1178.083305 | -1712.887352 | 30213.515648 |
| ... | ... | ... | ... | ... | ... | ... | ... |
| 572 | 2019-07-27 | 31147.765672 | 26002.865718 | 45344.820879 | -543.034260 | -1416.639602 | 35126.162378 |
| 573 | 2019-07-28 | 31155.447326 | 26388.789659 | 46396.207525 | 1586.458531 | -1650.182907 | 37029.793518 |
| 574 | 2019-07-29 | 31163.128979 | 20421.281612 | 39356.141199 | 1189.775785 | -1842.411772 | 30510.492992 |
| 575 | 2019-07-30 | 31170.810632 | 17209.813312 | 37025.898551 | -1664.402306 | -1994.025550 | 27512.382776 |
| 576 | 2019-07-31 | 31178.492286 | 20642.313505 | 40558.963843 | 1178.083305 | -2106.380041 | 30250.195551 |

(Only the main components are shown here; the holiday columns and the _lower/_upper duplicates of trend/weekly/yearly are omitted for readability.)

61 rows × 55 columns

from fbprophet.diagnostics import cross_validation
# 400-day initial training window, a new cutoff every 20 days, 61-day forecast horizon → 3 folds
df_cv = cross_validation(m, initial='400 days', period='20 days', horizon = '61 days')
INFO:fbprophet:Making 3 forecasts with cutoffs between 2019-02-19 00:00:00 and 2019-03-31 00:00:00



df_cv
| | ds | yhat | yhat_lower | yhat_upper | y | cutoff |
|---|---|---|---|---|---|---|
| 0 | 2019-02-20 | 39169.948020 | 31173.118340 | 48219.377964 | 36735.00 | 2019-02-19 |
| 1 | 2019-02-21 | 45788.522253 | 37014.177142 | 54540.948835 | 43452.00 | 2019-02-19 |
| 2 | 2019-02-22 | 26886.260775 | 17929.663460 | 35342.798292 | 18024.00 | 2019-02-19 |
| 3 | 2019-02-23 | 32745.811135 | 23818.474855 | 41838.147688 | 26976.00 | 2019-02-19 |
| 4 | 2019-02-24 | 35213.159136 | 25766.012478 | 44371.683291 | 34272.00 | 2019-02-19 |
| ... | ... | ... | ... | ... | ... | ... |
| 178 | 2019-05-27 | 44303.517769 | 33653.940290 | 54457.758109 | 33075.00 | 2019-03-31 |
| 179 | 2019-05-28 | 41422.737708 | 31734.476845 | 51089.499696 | 37317.00 | 2019-03-31 |
| 180 | 2019-05-29 | 43684.591502 | 34259.138724 | 52989.369602 | 44652.00 | 2019-03-31 |
| 181 | 2019-05-30 | 41666.494248 | 31393.191189 | 51114.015019 | 42387.00 | 2019-03-31 |
| 182 | 2019-05-31 | 40736.571206 | 30548.912252 | 50292.000845 | 39843.78 | 2019-03-31 |

183 rows × 6 columns

from fbprophet.diagnostics import performance_metrics
df_p = performance_metrics(df_cv)
df_p
| | horizon | mse | rmse | mae | mape | mdape | coverage |
|---|---|---|---|---|---|---|---|
| 0 | 6 days | 5.036577e+07 | 7096.884863 | 4918.194810 | 0.225068 | 0.117936 | 0.888889 |
| 1 | 7 days | 6.514631e+07 | 8071.326656 | 5856.013065 | 0.269997 | 0.145172 | 0.777778 |
| 2 | 8 days | 6.050749e+07 | 7778.655803 | 5522.820783 | 0.259015 | 0.144838 | 0.833333 |
| 3 | 9 days | 5.499201e+07 | 7415.659853 | 4861.737092 | 0.227230 | 0.089064 | 0.833333 |
| 4 | 10 days | 7.613189e+07 | 8725.358811 | 6104.734432 | 0.514230 | 0.144838 | 0.722222 |
| 5 | 11 days | 8.454131e+07 | 9194.635103 | 6733.888274 | 0.544253 | 0.144838 | 0.666667 |
| 6 | 12 days | 5.702142e+07 | 7551.253189 | 5739.286869 | 0.466166 | 0.125538 | 0.722222 |
| 7 | 13 days | 6.444249e+07 | 8027.608125 | 6066.512978 | 0.454786 | 0.128257 | 0.777778 |
| 8 | 14 days | 6.994715e+07 | 8363.441080 | 6519.985453 | 0.473044 | 0.146653 | 0.722222 |
| 9 | 15 days | 7.334660e+07 | 8564.263056 | 6887.624032 | 0.489976 | 0.190327 | 0.722222 |
| 10 | 16 days | 5.113090e+07 | 7150.587245 | 5473.236556 | 0.194382 | 0.120843 | 0.833333 |
| 11 | 17 days | 4.776812e+07 | 6911.448113 | 5177.201837 | 0.168350 | 0.120843 | 0.833333 |
| 12 | 18 days | 4.879434e+07 | 6985.294200 | 5304.380258 | 0.172545 | 0.134687 | 0.833333 |
| 13 | 19 days | 3.374120e+07 | 5808.717248 | 4594.632498 | 0.160506 | 0.130753 | 0.833333 |
| 14 | 20 days | 2.835123e+07 | 5324.586964 | 4093.307775 | 0.137914 | 0.104959 | 0.888889 |
| 15 | 21 days | 2.597542e+07 | 5096.608660 | 3953.328901 | 0.125678 | 0.108755 | 0.888889 |
| 16 | 22 days | 3.005056e+07 | 5481.838863 | 4165.879585 | 0.135987 | 0.108755 | 0.833333 |
| 17 | 23 days | 3.412751e+07 | 5841.875501 | 4694.064916 | 0.167489 | 0.132673 | 0.833333 |
| 18 | 24 days | 3.173372e+07 | 5633.269469 | 4438.663112 | 0.161150 | 0.108755 | 0.833333 |
| 19 | 25 days | 2.580198e+07 | 5079.564654 | 3966.904822 | 0.149305 | 0.105248 | 0.888889 |
| 20 | 26 days | 6.629733e+07 | 8142.317261 | 5613.016240 | 0.280759 | 0.115520 | 0.777778 |
| 21 | 27 days | 1.040308e+08 | 10199.548006 | 7449.355947 | 0.340696 | 0.177644 | 0.611111 |
| 22 | 28 days | 1.062327e+08 | 10306.926933 | 7764.975241 | 0.351447 | 0.178371 | 0.611111 |
| 23 | 29 days | 9.940728e+07 | 9970.319716 | 7215.224151 | 0.317101 | 0.148194 | 0.666667 |
| 24 | 30 days | 1.241672e+08 | 11143.035696 | 8762.404184 | 0.626465 | 0.257710 | 0.555556 |
| 25 | 31 days | 1.802528e+08 | 13425.824888 | 10750.960946 | 0.672327 | 0.333554 | 0.444444 |
| 26 | 32 days | 1.466661e+08 | 12110.576143 | 9492.556532 | 0.555481 | 0.257710 | 0.500000 |
| 27 | 33 days | 1.176510e+08 | 10846.702382 | 8278.451679 | 0.516509 | 0.193604 | 0.611111 |
| 28 | 34 days | 1.384843e+08 | 11767.935920 | 9222.302781 | 0.530433 | 0.253314 | 0.555556 |
| 29 | 35 days | 1.890791e+08 | 13750.603926 | 10618.997444 | 0.542913 | 0.283066 | 0.500000 |
| 30 | 36 days | 1.901535e+08 | 13789.615377 | 10244.912611 | 0.249752 | 0.253314 | 0.555556 |
| 31 | 37 days | 1.558550e+08 | 12484.190357 | 9460.521845 | 0.216429 | 0.249387 | 0.555556 |
| 32 | 38 days | 1.536181e+08 | 12394.278523 | 9451.254663 | 0.222619 | 0.249387 | 0.611111 |
| 33 | 39 days | 1.480860e+08 | 12169.057681 | 8861.828873 | 0.210835 | 0.241381 | 0.611111 |
| 34 | 40 days | 1.252158e+08 | 11189.986285 | 7736.677584 | 0.182831 | 0.147683 | 0.722222 |
| 35 | 41 days | 8.926055e+07 | 9447.780057 | 7120.294127 | 0.179871 | 0.129423 | 0.722222 |
| 36 | 42 days | 6.410505e+07 | 8006.563142 | 6015.148609 | 0.163817 | 0.120477 | 0.777778 |
| 37 | 43 days | 4.810944e+07 | 6936.096997 | 5269.969759 | 0.171384 | 0.120477 | 0.833333 |
| 38 | 44 days | 4.262756e+07 | 6528.978153 | 4811.961491 | 0.157647 | 0.107473 | 0.833333 |
| 39 | 45 days | 4.413978e+07 | 6643.777542 | 5155.720735 | 0.170257 | 0.120477 | 0.888889 |
| 40 | 46 days | 7.963271e+07 | 8923.716078 | 6627.717504 | 0.295001 | 0.132407 | 0.777778 |
| 41 | 47 days | 1.036545e+08 | 10181.087101 | 7713.244395 | 0.314883 | 0.205121 | 0.666667 |
| 42 | 48 days | 1.164146e+08 | 10789.561689 | 8732.133775 | 0.343534 | 0.239747 | 0.555556 |
| 43 | 49 days | 1.167080e+08 | 10803.149147 | 8749.359553 | 0.315651 | 0.229185 | 0.555556 |
| 44 | 50 days | 1.329107e+08 | 11528.688433 | 9989.718561 | 0.375879 | 0.263346 | 0.444444 |
| 45 | 51 days | 1.857242e+08 | 13628.065153 | 11618.643655 | 0.409629 | 0.263346 | 0.333333 |
| 46 | 52 days | 1.526817e+08 | 12356.442706 | 10289.614124 | 0.294648 | 0.230352 | 0.388889 |
| 47 | 53 days | 1.239579e+08 | 11133.636525 | 9111.686535 | 0.275983 | 0.190616 | 0.444444 |
| 48 | 54 days | 1.335311e+08 | 11555.566150 | 9144.473747 | 0.269482 | 0.195774 | 0.500000 |
| 49 | 55 days | 1.803036e+08 | 13427.718151 | 10451.743405 | 0.289974 | 0.230276 | 0.500000 |
| 50 | 56 days | 1.899438e+08 | 13782.008262 | 10445.910805 | 0.246963 | 0.230276 | 0.555556 |
| 51 | 57 days | 1.620768e+08 | 12730.939480 | 10040.752197 | 0.226912 | 0.251473 | 0.500000 |
| 52 | 58 days | 1.608247e+08 | 12681.666976 | 10108.406902 | 0.234770 | 0.251473 | 0.555556 |
| 53 | 59 days | 1.473948e+08 | 12140.626141 | 8939.009436 | 0.209127 | 0.248878 | 0.666667 |
| 54 | 60 days | 1.296210e+08 | 11385.123358 | 8287.961040 | 0.195689 | 0.196069 | 0.722222 |
| 55 | 61 days | 9.342384e+07 | 9665.600614 | 7432.173954 | 0.184672 | 0.160993 | 0.722222 |

Grid search

(This code does work; the run below was cut short for lack of time.)

import itertools

# hyperparameters to tune
param_grid = {
    # note: the value 35 is almost certainly a typo for 0.35, but it is what
    # was actually run (and it comes out on top in the results below)
    'changepoint_prior_scale': [0.001, 0.005, 0.01, 0.15, 0.2, 0.25, 0.30, 35, 0.4, 0.45, 0.5],
    'seasonality_prior_scale': [0.05, 0.1, 0.5, 1, 5, 10, 15],
    "changepoint_range": [i / 10 for i in range(3, 10)],
    "seasonality_mode": ['additive', 'multiplicative'],
    "holidays_prior_scale": [0.05, 0.1, 0.5, 1, 5, 10, 15]
}

# build every hyperparameter combination
all_params = [dict(zip(param_grid.keys(), v)) for v in itertools.product(*param_grid.values())]

# record the validation error of each model
mapes = []

# grid search; note these models get neither the holidays dataframe nor the
# weekend regressor used above, so holidays_prior_scale has nothing to act on
for params in all_params:
    m = Prophet(**params).fit(data_train_prophet)
    df_cv = cross_validation(m, initial='300 days', period='20 days', horizon='61 days')
    df_p = performance_metrics(df_cv, rolling_window=1)
    mapes.append(df_p['mape'].values[0])
INFO:fbprophet:Disabling yearly seasonality. Run prophet with yearly_seasonality=True to override this.
INFO:fbprophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.
WARNING:fbprophet.models:Optimization terminated abnormally. Falling back to Newton.
INFO:fbprophet:Making 8 forecasts with cutoffs between 2018-11-11 00:00:00 and 2019-03-31 00:00:00





WARNING:fbprophet.models:Optimization terminated abnormally. Falling back to Newton.



---------------------------------------------------------------------------

RuntimeError                              Traceback (most recent call last)

C:\ProgramData\Anaconda3\lib\site-packages\fbprophet\models.py in fit(self, stan_init, stan_data, **kwargs)
    244         try:
--> 245             self.stan_fit = self.model.optimizing(**args)
    246         except RuntimeError:


C:\ProgramData\Anaconda3\lib\site-packages\pystan\model.py in optimizing(self, data, seed, init, sample_file, algorithm, verbose, as_vector, **kwargs)
    580 
--> 581         ret, sample = fit._call_sampler(stan_args)
    582         pars = pystan.misc._par_vector2dict(sample['par'], m_pars, p_dims)


stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536.pyx in stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536.StanFit4Model._call_sampler()


stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536.pyx in stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536._call_sampler()


RuntimeError: Something went wrong after call_sampler.


During handling of the above exception, another exception occurred:

KeyboardInterrupt                         Traceback (most recent call last)

~\AppData\Local\Temp\ipykernel_27972\1694823081.py in <module>
     20 for params in all_params:
     21     m = Prophet(**params).fit(data_train_prophet)
---> 22     df_cv = cross_validation(m, initial='300 days', period='20 days', horizon = '61 days')
     23     df_p = performance_metrics(df_cv, rolling_window=1)
     24     mapes.append(df_p['mape'].values[0])


C:\ProgramData\Anaconda3\lib\site-packages\fbprophet\diagnostics.py in cross_validation(model, horizon, period, initial, parallel, cutoffs)
    187         predicts = [
    188             single_cutoff_forecast(df, model, cutoff, horizon, predict_columns)
--> 189             for cutoff in tqdm(cutoffs)
    190         ]
    191 


C:\ProgramData\Anaconda3\lib\site-packages\fbprophet\diagnostics.py in <listcomp>(.0)
    187         predicts = [
    188             single_cutoff_forecast(df, model, cutoff, horizon, predict_columns)
--> 189             for cutoff in tqdm(cutoffs)
    190         ]
    191 


C:\ProgramData\Anaconda3\lib\site-packages\fbprophet\diagnostics.py in single_cutoff_forecast(df, model, cutoff, horizon, predict_columns)
    225             'Increase initial window.'
    226         )
--> 227     m.fit(history_c, **model.fit_kwargs)
    228     # Calculate yhat
    229     index_predicted = (df['ds'] > cutoff) & (df['ds'] <= cutoff + horizon)


C:\ProgramData\Anaconda3\lib\site-packages\fbprophet\forecaster.py in fit(self, df, **kwargs)
   1164             self.params = self.stan_backend.sampling(stan_init, dat, self.mcmc_samples, **kwargs)
   1165         else:
-> 1166             self.params = self.stan_backend.fit(stan_init, dat, **kwargs)
   1167 
   1168         # If no changepoints were requested, replace delta with 0s


C:\ProgramData\Anaconda3\lib\site-packages\fbprophet\models.py in fit(self, stan_init, stan_data, **kwargs)
    250             )
    251             args['algorithm'] = 'Newton'
--> 252             self.stan_fit = self.model.optimizing(**args)
    253 
    254         params = dict()


C:\ProgramData\Anaconda3\lib\site-packages\pystan\model.py in optimizing(self, data, seed, init, sample_file, algorithm, verbose, as_vector, **kwargs)
    579         stan_args = pystan.misc._get_valid_stan_args(stan_args)
    580 
--> 581         ret, sample = fit._call_sampler(stan_args)
    582         pars = pystan.misc._par_vector2dict(sample['par'], m_pars, p_dims)
    583         if not as_vector:


stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536.pyx in stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536.StanFit4Model._call_sampler()


stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536.pyx in stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536._call_sampler()


stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536.pyx in stanfit4anon_model_f5236004a3fd5b8429270d00efcc0cf9_7332008770348935536._dict_from_stanargs()


C:\ProgramData\Anaconda3\lib\enum.py in __call__(cls, value, names, module, qualname, type, start)
    313         """
    314         if names is None:  # simple value lookup
--> 315             return cls.__new__(cls, value)
    316         # otherwise, functional API: we're creating a new Enum type
    317         return cls._create_(value, names, module=module, qualname=qualname, type=type, start=start)


KeyboardInterrupt: 
mapes
[0.45648954214880777,
 0.45648954214880777,
 0.45648954214880777,
 0.45648954214880777,
 0.45648954214880777,
 0.45648954214880777,
 0.45648954214880777,
 0.45912830121340914,
 0.45912830121340914,
 0.45912830121340914,
 0.45912830121340914,
 0.45912830121340914,
 0.45912830121340914,
 0.45912830121340914,
 ...]

(Output truncated here for readability; the interrupted run produced hundreds of values, mostly between 0.37 and 0.53.) Notice that every value repeats exactly seven times in a row: holidays_prior_scale is the fastest-varying dimension of the grid (seven candidates), and since these models were built without any holidays it has no effect at all.
# find the best hyperparameters
tuning_results = pd.DataFrame(all_params)
tuning_results['mape'] = mapes
tuning_results.sort_values(by = "mape")
| | changepoint_prior_scale | seasonality_prior_scale | changepoint_range | seasonality_mode | holidays_prior_scale | mape |
|---|---|---|---|---|---|---|
| 5003 | 35.000 | 0.50 | 0.3 | additive | 10.00 | 0.360404 |
| 5001 | 35.000 | 0.50 | 0.3 | additive | 1.00 | 0.360404 |
| 5000 | 35.000 | 0.50 | 0.3 | additive | 0.50 | 0.360404 |
| 4999 | 35.000 | 0.50 | 0.3 | additive | 0.10 | 0.360404 |
| 4998 | 35.000 | 0.50 | 0.3 | additive | 0.05 | 0.360404 |
| ... | ... | ... | ... | ... | ... | ... |
| 95 | 0.001 | 0.05 | 0.9 | multiplicative | 5.00 | 0.526060 |
| 94 | 0.001 | 0.05 | 0.9 | multiplicative | 1.00 | 0.526060 |
| 93 | 0.001 | 0.05 | 0.9 | multiplicative | 0.50 | 0.526060 |
| 92 | 0.001 | 0.05 | 0.9 | multiplicative | 0.10 | 0.526060 |
| 91 | 0.001 | 0.05 | 0.9 | multiplicative | 0.05 | 0.526060 |

7546 rows × 6 columns
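One caveat before reusing these results: the grid-search models were built without the holidays dataframe and the weekend regressor that the hand-tuned model used, which is why holidays_prior_scale was a no-op above. A minimal sketch of a rerun that keeps the comparison fair, reusing holidays, all_params and data_train_prophet from the cells above:

# same grid loop, but each candidate gets the holidays and regressor too,
# so that holidays_prior_scale actually has an effect
mapes = []
for params in all_params:
    m = Prophet(holidays=holidays, **params)
    m.add_country_holidays(country_name='CHN')
    m.add_regressor(name="weekend", mode='additive')
    m.fit(data_train_prophet)
    df_cv = cross_validation(m, initial='300 days', period='20 days', horizon='61 days')
    mapes.append(performance_metrics(df_cv, rolling_window=1)['mape'].values[0])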

