Machine Learning in Action: Industrial Steam Volume Prediction (6)


Article Description

Background

  • Background introduction

The basic principle of thermal power generation: burning fuel heats water into steam, the steam's pressure drives a turbine, and the turbine in turn drives a generator to produce electricity. In this chain of energy conversions, the core factor affecting generation efficiency is the boiler's combustion efficiency, i.e., how effectively fuel combustion heats water into high-temperature, high-pressure steam. Many factors influence combustion efficiency, including adjustable boiler parameters such as fuel feed rate, primary and secondary air, induced draft, return-feed air, and feedwater volume, as well as the boiler's operating conditions, such as bed temperature and pressure, furnace temperature and pressure, and superheater temperature.

  • Task description

Given anonymized boiler sensor data (sampled at minute-level frequency), predict the amount of steam produced based on the boiler's operating conditions.

  • Evaluation

Predictions are evaluated by mean square error (MSE).

Data Description

The data are split into training data (train.txt) and test data (test.txt). The 38 fields "V0"-"V37" serve as the feature variables, and "target" is the target variable. Contestants train a model on the training data and predict the target for the test data; rankings are based on the MSE (mean square error) of the predictions.
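
For reference, the ranking metric is

$$
\mathrm{MSE}=\frac{1}{n}\sum_{i=1}^{n}\left(\hat{y}_i-y_i\right)^2
$$

where $\hat{y}_i$ is the prediction and $y_i$ the true target.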

Data Source

http://tianchi-media.oss-cn-beijing.aliyuncs.com/DSW/Industrial_Steam_Forecast/zhengqi_test.txt

http://tianchi-media.oss-cn-beijing.aliyuncs.com/DSW/Industrial_Steam_Forecast/zhengqi_train.txt

Hands-On

6. Model Fusion

Below we rerun the key steps of the pipeline from the previous post.

Import packages and data

import warnings
warnings.filterwarnings("ignore")
import matplotlib.pyplot as plt
plt.rcParams.update({'figure.max_open_warning': 0})
import seaborn as sns
import pandas as pd
import numpy as np
from scipy import stats
from sklearn.model_selection import train_test_split
from sklearn.model_selection import GridSearchCV, RepeatedKFold, cross_val_score,cross_val_predict,KFold
from sklearn.metrics import make_scorer,mean_squared_error
from sklearn.linear_model import LinearRegression, Lasso, Ridge, ElasticNet
from sklearn.svm import LinearSVR, SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor,AdaBoostRegressor
from xgboost import XGBRegressor
from sklearn.preprocessing import PolynomialFeatures,MinMaxScaler,StandardScaler

with open("./zhengqi_train.txt")  as fr:
    data_train=pd.read_table(fr,sep="\t")
with open("./zhengqi_test.txt") as fr_test:
    data_test=pd.read_table(fr_test,sep="\t")

Merge the data

data_train["oringin"]="train"
data_test["oringin"]="test"
data_all=pd.concat([data_train,data_test],axis=0,ignore_index=True)

Drop features (those whose train/test distributions were found to be inconsistent earlier in this series)

data_all.drop(["V5","V9","V11","V17","V22","V28"],axis=1,inplace=True)

Min-max normalization

cols_numeric=list(data_all.columns)
cols_numeric.remove("oringin")
def scale_minmax(col):
    return (col-col.min())/(col.max()-col.min())
scale_cols = [col for col in cols_numeric if col!='target']
data_all[scale_cols] = data_all[scale_cols].apply(scale_minmax,axis=0)
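
Equivalently, sklearn's MinMaxScaler (already imported above) performs the same (x - min) / (max - min) rescaling; a small alternative sketch:

from sklearn.preprocessing import MinMaxScaler

# alternative sketch: MinMaxScaler implements the same (x - min) / (max - min) mapping
scaled_alt = MinMaxScaler().fit_transform(data_all[scale_cols])   # numpy array, same shape as data_all[scale_cols]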

Plots: explore the features and their relationship with the target


Apply a Box-Cox transform to the features to bring them closer to normality

The Box-Cox transform, proposed by Box and Cox in 1964, is a generalized power transform commonly used in statistical modeling when a continuous response variable does not follow a normal distribution. It can, to some extent, reduce the correlation between unobservable error and the predictors. Its defining feature is a single parameter, estimated from the data itself, which determines the exact form of the transform. Box-Cox can noticeably improve the normality, symmetry, and homoscedasticity of data, and works well on many real datasets.
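
For reference, the one-parameter Box-Cox transform (defined for positive y, which is why the code below shifts the data by +1) is

$$
y^{(\lambda)}=
\begin{cases}
\dfrac{y^{\lambda}-1}{\lambda}, & \lambda\neq 0 \\
\ln y, & \lambda=0
\end{cases}
$$

scipy.stats.boxcox estimates λ by maximum likelihood when no lmbda argument is passed, returning the transformed data together with the fitted λ.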

cols_transform=data_all.columns[0:-2]   # every column except target and oringin
for col in cols_transform:
    # transform each feature column; the +1 shift keeps values positive, as Box-Cox requires
    data_all.loc[:,col], _ = stats.boxcox(data_all.loc[:,col]+1)

Summary statistics of the transformed target, plotted with a normal fit and a normal probability (quantile) plot

print(data_all.target.describe())

plt.figure(figsize=(12,4))
plt.subplot(1,2,1)
sns.distplot(data_all.target.dropna() , fit=stats.norm);   # distplot is deprecated in recent seaborn releases
plt.subplot(1,2,2)
_=stats.probplot(data_all.target.dropna(), plot=plt)


Transform the target so it better matches a normal distribution, and plot the result (the code applies an exponential transform, 1.5^target)

sp = data_train.target
data_train['target1'] = np.power(1.5, sp)   # bracket assignment actually creates the column (attribute assignment only sets an attribute)
print(data_train.target1.describe())

plt.figure(figsize=(12,4))
plt.subplot(1,2,1)
sns.distplot(data_train.target1.dropna(),fit=stats.norm);
plt.subplot(1,2,2)
_=stats.probplot(data_train.target1.dropna(), plot=plt)


Prepare the training and test data

def get_training_data():
    # rows flagged as training data; split off a 30% validation set
    df_train = data_all[data_all["oringin"]=="train"]
    df_train["label"]=data_train.target1

    y = df_train.target
    X = df_train.drop(["oringin","target","label"],axis=1)
    X_train,X_valid,y_train,y_valid=train_test_split(X,y,test_size=0.3,random_state=100)
    return X_train,X_valid,y_train,y_valid

def get_test_data():
    df_test = data_all[data_all["oringin"]=="test"].reset_index(drop=True)
    return df_test.drop(["oringin","target"],axis=1)

Scoring functions

def rmse(y_true, y_pred):
    diff = y_pred - y_true
    sum_sq = sum(diff**2)
    n = len(y_pred)
    return np.sqrt(sum_sq/n)

def mse(y_true, y_pred):
    return mean_squared_error(y_true, y_pred)

# greater_is_better=False marks these as losses, so sklearn negates them internally
rmse_scorer = make_scorer(rmse, greater_is_better=False)
mse_scorer = make_scorer(mse, greater_is_better=False)
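
Note that with greater_is_better=False sklearn negates the score internally (it always maximizes), which is why the cross-validation code below takes abs() of the results. A small self-contained check on made-up data:

from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# toy data, only to illustrate the sign convention
X_demo, y_demo = make_regression(n_samples=100, n_features=5, noise=0.1, random_state=0)
scores = cross_val_score(Ridge(), X_demo, y_demo, scoring="neg_mean_squared_error", cv=5)
print(scores)              # all negative, because losses are negated
print(abs(scores).mean())  # positive mean MSE, matching the abs() used later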

Find outliers and plot them

def find_outliers(model, X, y, sigma=3):
    # predict y with the model, fitting it first if it has not been fitted yet
    try:
        y_pred = pd.Series(model.predict(X), index=y.index)
    except Exception:
        model.fit(X,y)
        y_pred = pd.Series(model.predict(X), index=y.index)
        
    resid = y - y_pred
    mean_resid = resid.mean()
    std_resid = resid.std()

    z = (resid - mean_resid)/std_resid    
    outliers = z[abs(z)>sigma].index
    
    print('R2=',model.score(X,y))
    print('rmse=',rmse(y, y_pred))
    print("mse=",mean_squared_error(y,y_pred))
    print('---------------------------------------')

    print('mean of residuals:',mean_resid)
    print('std of residuals:',std_resid)
    print('---------------------------------------')

    print(len(outliers),'outliers:')
    print(outliers.tolist())

    plt.figure(figsize=(15,5))
    ax_131 = plt.subplot(1,3,1)
    plt.plot(y,y_pred,'.')
    plt.plot(y.loc[outliers],y_pred.loc[outliers],'ro')
    plt.legend(['Accepted','Outlier'])
    plt.xlabel('y')
    plt.ylabel('y_pred');

    ax_132=plt.subplot(1,3,2)
    plt.plot(y,y-y_pred,'.')
    plt.plot(y.loc[outliers],y.loc[outliers]-y_pred.loc[outliers],'ro')
    plt.legend(['Accepted','Outlier'])
    plt.xlabel('y')
    plt.ylabel('y - y_pred');

    ax_133=plt.subplot(1,3,3)
    z.plot.hist(bins=50,ax=ax_133)
    z.loc[outliers].plot.hist(color='r',bins=50,ax=ax_133)
    plt.legend(['Accepted','Outlier'])
    plt.xlabel('z')
    
    plt.savefig('outliers.png')
    
    return outliers
from sklearn.linear_model import Ridge
X_train, X_valid,y_train,y_valid = get_training_data()
test=get_test_data()
outliers = find_outliers(Ridge(), X_train, y_train)

X_outliers=X_train.loc[outliers]
y_outliers=y_train.loc[outliers]
X_t=X_train.drop(outliers)
y_t=y_train.drop(outliers)


# train models on the data with outliers removed
def get_trainning_data_omitoutliers():
    y1=y_t.copy()
    X1=X_t.copy()
    return X1,y1
# train a model via grid search
def train_model(model, param_grid=[], X=[], y=[], 
                splits=5, repeats=5):

    if len(y)==0:
        X,y = get_trainning_data_omitoutliers()
    
    rkfold = RepeatedKFold(n_splits=splits, n_repeats=repeats)
    
    if len(param_grid)>0:
        gsearch = GridSearchCV(model, param_grid, cv=rkfold,
                               scoring="neg_mean_squared_error",
                               verbose=1, return_train_score=True)

        gsearch.fit(X,y)

        model = gsearch.best_estimator_        
        best_idx = gsearch.best_index_

        grid_results = pd.DataFrame(gsearch.cv_results_)       
        cv_mean = abs(grid_results.loc[best_idx,'mean_test_score'])
        cv_std = grid_results.loc[best_idx,'std_test_score']
 
    else:
        grid_results = []
        cv_results = cross_val_score(model, X, y, scoring="neg_mean_squared_error", cv=rkfold)
        cv_mean = abs(np.mean(cv_results))
        cv_std = np.std(cv_results)
    
    cv_score = pd.Series({'mean':cv_mean,'std':cv_std})

    y_pred = model.predict(X)
      
    print('----------------------')
    print(model)
    print('----------------------')
    print('score=',model.score(X,y))
    print('rmse=',rmse(y, y_pred))
    print('mse=',mse(y, y_pred))
    print('cross_val: mean=',cv_mean,', std=',cv_std)

    y_pred = pd.Series(y_pred,index=y.index)
    resid = y - y_pred
    mean_resid = resid.mean()
    std_resid = resid.std()
    z = (resid - mean_resid)/std_resid    
    n_outliers = sum(abs(z)>3)
    
    plt.figure(figsize=(15,5))
    ax_131 = plt.subplot(1,3,1)
    plt.plot(y,y_pred,'.')
    plt.xlabel('y')
    plt.ylabel('y_pred');
    plt.title('corr = {:.3f}'.format(np.corrcoef(y,y_pred)[0][1]))
    ax_132=plt.subplot(1,3,2)
    plt.plot(y,y-y_pred,'.')
    plt.xlabel('y')
    plt.ylabel('y - y_pred');
    plt.title('std resid = {:.3f}'.format(std_resid))
    
    ax_133=plt.subplot(1,3,3)
    z.plot.hist(bins=50,ax=ax_133)
    plt.xlabel('z')
    plt.title('{:.0f} samples with z>3'.format(n_outliers))

    return model, cv_score, grid_results
opt_models = dict()
score_models = pd.DataFrame(columns=['mean','std'])

splits=5
repeats=5

6.1 Single-Model Performance

6.1.1 Ridge Regression
model = 'Ridge'

opt_models[model] = Ridge()
alph_range = np.arange(0.25,6,0.25)
param_grid = {'alpha': alph_range}

opt_models[model],cv_score,grid_results = train_model(opt_models[model], param_grid=param_grid, 
                                              splits=splits, repeats=repeats)

cv_score.name = model
score_models = score_models.append(cv_score)   # note: DataFrame.append was removed in pandas 2.0; use pd.concat on newer versions

plt.figure()
plt.errorbar(alph_range, abs(grid_results['mean_test_score']),
             abs(grid_results['std_test_score'])/np.sqrt(splits*repeats))
plt.xlabel('alpha')
plt.ylabel('score')


6.1.2 Lasso Regression
model = 'Lasso'

opt_models[model] = Lasso()
alph_range = np.arange(1e-4,1e-3,4e-5)
param_grid = {'alpha': alph_range}

opt_models[model], cv_score, grid_results = train_model(opt_models[model], param_grid=param_grid, 
                                              splits=splits, repeats=repeats)

cv_score.name = model
score_models = score_models.append(cv_score)

plt.figure()
plt.errorbar(alph_range, abs(grid_results['mean_test_score']),abs(grid_results['std_test_score'])/np.sqrt(splits*repeats))
plt.xlabel('alpha')
plt.ylabel('score')


6.1.3 ElasticNet Regression
model ='ElasticNet'
opt_models[model] = ElasticNet()

param_grid = {'alpha': np.arange(1e-4,1e-3,1e-4),
              'l1_ratio': np.arange(0.1,1.0,0.1),
              'max_iter':[100000]}

opt_models[model], cv_score, grid_results = train_model(opt_models[model], param_grid=param_grid, 
                                              splits=splits, repeats=1)

cv_score.name = model
score_models = score_models.append(cv_score)


6.1.4 SVR (LinearSVR)
model='LinearSVR'
opt_models[model] = LinearSVR()

crange = np.arange(0.1,1.0,0.1)
param_grid = {'C':crange,
             'max_iter':[1000]}

opt_models[model], cv_score, grid_results = train_model(opt_models[model], param_grid=param_grid, 
                                              splits=splits, repeats=repeats)


cv_score.name = model
score_models = score_models.append(cv_score)

plt.figure()
plt.errorbar(crange, abs(grid_results['mean_test_score']),abs(grid_results['std_test_score'])/np.sqrt(splits*repeats))
plt.xlabel('C')
plt.ylabel('score')


6.1.5 K-Nearest Neighbors (KNN)
model = 'KNeighbors'
opt_models[model] = KNeighborsRegressor()

param_grid = {'n_neighbors':np.arange(3,11,1)}

opt_models[model], cv_score, grid_results = train_model(opt_models[model], param_grid=param_grid, 
                                              splits=splits, repeats=1)

cv_score.name = model
score_models = score_models.append(cv_score)

plt.figure()
plt.errorbar(np.arange(3,11,1), abs(grid_results['mean_test_score']),abs(grid_results['std_test_score'])/np.sqrt(splits*1))
plt.xlabel('n_neighbors')
plt.ylabel('score')


6.1.6 GBDT Model
model = 'GradientBoosting'
opt_models[model] = GradientBoostingRegressor()

param_grid = {'n_estimators':[150,250,350],
              'max_depth':[1,2,3],
              'min_samples_split':[5,6,7]}

opt_models[model], cv_score, grid_results = train_model(opt_models[model], param_grid=param_grid, 
                                              splits=splits, repeats=1)

cv_score.name = model
score_models = score_models.append(cv_score)


6.1.7 XGB Model
model = 'XGB'
opt_models[model] = XGBRegressor()

param_grid = {'n_estimators':[100,200,300,400,500],
              'max_depth':[1,2,3],
             }

opt_models[model], cv_score,grid_results = train_model(opt_models[model], param_grid=param_grid, 
                                              splits=splits, repeats=1)

cv_score.name = model
score_models = score_models.append(cv_score)


6.1.8 Random Forest Model
model = 'RandomForest'
opt_models[model] = RandomForestRegressor()

param_grid = {'n_estimators':[100,150,200],
              'max_features':[8,12,16,20,24],
              'min_samples_split':[2,4,6]}

opt_models[model], cv_score, grid_results = train_model(opt_models[model], param_grid=param_grid, 
                                              splits=5, repeats=1)

cv_score.name = model
score_models = score_models.append(cv_score)


6.2 Model Prediction: Multi-Model Bagging

Here the tuned single models are simply averaged (LinearSVR and KNeighbors are excluded below); averaging the predictions reduces variance relative to any individual model.

def model_predict(test_data,test_y=[],stack=False):
    # average the predictions of all tuned models except LinearSVR and KNeighbors
    i=0
    y_predict_total=np.zeros((test_data.shape[0],))
    for model in opt_models.keys():
        if model!="LinearSVR" and model!="KNeighbors":
            y_predict=opt_models[model].predict(test_data)
            y_predict_total+=y_predict
            i+=1
            # report per-model MSE only for models that contribute to the average
            if len(test_y)>0:
                print("{}_mse:".format(model),mean_squared_error(y_predict,test_y))
    y_predict_mean=np.round(y_predict_total/i,3)
    if len(test_y)>0:
        print("mean_mse:",mean_squared_error(y_predict_mean,test_y))
    else:
        y_predict_mean=pd.Series(y_predict_mean)
        return y_predict_mean
# Bagging prediction
model_predict(X_valid,y_valid)


6.3 Model Fusion: Stacking
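
Stacking trains a set of base learners, then uses their predictions as input features for a second-level (meta) learner. For orientation only (the custom implementation comes in 6.3.2), sklearn ships a built-in StackingRegressor that handles the out-of-fold bookkeeping; a minimal sketch, assuming sklearn >= 0.22 and toy data:

from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor, RandomForestRegressor
from sklearn.linear_model import Ridge, LinearRegression
from sklearn.model_selection import cross_val_score

X_demo, y_demo = make_regression(n_samples=200, n_features=10, noise=0.5, random_state=0)

# base learners produce out-of-fold predictions; final_estimator is trained on those predictions
stack = StackingRegressor(
    estimators=[("ridge", Ridge()),
                ("rf", RandomForestRegressor(n_estimators=50, random_state=0))],
    final_estimator=LinearRegression(),
    cv=5,
)
print(cross_val_score(stack, X_demo, y_demo, scoring="neg_mean_squared_error", cv=3).mean())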

6.3.1 A simple model-fusion example (soft voting with mlxtend)
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
import itertools
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

## install mlxtend first: pip install mlxtend
from mlxtend.classifier import EnsembleVoteClassifier
from mlxtend.data import iris_data
from mlxtend.plotting import plot_decision_regions
%matplotlib inline

clf1 = LogisticRegression(random_state=0)
clf2 = RandomForestClassifier(random_state=0)
clf3 = SVC(random_state=0, probability=True)
eclf = EnsembleVoteClassifier(clfs=[clf1, clf2, clf3], weights=[2, 1, 1], voting='soft')

X, y = iris_data()
X = X[:,[0, 2]]

gs = gridspec.GridSpec(2, 2)
fig = plt.figure(figsize=(10, 8))

for clf, lab, grd in zip([clf1, clf2, clf3, eclf],
                         ['Logistic Regression', 'Random Forest', 'RBF kernel SVM', 'Ensemble'],
                         itertools.product([0, 1], repeat=2)):
    clf.fit(X, y)
    ax = plt.subplot(gs[grd[0], grd[1]])
    fig = plot_decision_regions(X=X, y=y, clf=clf, legend=2)
    plt.title(lab)
plt.show()


6.3.2 Multi-Model Stacking on the Industrial Steam Data

The stacking_reg function below builds out-of-fold meta-features: for each of the K folds, the base learner is trained on the remaining K-1 folds and predicts the held-out fold (those predictions form the training meta-feature), while its predictions on the test set are averaged over the K folds.

from sklearn.model_selection import train_test_split
import pandas as pd
import numpy as np
from scipy import sparse
import xgboost
import lightgbm

from sklearn.ensemble import RandomForestRegressor,AdaBoostRegressor,GradientBoostingRegressor,ExtraTreesRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

def stacking_reg(clf,train_x,train_y,test_x,clf_name,kf,label_split=None):
    train=np.zeros((train_x.shape[0],1))    # out-of-fold predictions for the training set
    test=np.zeros((test_x.shape[0],1))      # fold-averaged predictions for the test set
    test_pre=np.empty((folds,test_x.shape[0],1))  # note: relies on the global `folds` defined further below
    cv_scores=[]
    for i,(train_index,test_index) in enumerate(kf.split(train_x,label_split)):       
        tr_x=train_x[train_index]
        tr_y=train_y[train_index]
        te_x=train_x[test_index]
        te_y = train_y[test_index]
        if clf_name in ["rf","ada","gb","et","lr","lsvc","knn"]:
            clf.fit(tr_x,tr_y)
            pre=clf.predict(te_x).reshape(-1,1)
            train[test_index]=pre
            test_pre[i,:]=clf.predict(test_x).reshape(-1,1)
            cv_scores.append(mean_squared_error(te_y, pre))
        elif clf_name in ["xgb"]:
            train_matrix = clf.DMatrix(tr_x, label=tr_y, missing=-1)
            test_matrix = clf.DMatrix(te_x, label=te_y, missing=-1)
            z = clf.DMatrix(test_x, missing=-1)   # prediction-only matrix; te_y has the wrong length to serve as its label
            params = {'booster': 'gbtree',
                      'eval_metric': 'rmse',
                      'gamma': 1,
                      'min_child_weight': 1.5,
                      'max_depth': 5,
                      'lambda': 10,
                      'subsample': 0.7,
                      'colsample_bytree': 0.7,
                      'colsample_bylevel': 0.7,
                      'eta': 0.03,
                      'tree_method': 'exact',
                      'seed': 2017,
                      'nthread': 12
                      }
            num_round = 1000   # adjust as needed
            early_stopping_rounds = 10   # adjust as needed
            watchlist = [(train_matrix, 'train'),
                         (test_matrix, 'eval')
                         ]
            if test_matrix:
                model = clf.train(params, train_matrix, num_boost_round=num_round,evals=watchlist,
                                  early_stopping_rounds=early_stopping_rounds
                                  )
                pre= model.predict(test_matrix,ntree_limit=model.best_ntree_limit).reshape(-1,1)
                train[test_index]=pre
                test_pre[i, :]= model.predict(z, ntree_limit=model.best_ntree_limit).reshape(-1,1)
                cv_scores.append(mean_squared_error(te_y, pre))

        elif clf_name in ["lgb"]:
            train_matrix = clf.Dataset(tr_x, label=tr_y)
            test_matrix = clf.Dataset(te_x, label=te_y)
            params = {
                      'boosting_type': 'gbdt',
                      'objective': 'regression_l2',
                      'metric': 'mse',
                      'min_child_weight': 1.5,
                      'num_leaves': 2**5,
                      'lambda_l2': 10,
                      'subsample': 0.7,
                      'colsample_bytree': 0.7,
                      'colsample_bylevel': 0.7,
                      'learning_rate': 0.03,
                      'tree_method': 'exact',
                      'seed': 2017,
                      'nthread': 12,
                      'silent': True,
                      }   # note: LightGBM ignores tree_method/colsample_bylevel/silent (hence the warnings in the log below)
            num_round = 1000   # adjust as needed
            early_stopping_rounds = 10   # adjust as needed
            if test_matrix:
                model = clf.train(params, train_matrix,num_round,valid_sets=test_matrix,
                                  early_stopping_rounds=early_stopping_rounds
                                  )
                pre= model.predict(te_x,num_iteration=model.best_iteration).reshape(-1,1)
                train[test_index]=pre
                test_pre[i, :]= model.predict(test_x, num_iteration=model.best_iteration).reshape(-1,1)
                cv_scores.append(mean_squared_error(te_y, pre))
        else:
            raise IOError("Please add new clf.")
        print("%s now score is:"%clf_name,cv_scores)
    test[:]=test_pre.mean(axis=0)
    print("%s_score_list:"%clf_name,cv_scores)
    print("%s_score_mean:"%clf_name,np.mean(cv_scores))
    return train.reshape(-1,1),test.reshape(-1,1)

Stacking base learners

def rf_reg(x_train, y_train, x_valid, kf, label_split=None):
    randomforest = RandomForestRegressor(n_estimators=100, max_depth=20, n_jobs=-1, random_state=2017, max_features="auto",verbose=1)
    rf_train, rf_test = stacking_reg(randomforest, x_train, y_train, x_valid, "rf", kf, label_split=label_split)
    return rf_train, rf_test,"rf_reg"

def ada_reg(x_train, y_train, x_valid, kf, label_split=None):
    adaboost = AdaBoostRegressor(n_estimators=30, random_state=2017, learning_rate=0.01)
    ada_train, ada_test = stacking_reg(adaboost, x_train, y_train, x_valid, "ada", kf, label_split=label_split)
    return ada_train, ada_test,"ada_reg"

def gb_reg(x_train, y_train, x_valid, kf, label_split=None):
    gbdt = GradientBoostingRegressor(learning_rate=0.04, n_estimators=100, subsample=0.8, random_state=2017,max_depth=5,verbose=1)
    gbdt_train, gbdt_test = stacking_reg(gbdt, x_train, y_train, x_valid, "gb", kf, label_split=label_split)
    return gbdt_train, gbdt_test,"gb_reg"

def et_reg(x_train, y_train, x_valid, kf, label_split=None):
    extratree = ExtraTreesRegressor(n_estimators=100, max_depth=35, max_features="auto", n_jobs=-1, random_state=2017,verbose=1)
    et_train, et_test = stacking_reg(extratree, x_train, y_train, x_valid, "et", kf, label_split=label_split)
    return et_train, et_test,"et_reg"

def lr_reg(x_train, y_train, x_valid, kf, label_split=None):
    lr_reg=LinearRegression(n_jobs=-1)
    lr_train, lr_test = stacking_reg(lr_reg, x_train, y_train, x_valid, "lr", kf, label_split=label_split)
    return lr_train, lr_test, "lr_reg"

def xgb_reg(x_train, y_train, x_valid, kf, label_split=None):
    xgb_train, xgb_test = stacking_reg(xgboost, x_train, y_train, x_valid, "xgb", kf, label_split=label_split)
    return xgb_train, xgb_test,"xgb_reg"

def lgb_reg(x_train, y_train, x_valid, kf, label_split=None):
    lgb_train, lgb_test = stacking_reg(lightgbm, x_train, y_train, x_valid, "lgb", kf, label_split=label_split)
    return lgb_train, lgb_test,"lgb_reg"

Stacking prediction

def stacking_pred(x_train, y_train, x_valid, kf, clf_list, label_split=None, clf_fin="lgb", if_concat_origin=True):
    # run every base learner and collect its out-of-fold train column and fold-averaged test column
    # (the lists are initialized once, before the loop, so every learner's column is kept)
    column_list = []
    train_data_list = []
    test_data_list = []
    for clf in clf_list:
        train_data, test_data, clf_name = clf(x_train, y_train, x_valid, kf, label_split=label_split)
        train_data_list.append(train_data)
        test_data_list.append(test_data)
        column_list.append("clf_%s" % (clf_name))
    train = np.concatenate(train_data_list, axis=1)
    test = np.concatenate(test_data_list, axis=1)
    
    if if_concat_origin:
        train = np.concatenate([x_train, train], axis=1)
        test = np.concatenate([x_valid, test], axis=1)
    print(x_train.shape)
    print(train.shape)
    print(clf_name)
    print(clf_name in ["lgb"])
    if clf_fin in ["rf","ada","gb","et","lr","lsvc","knn"]:
        if clf_fin in ["rf"]:
            clf = RandomForestRegressor(n_estimators=100, max_depth=20, n_jobs=-1, random_state=2017, max_features="auto",verbose=1)
        elif clf_fin in ["ada"]:
            clf = AdaBoostRegressor(n_estimators=30, random_state=2017, learning_rate=0.01)
        elif clf_fin in ["gb"]:
            clf = GradientBoostingRegressor(learning_rate=0.04, n_estimators=100, subsample=0.8, random_state=2017,max_depth=5,verbose=1)
        elif clf_fin in ["et"]:
            clf = ExtraTreesRegressor(n_estimators=100, max_depth=35, max_features="auto", n_jobs=-1, random_state=2017,verbose=1)
        elif clf_fin in ["lr"]:
            clf = LinearRegression(n_jobs=-1)
        clf.fit(train, y_train)
        pre = clf.predict(test).reshape(-1,1)
        return pre
    elif clf_fin in ["xgb"]:
        clf = xgboost
        train_matrix = clf.DMatrix(train, label=y_train, missing=-1)
        test_matrix = clf.DMatrix(train, label=y_train, missing=-1)   # note: the eval set here is the training data itself
        params = {'booster': 'gbtree',
                  'eval_metric': 'rmse',
                  'gamma': 1,
                  'min_child_weight': 1.5,
                  'max_depth': 5,
                  'lambda': 10,
                  'subsample': 0.7,
                  'colsample_bytree': 0.7,
                  'colsample_bylevel': 0.7,
                  'eta': 0.03,
                  'tree_method': 'exact',
                  'seed': 2017,
                  'nthread': 12
                  }
        num_round = 1000
        early_stopping_rounds = 10
        watchlist = [(train_matrix, 'train'),
                     (test_matrix, 'eval')
                     ]
        model = clf.train(params, train_matrix, num_boost_round=num_round,evals=watchlist,
                          early_stopping_rounds=early_stopping_rounds
                          )
        pre = model.predict(clf.DMatrix(test),ntree_limit=model.best_ntree_limit).reshape(-1,1)   # predict requires a DMatrix
        return pre
    elif clf_fin in ["lgb"]:
        print(clf_name)
        clf = lightgbm
        train_matrix = clf.Dataset(train, label=y_train)
        test_matrix = clf.Dataset(train, label=y_train)
        params = {
                  'boosting_type': 'gbdt',
                  'objective': 'regression_l2',
                  'metric': 'mse',
                  'min_child_weight': 1.5,
                  'num_leaves': 2**5,
                  'lambda_l2': 10,
                  'subsample': 0.7,
                  'colsample_bytree': 0.7,
                  'colsample_bylevel': 0.7,
                  'learning_rate': 0.03,
                  'tree_method': 'exact',
                  'seed': 2017,
                  'nthread': 12,
                  'silent': True,
                  }
        num_round = 1000
        early_stopping_rounds = 10
        model = clf.train(params, train_matrix,num_round,valid_sets=test_matrix,
                          early_stopping_rounds=early_stopping_rounds
                          )
        print('pred')
        pre = model.predict(test,num_iteration=model.best_iteration).reshape(-1,1)
        print(pre)
        return pre
with open("./zhengqi_train.txt")  as fr:
    data_train=pd.read_table(fr,sep="\t")
with open("./zhengqi_test.txt") as fr_test:
    data_test=pd.read_table(fr_test,sep="\t")
### K-fold cross-validation
from sklearn.model_selection import StratifiedKFold, KFold

folds = 5
seed = 1   # defined for reference but unused below; the log was produced with random_state=0
kf = KFold(n_splits=folds, shuffle=True, random_state=0)
### training and test data
x_train = data_train[data_test.columns].values
x_valid = data_test[data_test.columns].values
y_train = data_train['target'].values
### fuse predictions using lr_reg and lgb_reg
clf_list = [lr_reg, lgb_reg]
# clf_list = [lr_reg, rf_reg]   # alternative combination; the log below corresponds to lr_reg + lgb_reg

# note: this kind of stacking overfits easily
pred = stacking_pred(x_train, y_train, x_valid, kf, clf_list, label_split=None, clf_fin="lgb", if_concat_origin=True)
print(pred)

Output:

lr now score is: [0.11573216950871248]
lr now score is: [0.11573216950871248, 0.09417486426618929]
lr now score is: [0.11573216950871248, 0.09417486426618929, 0.10805046561851059]
lr now score is: [0.11573216950871248, 0.09417486426618929, 0.10805046561851059, 0.12420887065601556]
lr now score is: [0.11573216950871248, 0.09417486426618929, 0.10805046561851059, 0.12420887065601556, 0.11940113841914012]
lr_score_list: [0.11573216950871248, 0.09417486426618929, 0.10805046561851059, 0.12420887065601556, 0.11940113841914012]
lr_score_mean: 0.1123135016937136
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000775 seconds.
You can set `force_col_wise=true` to remove the overhead.
[LightGBM] [Info] Total Bins 8846
[LightGBM] [Info] Number of data points in the train set: 2310, number of used features: 38
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Info] Start training from score 0.126200
[1]	valid_0's l2: 0.992087
Training until validation scores don't improve for 10 rounds
[2]	valid_0's l2: 0.94735
[3]	valid_0's l2: 0.905091
[4]	valid_0's l2: 0.865237
[5]	valid_0's l2: 0.828478
[6]	valid_0's l2: 0.793015
[7]	valid_0's l2: 0.759026
[8]	valid_0's l2: 0.728042
[9]	valid_0's l2: 0.697924
[10]	valid_0's l2: 0.670335
[11]	valid_0's l2: 0.64293
[12]	valid_0's l2: 0.618124
[13]	valid_0's l2: 0.593382
[14]	valid_0's l2: 0.570581
[15]	valid_0's l2: 0.549105
[16]	valid_0's l2: 0.528357
[17]	valid_0's l2: 0.509096
[18]	valid_0's l2: 0.490868
[19]	valid_0's l2: 0.473523
[20]	valid_0's l2: 0.456674
[21]	valid_0's l2: 0.440983
[22]	valid_0's l2: 0.425928
[23]	valid_0's l2: 0.411831
[24]	valid_0's l2: 0.397973
[25]	valid_0's l2: 0.384988
[26]	valid_0's l2: 0.373082
[27]	valid_0's l2: 0.361415
[28]	valid_0's l2: 0.350399
[29]	valid_0's l2: 0.34
[30]	valid_0's l2: 0.330165
[31]	valid_0's l2: 0.320483
[32]	valid_0's l2: 0.311581
[33]	valid_0's l2: 0.303078
[34]	valid_0's l2: 0.294905
[35]	valid_0's l2: 0.287238
[36]	valid_0's l2: 0.280027
[37]	valid_0's l2: 0.273352
[38]	valid_0's l2: 0.266937
[39]	valid_0's l2: 0.260499
[40]	valid_0's l2: 0.254449
[41]	valid_0's l2: 0.248739
[42]	valid_0's l2: 0.243374
[43]	valid_0's l2: 0.238653
[44]	valid_0's l2: 0.233905
[45]	valid_0's l2: 0.229188
[46]	valid_0's l2: 0.224701
[47]	valid_0's l2: 0.220568
[48]	valid_0's l2: 0.216666
[49]	valid_0's l2: 0.212891
[50]	valid_0's l2: 0.209022
[51]	valid_0's l2: 0.205757
[52]	valid_0's l2: 0.202405
[53]	valid_0's l2: 0.199308
[54]	valid_0's l2: 0.196345
[55]	valid_0's l2: 0.193315
[56]	valid_0's l2: 0.190598
[57]	valid_0's l2: 0.187959
[58]	valid_0's l2: 0.18533
[59]	valid_0's l2: 0.182929
[60]	valid_0's l2: 0.18065
[61]	valid_0's l2: 0.17836
[62]	valid_0's l2: 0.176264
[63]	valid_0's l2: 0.174016
[64]	valid_0's l2: 0.171923
[65]	valid_0's l2: 0.170216
[66]	valid_0's l2: 0.168335
[67]	valid_0's l2: 0.166779
[68]	valid_0's l2: 0.165304
[69]	valid_0's l2: 0.163656
[70]	valid_0's l2: 0.162083
[71]	valid_0's l2: 0.160492
[72]	valid_0's l2: 0.15903
[73]	valid_0's l2: 0.157814
[74]	valid_0's l2: 0.156352
[75]	valid_0's l2: 0.155014
[76]	valid_0's l2: 0.153985
[77]	valid_0's l2: 0.152924
[78]	valid_0's l2: 0.151821
[79]	valid_0's l2: 0.150688
[80]	valid_0's l2: 0.149531
[81]	valid_0's l2: 0.148456
[82]	valid_0's l2: 0.147694
[83]	valid_0's l2: 0.146765
[84]	valid_0's l2: 0.146123
[85]	valid_0's l2: 0.145419
[86]	valid_0's l2: 0.144695
[87]	valid_0's l2: 0.143943
[88]	valid_0's l2: 0.143171
[89]	valid_0's l2: 0.142534
[90]	valid_0's l2: 0.141877
[91]	valid_0's l2: 0.141059
[92]	valid_0's l2: 0.140551
[93]	valid_0's l2: 0.140009
[94]	valid_0's l2: 0.139279
[95]	valid_0's l2: 0.138569
[96]	valid_0's l2: 0.137897
[97]	valid_0's l2: 0.137388
[98]	valid_0's l2: 0.13691
[99]	valid_0's l2: 0.136396
[100]	valid_0's l2: 0.135965
[101]	valid_0's l2: 0.135373
[102]	valid_0's l2: 0.134943
[103]	valid_0's l2: 0.134332
[104]	valid_0's l2: 0.13381
[105]	valid_0's l2: 0.133447
[106]	valid_0's l2: 0.133132
[107]	valid_0's l2: 0.132678
[108]	valid_0's l2: 0.132488
[109]	valid_0's l2: 0.132117
[110]	valid_0's l2: 0.131765
[111]	valid_0's l2: 0.131372
[112]	valid_0's l2: 0.131325
[113]	valid_0's l2: 0.130853
[114]	valid_0's l2: 0.13045
[115]	valid_0's l2: 0.130218
[116]	valid_0's l2: 0.13
[117]	valid_0's l2: 0.129733
[118]	valid_0's l2: 0.129497
[119]	valid_0's l2: 0.129287
[120]	valid_0's l2: 0.128982
[121]	valid_0's l2: 0.128911
[122]	valid_0's l2: 0.128714
[123]	valid_0's l2: 0.128563
[124]	valid_0's l2: 0.128345
[125]	valid_0's l2: 0.12804
[126]	valid_0's l2: 0.127975
[127]	valid_0's l2: 0.127807
[128]	valid_0's l2: 0.127702
[129]	valid_0's l2: 0.127429
[130]	valid_0's l2: 0.127234
[131]	valid_0's l2: 0.127074
[132]	valid_0's l2: 0.127011
[133]	valid_0's l2: 0.12685
[134]	valid_0's l2: 0.12671
[135]	valid_0's l2: 0.126406
[136]	valid_0's l2: 0.126114
[137]	valid_0's l2: 0.125927
[138]	valid_0's l2: 0.125792
[139]	valid_0's l2: 0.125551
[140]	valid_0's l2: 0.125378
[141]	valid_0's l2: 0.125224
[142]	valid_0's l2: 0.125075
[143]	valid_0's l2: 0.124855
[144]	valid_0's l2: 0.124729
[145]	valid_0's l2: 0.124603
[146]	valid_0's l2: 0.124488
[147]	valid_0's l2: 0.124202
[148]	valid_0's l2: 0.123975
[149]	valid_0's l2: 0.123834
[150]	valid_0's l2: 0.123747
[151]	valid_0's l2: 0.123547
[152]	valid_0's l2: 0.123488
[153]	valid_0's l2: 0.123464
[154]	valid_0's l2: 0.123332
[155]	valid_0's l2: 0.123226
[156]	valid_0's l2: 0.123147
[157]	valid_0's l2: 0.123009
[158]	valid_0's l2: 0.122874
[159]	valid_0's l2: 0.122711
[160]	valid_0's l2: 0.122515
[161]	valid_0's l2: 0.12241
[162]	valid_0's l2: 0.122304
[163]	valid_0's l2: 0.122279
[164]	valid_0's l2: 0.122161
[165]	valid_0's l2: 0.122075
[166]	valid_0's l2: 0.121921
[167]	valid_0's l2: 0.121898
[168]	valid_0's l2: 0.121687
[169]	valid_0's l2: 0.121627
[170]	valid_0's l2: 0.121632
[171]	valid_0's l2: 0.121461
[172]	valid_0's l2: 0.12135
[173]	valid_0's l2: 0.121271
[174]	valid_0's l2: 0.1211
[175]	valid_0's l2: 0.121032
[176]	valid_0's l2: 0.12105
[177]	valid_0's l2: 0.120974
[178]	valid_0's l2: 0.120873
[179]	valid_0's l2: 0.120812
[180]	valid_0's l2: 0.120656
[181]	valid_0's l2: 0.120546
[182]	valid_0's l2: 0.120499
[183]	valid_0's l2: 0.12045
[184]	valid_0's l2: 0.120431
[185]	valid_0's l2: 0.120385
[186]	valid_0's l2: 0.120317
[187]	valid_0's l2: 0.120107
[188]	valid_0's l2: 0.120094
[189]	valid_0's l2: 0.120012
[190]	valid_0's l2: 0.119968
[191]	valid_0's l2: 0.119858
[192]	valid_0's l2: 0.119831
[193]	valid_0's l2: 0.119706
[194]	valid_0's l2: 0.119654
[195]	valid_0's l2: 0.119578
[196]	valid_0's l2: 0.119593
[197]	valid_0's l2: 0.119558
[198]	valid_0's l2: 0.119572
[199]	valid_0's l2: 0.11962
[200]	valid_0's l2: 0.119574
[201]	valid_0's l2: 0.119535
[202]	valid_0's l2: 0.119481
[203]	valid_0's l2: 0.1194
[204]	valid_0's l2: 0.119352
[205]	valid_0's l2: 0.119355
[206]	valid_0's l2: 0.119352
[207]	valid_0's l2: 0.119336
[208]	valid_0's l2: 0.119256
[209]	valid_0's l2: 0.119248
[210]	valid_0's l2: 0.1193
[211]	valid_0's l2: 0.119222
[212]	valid_0's l2: 0.1191
[213]	valid_0's l2: 0.119105
[214]	valid_0's l2: 0.119048
[215]	valid_0's l2: 0.119149
[216]	valid_0's l2: 0.119107
[217]	valid_0's l2: 0.119024
[218]	valid_0's l2: 0.118886
[219]	valid_0's l2: 0.118847
[220]	valid_0's l2: 0.118799
[221]	valid_0's l2: 0.118715
[222]	valid_0's l2: 0.11867
[223]	valid_0's l2: 0.118671
[224]	valid_0's l2: 0.118667
[225]	valid_0's l2: 0.118674
[226]	valid_0's l2: 0.118661
[227]	valid_0's l2: 0.118636
[228]	valid_0's l2: 0.118587
[229]	valid_0's l2: 0.118612
[230]	valid_0's l2: 0.118581
[231]	valid_0's l2: 0.118531
[232]	valid_0's l2: 0.118462
[233]	valid_0's l2: 0.118486
[234]	valid_0's l2: 0.118461
[235]	valid_0's l2: 0.11846
[236]	valid_0's l2: 0.118457
[237]	valid_0's l2: 0.118307
[238]	valid_0's l2: 0.118244
[239]	valid_0's l2: 0.118185
[240]	valid_0's l2: 0.11818
[241]	valid_0's l2: 0.118242
[242]	valid_0's l2: 0.118193
[243]	valid_0's l2: 0.118126
[244]	valid_0's l2: 0.118134
[245]	valid_0's l2: 0.118132
[246]	valid_0's l2: 0.11809
[247]	valid_0's l2: 0.118078
[248]	valid_0's l2: 0.118082
[249]	valid_0's l2: 0.118026
[250]	valid_0's l2: 0.117904
[251]	valid_0's l2: 0.117845
[252]	valid_0's l2: 0.11778
[253]	valid_0's l2: 0.117714
[254]	valid_0's l2: 0.117665
[255]	valid_0's l2: 0.117614
[256]	valid_0's l2: 0.117606
[257]	valid_0's l2: 0.117558
[258]	valid_0's l2: 0.117562
[259]	valid_0's l2: 0.117578
[260]	valid_0's l2: 0.117497
[261]	valid_0's l2: 0.117504
[262]	valid_0's l2: 0.117394
[263]	valid_0's l2: 0.117426
[264]	valid_0's l2: 0.117393
[265]	valid_0's l2: 0.117334
[266]	valid_0's l2: 0.117273
[267]	valid_0's l2: 0.117258
[268]	valid_0's l2: 0.117163
[269]	valid_0's l2: 0.117064
[270]	valid_0's l2: 0.117054
[271]	valid_0's l2: 0.116993
[272]	valid_0's l2: 0.116947
[273]	valid_0's l2: 0.116938
[274]	valid_0's l2: 0.11687
[275]	valid_0's l2: 0.116836
[276]	valid_0's l2: 0.116819
[277]	valid_0's l2: 0.116712
[278]	valid_0's l2: 0.116708
[279]	valid_0's l2: 0.116678
[280]	valid_0's l2: 0.116601
[281]	valid_0's l2: 0.116624
[282]	valid_0's l2: 0.116609
[283]	valid_0's l2: 0.116566
[284]	valid_0's l2: 0.116513
[285]	valid_0's l2: 0.116429
[286]	valid_0's l2: 0.116397
[287]	valid_0's l2: 0.116341
[288]	valid_0's l2: 0.116352
[289]	valid_0's l2: 0.116273
[290]	valid_0's l2: 0.116209
[291]	valid_0's l2: 0.116211
[292]	valid_0's l2: 0.116152
[293]	valid_0's l2: 0.116054
[294]	valid_0's l2: 0.116108
[295]	valid_0's l2: 0.116138
[296]	valid_0's l2: 0.116053
[297]	valid_0's l2: 0.115981
[298]	valid_0's l2: 0.115985
[299]	valid_0's l2: 0.115993
[300]	valid_0's l2: 0.116019
[301]	valid_0's l2: 0.115995
[302]	valid_0's l2: 0.115975
[303]	valid_0's l2: 0.116
[304]	valid_0's l2: 0.116
[305]	valid_0's l2: 0.116021
[306]	valid_0's l2: 0.115995
[307]	valid_0's l2: 0.11593
[308]	valid_0's l2: 0.116007
[309]	valid_0's l2: 0.115919
[310]	valid_0's l2: 0.115891
[311]	valid_0's l2: 0.115829
[312]	valid_0's l2: 0.115794
[313]	valid_0's l2: 0.115731
[314]	valid_0's l2: 0.115761
[315]	valid_0's l2: 0.115739
[316]	valid_0's l2: 0.115764
[317]	valid_0's l2: 0.11573
[318]	valid_0's l2: 0.115768
[319]	valid_0's l2: 0.115734
[320]	valid_0's l2: 0.115697
[321]	valid_0's l2: 0.115695
[322]	valid_0's l2: 0.115712
[323]	valid_0's l2: 0.115718
[324]	valid_0's l2: 0.115734
[325]	valid_0's l2: 0.115727
[326]	valid_0's l2: 0.115686
[327]	valid_0's l2: 0.115648
[328]	valid_0's l2: 0.115636
[329]	valid_0's l2: 0.115625
[330]	valid_0's l2: 0.115597
[331]	valid_0's l2: 0.115626
[332]	valid_0's l2: 0.115595
[333]	valid_0's l2: 0.115582
[334]	valid_0's l2: 0.115557
[335]	valid_0's l2: 0.115527
[336]	valid_0's l2: 0.115515
[337]	valid_0's l2: 0.115534
[338]	valid_0's l2: 0.115483
[339]	valid_0's l2: 0.115444
[340]	valid_0's l2: 0.115384
[341]	valid_0's l2: 0.115406
[342]	valid_0's l2: 0.115391
[343]	valid_0's l2: 0.115322
[344]	valid_0's l2: 0.115266
[345]	valid_0's l2: 0.115204
[346]	valid_0's l2: 0.115179
[347]	valid_0's l2: 0.11522
[348]	valid_0's l2: 0.115214
[349]	valid_0's l2: 0.115203
[350]	valid_0's l2: 0.115172
[351]	valid_0's l2: 0.115147
[352]	valid_0's l2: 0.115143
[353]	valid_0's l2: 0.115097
[354]	valid_0's l2: 0.115099
[355]	valid_0's l2: 0.115052
[356]	valid_0's l2: 0.115009
[357]	valid_0's l2: 0.114997
[358]	valid_0's l2: 0.114963
[359]	valid_0's l2: 0.114959
[360]	valid_0's l2: 0.114912
[361]	valid_0's l2: 0.11486
[362]	valid_0's l2: 0.114881
[363]	valid_0's l2: 0.11483
[364]	valid_0's l2: 0.114854
[365]	valid_0's l2: 0.114857
[366]	valid_0's l2: 0.114823
[367]	valid_0's l2: 0.114828
[368]	valid_0's l2: 0.114765
[369]	valid_0's l2: 0.114746
[370]	valid_0's l2: 0.114722
[371]	valid_0's l2: 0.114708
[372]	valid_0's l2: 0.114678
[373]	valid_0's l2: 0.114686
[374]	valid_0's l2: 0.114669
[375]	valid_0's l2: 0.114655
[376]	valid_0's l2: 0.114645
[377]	valid_0's l2: 0.11467
[378]	valid_0's l2: 0.114674
[379]	valid_0's l2: 0.114707
[380]	valid_0's l2: 0.114689
[381]	valid_0's l2: 0.114679
[382]	valid_0's l2: 0.11467
[383]	valid_0's l2: 0.11469
[384]	valid_0's l2: 0.114667
[385]	valid_0's l2: 0.114656
[386]	valid_0's l2: 0.114657
Early stopping, best iteration is:
[376]	valid_0's l2: 0.114645
lgb now score is: [0.11464525315121991]
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000858 seconds.
You can set `force_col_wise=true` to remove the overhead.
[LightGBM] [Info] Total Bins 8833
[LightGBM] [Info] Number of data points in the train set: 2310, number of used features: 38
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Info] Start training from score 0.111094
[1]	valid_0's l2: 0.86091
Training until validation scores don't improve for 10 rounds
[2]	valid_0's l2: 0.822041
[3]	valid_0's l2: 0.784715
[4]	valid_0's l2: 0.749897
[5]	valid_0's l2: 0.71688
[6]	valid_0's l2: 0.686156
[7]	valid_0's l2: 0.656374
[8]	valid_0's l2: 0.628383
[9]	valid_0's l2: 0.601579
[10]	valid_0's l2: 0.576335
[11]	valid_0's l2: 0.552179
[12]	valid_0's l2: 0.529819
[13]	valid_0's l2: 0.508092
[14]	valid_0's l2: 0.488383
[15]	valid_0's l2: 0.46922
[16]	valid_0's l2: 0.45117
[17]	valid_0's l2: 0.434806
[18]	valid_0's l2: 0.419967
[19]	valid_0's l2: 0.404153
[20]	valid_0's l2: 0.389472
[21]	valid_0's l2: 0.375817
[22]	valid_0's l2: 0.362587
[23]	valid_0's l2: 0.349989
[24]	valid_0's l2: 0.337657
[25]	valid_0's l2: 0.326825
[26]	valid_0's l2: 0.316732
[27]	valid_0's l2: 0.306624
[28]	valid_0's l2: 0.296987
[29]	valid_0's l2: 0.287803
[30]	valid_0's l2: 0.279086
[31]	valid_0's l2: 0.27072
[32]	valid_0's l2: 0.262841
[33]	valid_0's l2: 0.255607
[34]	valid_0's l2: 0.24856
[35]	valid_0's l2: 0.24187
[36]	valid_0's l2: 0.23572
[37]	valid_0's l2: 0.229697
[38]	valid_0's l2: 0.223881
[39]	valid_0's l2: 0.218407
[40]	valid_0's l2: 0.213644
[41]	valid_0's l2: 0.208768
[42]	valid_0's l2: 0.204136
[43]	valid_0's l2: 0.199997
[44]	valid_0's l2: 0.196003
[45]	valid_0's l2: 0.192227
[46]	valid_0's l2: 0.18864
[47]	valid_0's l2: 0.185191
[48]	valid_0's l2: 0.181781
[49]	valid_0's l2: 0.178617
[50]	valid_0's l2: 0.17564
[51]	valid_0's l2: 0.17263
[52]	valid_0's l2: 0.170023
[53]	valid_0's l2: 0.167523
[54]	valid_0's l2: 0.16511
[55]	valid_0's l2: 0.162756
[56]	valid_0's l2: 0.160283
[57]	valid_0's l2: 0.158385
[58]	valid_0's l2: 0.156343
[59]	valid_0's l2: 0.154427
[60]	valid_0's l2: 0.152609
[61]	valid_0's l2: 0.150822
[62]	valid_0's l2: 0.148894
[63]	valid_0's l2: 0.147392
[64]	valid_0's l2: 0.145769
[65]	valid_0's l2: 0.14409
[66]	valid_0's l2: 0.142545
[67]	valid_0's l2: 0.141187
[68]	valid_0's l2: 0.139822
[69]	valid_0's l2: 0.138396
[70]	valid_0's l2: 0.137236
[71]	valid_0's l2: 0.136107
[72]	valid_0's l2: 0.135055
[73]	valid_0's l2: 0.133874
[74]	valid_0's l2: 0.132889
[75]	valid_0's l2: 0.131947
[76]	valid_0's l2: 0.131061
[77]	valid_0's l2: 0.130208
[78]	valid_0's l2: 0.12939
[79]	valid_0's l2: 0.128622
[80]	valid_0's l2: 0.127771
[81]	valid_0's l2: 0.126905
[82]	valid_0's l2: 0.126137
[83]	valid_0's l2: 0.12548
[84]	valid_0's l2: 0.124831
[85]	valid_0's l2: 0.12432
[86]	valid_0's l2: 0.123806
[87]	valid_0's l2: 0.12327
[88]	valid_0's l2: 0.122608
[89]	valid_0's l2: 0.12219
[90]	valid_0's l2: 0.12163
[91]	valid_0's l2: 0.121188
[92]	valid_0's l2: 0.120653
[93]	valid_0's l2: 0.120216
[94]	valid_0's l2: 0.119754
[95]	valid_0's l2: 0.119218
[96]	valid_0's l2: 0.118903
[97]	valid_0's l2: 0.118442
[98]	valid_0's l2: 0.118042
[99]	valid_0's l2: 0.117763
[100]	valid_0's l2: 0.117492
[101]	valid_0's l2: 0.117117
[102]	valid_0's l2: 0.116843
[103]	valid_0's l2: 0.116697
[104]	valid_0's l2: 0.116437
[105]	valid_0's l2: 0.11609
[106]	valid_0's l2: 0.115905
[107]	valid_0's l2: 0.115726
[108]	valid_0's l2: 0.115542
[109]	valid_0's l2: 0.115341
[110]	valid_0's l2: 0.115002
[111]	valid_0's l2: 0.11472
[112]	valid_0's l2: 0.114469
[113]	valid_0's l2: 0.114079
[114]	valid_0's l2: 0.113868
[115]	valid_0's l2: 0.113736
[116]	valid_0's l2: 0.113459
[117]	valid_0's l2: 0.113129
[118]	valid_0's l2: 0.112894
[119]	valid_0's l2: 0.112745
[120]	valid_0's l2: 0.112488
[121]	valid_0's l2: 0.112348
[122]	valid_0's l2: 0.112149
[123]	valid_0's l2: 0.11209
[124]	valid_0's l2: 0.112012
[125]	valid_0's l2: 0.111745
[126]	valid_0's l2: 0.111642
[127]	valid_0's l2: 0.111644
[128]	valid_0's l2: 0.111437
[129]	valid_0's l2: 0.111257
[130]	valid_0's l2: 0.111145
[131]	valid_0's l2: 0.110934
[132]	valid_0's l2: 0.110702
[133]	valid_0's l2: 0.110584
[134]	valid_0's l2: 0.11045
[135]	valid_0's l2: 0.110237
[136]	valid_0's l2: 0.110031
[137]	valid_0's l2: 0.109968
[138]	valid_0's l2: 0.109877
[139]	valid_0's l2: 0.109718
[140]	valid_0's l2: 0.109607
[141]	valid_0's l2: 0.109424
[142]	valid_0's l2: 0.109276
[143]	valid_0's l2: 0.109145
[144]	valid_0's l2: 0.10913
[145]	valid_0's l2: 0.108893
[146]	valid_0's l2: 0.108787
[147]	valid_0's l2: 0.10868
[148]	valid_0's l2: 0.108603
[149]	valid_0's l2: 0.108512
[150]	valid_0's l2: 0.108397
[151]	valid_0's l2: 0.108256
[152]	valid_0's l2: 0.108152
[153]	valid_0's l2: 0.108033
[154]	valid_0's l2: 0.10799
[155]	valid_0's l2: 0.107907
[156]	valid_0's l2: 0.107812
[157]	valid_0's l2: 0.107647
[158]	valid_0's l2: 0.107671
[159]	valid_0's l2: 0.107591
[160]	valid_0's l2: 0.107406
[161]	valid_0's l2: 0.107304
[162]	valid_0's l2: 0.107164
[163]	valid_0's l2: 0.107023
[164]	valid_0's l2: 0.106946
[165]	valid_0's l2: 0.106876
[166]	valid_0's l2: 0.106805
[167]	valid_0's l2: 0.106705
[168]	valid_0's l2: 0.106638
[169]	valid_0's l2: 0.106514
[170]	valid_0's l2: 0.106384
[171]	valid_0's l2: 0.106304
[172]	valid_0's l2: 0.106133
[173]	valid_0's l2: 0.106015
[174]	valid_0's l2: 0.105924
[175]	valid_0's l2: 0.105863
[176]	valid_0's l2: 0.105869
[177]	valid_0's l2: 0.105779
[178]	valid_0's l2: 0.105551
[179]	valid_0's l2: 0.105415
[180]	valid_0's l2: 0.105419
[181]	valid_0's l2: 0.105338
[182]	valid_0's l2: 0.105306
[183]	valid_0's l2: 0.105239
[184]	valid_0's l2: 0.105156
[185]	valid_0's l2: 0.105091
[186]	valid_0's l2: 0.104991
[187]	valid_0's l2: 0.104883
[188]	valid_0's l2: 0.104786
[189]	valid_0's l2: 0.104672
[190]	valid_0's l2: 0.104588
[191]	valid_0's l2: 0.10446
[192]	valid_0's l2: 0.104321
[193]	valid_0's l2: 0.104167
[194]	valid_0's l2: 0.104112
[195]	valid_0's l2: 0.104088
[196]	valid_0's l2: 0.103962
[197]	valid_0's l2: 0.103874
[198]	valid_0's l2: 0.103831
[199]	valid_0's l2: 0.10369
[200]	valid_0's l2: 0.103604
[201]	valid_0's l2: 0.103497
[202]	valid_0's l2: 0.103407
[203]	valid_0's l2: 0.103347
[204]	valid_0's l2: 0.103283
[205]	valid_0's l2: 0.103201
[206]	valid_0's l2: 0.103152
[207]	valid_0's l2: 0.103015
[208]	valid_0's l2: 0.10293
[209]	valid_0's l2: 0.102952
[210]	valid_0's l2: 0.102864
[211]	valid_0's l2: 0.102785
[212]	valid_0's l2: 0.102692
[213]	valid_0's l2: 0.102638
[214]	valid_0's l2: 0.102561
[215]	valid_0's l2: 0.10256
[216]	valid_0's l2: 0.102528
[217]	valid_0's l2: 0.102487
[218]	valid_0's l2: 0.102494
[219]	valid_0's l2: 0.10243
[220]	valid_0's l2: 0.102457
[221]	valid_0's l2: 0.102365
[222]	valid_0's l2: 0.102228
[223]	valid_0's l2: 0.102156
[224]	valid_0's l2: 0.102063
[225]	valid_0's l2: 0.102145
[226]	valid_0's l2: 0.102083
[227]	valid_0's l2: 0.102066
[228]	valid_0's l2: 0.102016
[229]	valid_0's l2: 0.102006
[230]	valid_0's l2: 0.101921
[231]	valid_0's l2: 0.101862
[232]	valid_0's l2: 0.101847
[233]	valid_0's l2: 0.101761
[234]	valid_0's l2: 0.101755
[235]	valid_0's l2: 0.101747
[236]	valid_0's l2: 0.101702
[237]	valid_0's l2: 0.101608
[238]	valid_0's l2: 0.101607
[239]	valid_0's l2: 0.101464
[240]	valid_0's l2: 0.101384
[241]	valid_0's l2: 0.101364
[242]	valid_0's l2: 0.101346
[243]	valid_0's l2: 0.101302
[244]	valid_0's l2: 0.101225
[245]	valid_0's l2: 0.101195
[246]	valid_0's l2: 0.101125
[247]	valid_0's l2: 0.101072
[248]	valid_0's l2: 0.101005
[249]	valid_0's l2: 0.100976
[250]	valid_0's l2: 0.100874
[251]	valid_0's l2: 0.100874
[252]	valid_0's l2: 0.100843
[253]	valid_0's l2: 0.100802
[254]	valid_0's l2: 0.100783
[255]	valid_0's l2: 0.100779
[256]	valid_0's l2: 0.100681
[257]	valid_0's l2: 0.10069
[258]	valid_0's l2: 0.100656
[259]	valid_0's l2: 0.100587
[260]	valid_0's l2: 0.100539
[261]	valid_0's l2: 0.100492
[262]	valid_0's l2: 0.10048
[263]	valid_0's l2: 0.100485
[264]	valid_0's l2: 0.100487
[265]	valid_0's l2: 0.100397
[266]	valid_0's l2: 0.100363
[267]	valid_0's l2: 0.100408
[268]	valid_0's l2: 0.100387
[269]	valid_0's l2: 0.100363
[270]	valid_0's l2: 0.100378
[271]	valid_0's l2: 0.100295
[272]	valid_0's l2: 0.100238
[273]	valid_0's l2: 0.100259
[274]	valid_0's l2: 0.100276
[275]	valid_0's l2: 0.100261
[276]	valid_0's l2: 0.100274
[277]	valid_0's l2: 0.100225
[278]	valid_0's l2: 0.100204
[279]	valid_0's l2: 0.10016
[280]	valid_0's l2: 0.100246
[281]	valid_0's l2: 0.100259
[282]	valid_0's l2: 0.100302
[283]	valid_0's l2: 0.100283
[284]	valid_0's l2: 0.100334
[285]	valid_0's l2: 0.100346
[286]	valid_0's l2: 0.100354
[287]	valid_0's l2: 0.100367
[288]	valid_0's l2: 0.100362
[289]	valid_0's l2: 0.100308
Early stopping, best iteration is:
[279]	valid_0's l2: 0.10016
lgb now score is: [0.11464525315121991, 0.1001602219958722]
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000817 seconds.
You can set `force_col_wise=true` to remove the overhead.
[LightGBM] [Info] Total Bins 8847
[LightGBM] [Info] Number of data points in the train set: 2310, number of used features: 38
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Info] Start training from score 0.137391
[1]	valid_0's l2: 0.925026
Training until validation scores don't improve for 10 rounds
[2]	valid_0's l2: 0.882263
[3]	valid_0's l2: 0.841261
[4]	valid_0's l2: 0.802716
[5]	valid_0's l2: 0.766211
[6]	valid_0's l2: 0.732444
[7]	valid_0's l2: 0.700039
[8]	valid_0's l2: 0.669955
[9]	valid_0's l2: 0.641029
[10]	valid_0's l2: 0.614484
[11]	valid_0's l2: 0.588721
[12]	valid_0's l2: 0.564669
[13]	valid_0's l2: 0.541491
[14]	valid_0's l2: 0.519434
[15]	valid_0's l2: 0.498759
[16]	valid_0's l2: 0.478783
[17]	valid_0's l2: 0.460135
[18]	valid_0's l2: 0.443683
[19]	valid_0's l2: 0.427316
[20]	valid_0's l2: 0.411839
[21]	valid_0's l2: 0.396345
[22]	valid_0's l2: 0.382314
[23]	valid_0's l2: 0.368991
[24]	valid_0's l2: 0.356666
[25]	valid_0's l2: 0.344728
[26]	valid_0's l2: 0.332818
[27]	valid_0's l2: 0.322327
[28]	valid_0's l2: 0.312361
[29]	valid_0's l2: 0.302618
[30]	valid_0's l2: 0.293583
[31]	valid_0's l2: 0.284772
[32]	valid_0's l2: 0.27643
[33]	valid_0's l2: 0.268242
[34]	valid_0's l2: 0.260699
[35]	valid_0's l2: 0.25392
[36]	valid_0's l2: 0.247676
[37]	valid_0's l2: 0.241242
[38]	valid_0's l2: 0.235133
[39]	valid_0's l2: 0.229733
[40]	valid_0's l2: 0.224359
[41]	valid_0's l2: 0.219506
[42]	valid_0's l2: 0.214509
[43]	valid_0's l2: 0.210222
[44]	valid_0's l2: 0.205951
[45]	valid_0's l2: 0.201601
[46]	valid_0's l2: 0.197821
[47]	valid_0's l2: 0.193874
[48]	valid_0's l2: 0.190225
[49]	valid_0's l2: 0.1869
[50]	valid_0's l2: 0.183644
[51]	valid_0's l2: 0.180409
[52]	valid_0's l2: 0.177691
[53]	valid_0's l2: 0.174935
[54]	valid_0's l2: 0.172226
[55]	valid_0's l2: 0.169595
[56]	valid_0's l2: 0.167186
[57]	valid_0's l2: 0.164992
[58]	valid_0's l2: 0.162876
[59]	valid_0's l2: 0.16082
[60]	valid_0's l2: 0.158811
[61]	valid_0's l2: 0.156882
[62]	valid_0's l2: 0.155007
[63]	valid_0's l2: 0.153246
[64]	valid_0's l2: 0.151493
[65]	valid_0's l2: 0.149942
[66]	valid_0's l2: 0.148382
[67]	valid_0's l2: 0.146962
[68]	valid_0's l2: 0.14547
[69]	valid_0's l2: 0.144072
[70]	valid_0's l2: 0.142969
[71]	valid_0's l2: 0.141822
[72]	valid_0's l2: 0.140529
[73]	valid_0's l2: 0.139502
[74]	valid_0's l2: 0.138325
[75]	valid_0's l2: 0.13723
[76]	valid_0's l2: 0.136285
[77]	valid_0's l2: 0.13545
[78]	valid_0's l2: 0.134449
[79]	valid_0's l2: 0.13355
[80]	valid_0's l2: 0.13274
[81]	valid_0's l2: 0.132011
[82]	valid_0's l2: 0.131104
[83]	valid_0's l2: 0.130412
[84]	valid_0's l2: 0.129725
[85]	valid_0's l2: 0.129029
[86]	valid_0's l2: 0.12832
[87]	valid_0's l2: 0.127588
[88]	valid_0's l2: 0.126903
[89]	valid_0's l2: 0.126178
[90]	valid_0's l2: 0.125507
[91]	valid_0's l2: 0.12504
[92]	valid_0's l2: 0.124611
[93]	valid_0's l2: 0.124076
[94]	valid_0's l2: 0.123501
[95]	valid_0's l2: 0.122931
[96]	valid_0's l2: 0.122337
[97]	valid_0's l2: 0.121932
[98]	valid_0's l2: 0.121426
[99]	valid_0's l2: 0.121102
[100]	valid_0's l2: 0.12074
[101]	valid_0's l2: 0.12041
[102]	valid_0's l2: 0.119975
[103]	valid_0's l2: 0.119506
[104]	valid_0's l2: 0.119246
[105]	valid_0's l2: 0.119108
[106]	valid_0's l2: 0.118802
[107]	valid_0's l2: 0.118554
[108]	valid_0's l2: 0.118359
[109]	valid_0's l2: 0.118068
[110]	valid_0's l2: 0.117868
[111]	valid_0's l2: 0.117693
[112]	valid_0's l2: 0.117375
[113]	valid_0's l2: 0.117295
[114]	valid_0's l2: 0.117114
[115]	valid_0's l2: 0.116833
[116]	valid_0's l2: 0.116496
[117]	valid_0's l2: 0.116232
[118]	valid_0's l2: 0.115975
[119]	valid_0's l2: 0.115697
[120]	valid_0's l2: 0.115323
[121]	valid_0's l2: 0.114989
[122]	valid_0's l2: 0.114795
[123]	valid_0's l2: 0.114335
[124]	valid_0's l2: 0.114077
[125]	valid_0's l2: 0.11381
[126]	valid_0's l2: 0.113463
[127]	valid_0's l2: 0.113259
[128]	valid_0's l2: 0.113066
[129]	valid_0's l2: 0.112829
[130]	valid_0's l2: 0.112548
[131]	valid_0's l2: 0.112225
[132]	valid_0's l2: 0.112089
[133]	valid_0's l2: 0.111891
[134]	valid_0's l2: 0.111786
[135]	valid_0's l2: 0.111564
[136]	valid_0's l2: 0.111354
[137]	valid_0's l2: 0.111186
[138]	valid_0's l2: 0.110975
[139]	valid_0's l2: 0.110834
[140]	valid_0's l2: 0.110615
[141]	valid_0's l2: 0.110417
[142]	valid_0's l2: 0.110173
[143]	valid_0's l2: 0.109991
[144]	valid_0's l2: 0.109864
[145]	valid_0's l2: 0.109838
[146]	valid_0's l2: 0.109655
[147]	valid_0's l2: 0.109553
[148]	valid_0's l2: 0.109457
[149]	valid_0's l2: 0.109316
[150]	valid_0's l2: 0.109164
[151]	valid_0's l2: 0.10898
[152]	valid_0's l2: 0.108961
[153]	valid_0's l2: 0.108942
[154]	valid_0's l2: 0.10879
[155]	valid_0's l2: 0.10868
[156]	valid_0's l2: 0.108643
[157]	valid_0's l2: 0.108405
[158]	valid_0's l2: 0.108422
[159]	valid_0's l2: 0.10839
[160]	valid_0's l2: 0.108262
[161]	valid_0's l2: 0.108223
[162]	valid_0's l2: 0.108148
[163]	valid_0's l2: 0.10811
[164]	valid_0's l2: 0.107988
[165]	valid_0's l2: 0.107869
[166]	valid_0's l2: 0.107822
[167]	valid_0's l2: 0.107675
[168]	valid_0's l2: 0.107646
[169]	valid_0's l2: 0.107497
[170]	valid_0's l2: 0.107425
[171]	valid_0's l2: 0.107329
[172]	valid_0's l2: 0.107197
[173]	valid_0's l2: 0.107198
[174]	valid_0's l2: 0.107122
[175]	valid_0's l2: 0.107053
[176]	valid_0's l2: 0.106904
[177]	valid_0's l2: 0.10679
[178]	valid_0's l2: 0.106755
[179]	valid_0's l2: 0.106669
[180]	valid_0's l2: 0.10663
[181]	valid_0's l2: 0.106529
[182]	valid_0's l2: 0.106497
[183]	valid_0's l2: 0.106452
[184]	valid_0's l2: 0.106409
[185]	valid_0's l2: 0.106313
[186]	valid_0's l2: 0.106197
[187]	valid_0's l2: 0.106146
[188]	valid_0's l2: 0.106034
[189]	valid_0's l2: 0.10599
[190]	valid_0's l2: 0.105846
[191]	valid_0's l2: 0.105816
[192]	valid_0's l2: 0.105789
[193]	valid_0's l2: 0.105665
[194]	valid_0's l2: 0.105676
[195]	valid_0's l2: 0.105641
[196]	valid_0's l2: 0.105541
[197]	valid_0's l2: 0.105433
[198]	valid_0's l2: 0.105324
[199]	valid_0's l2: 0.105224
[200]	valid_0's l2: 0.105179
[201]	valid_0's l2: 0.105183
[202]	valid_0's l2: 0.105147
[203]	valid_0's l2: 0.105078
[204]	valid_0's l2: 0.105004
[205]	valid_0's l2: 0.104887
[206]	valid_0's l2: 0.104842
[207]	valid_0's l2: 0.104784
[208]	valid_0's l2: 0.104785
[209]	valid_0's l2: 0.104708
[210]	valid_0's l2: 0.104679
[211]	valid_0's l2: 0.104632
[212]	valid_0's l2: 0.104641
[213]	valid_0's l2: 0.104635
[214]	valid_0's l2: 0.104577
[215]	valid_0's l2: 0.104529
[216]	valid_0's l2: 0.104524
[217]	valid_0's l2: 0.104494
[218]	valid_0's l2: 0.104406
[219]	valid_0's l2: 0.104381
[220]	valid_0's l2: 0.104399
[221]	valid_0's l2: 0.104383
[222]	valid_0's l2: 0.104298
[223]	valid_0's l2: 0.104221
[224]	valid_0's l2: 0.1042
[225]	valid_0's l2: 0.104102
[226]	valid_0's l2: 0.1041
[227]	valid_0's l2: 0.104083
[228]	valid_0's l2: 0.104095
[229]	valid_0's l2: 0.104039
[230]	valid_0's l2: 0.104092
[231]	valid_0's l2: 0.104071
[232]	valid_0's l2: 0.103929
[233]	valid_0's l2: 0.103913
[234]	valid_0's l2: 0.103853
[235]	valid_0's l2: 0.103845
[236]	valid_0's l2: 0.103753
[237]	valid_0's l2: 0.103843
[238]	valid_0's l2: 0.103845
[239]	valid_0's l2: 0.103858
[240]	valid_0's l2: 0.103875
[241]	valid_0's l2: 0.103815
[242]	valid_0's l2: 0.103871
[243]	valid_0's l2: 0.103868
[244]	valid_0's l2: 0.103859
[245]	valid_0's l2: 0.103853
[246]	valid_0's l2: 0.103829
Early stopping, best iteration is:
[236]	valid_0's l2: 0.103753
lgb now score is: [0.11464525315121991, 0.1001602219958722, 0.10375315940652878]
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Info] Total Bins 8852
[LightGBM] [Info] Number of data points in the train set: 2311, number of used features: 38
[LightGBM] [Info] Start training from score 0.126265
[1]	valid_0's l2: 0.908704
Training until validation scores don't improve for 10 rounds
[2]	valid_0's l2: 0.868691
...
[357]	valid_0's l2: 0.115236
...
[367]	valid_0's l2: 0.115375
Early stopping, best iteration is:
[357]	valid_0's l2: 0.115236
lgb now score is: [0.11464525315121991, 0.1001602219958722, 0.10375315940652878, 0.11523585238782036]
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Info] Total Bins 8848
[LightGBM] [Info] Number of data points in the train set: 2311, number of used features: 38
[LightGBM] [Info] Start training from score 0.130812
[1]	valid_0's l2: 0.936206
Training until validation scores don't improve for 10 rounds
[2]	valid_0's l2: 0.895841
...
[321]	valid_0's l2: 0.106614
...
[331]	valid_0's l2: 0.106719
Early stopping, best iteration is:
[321]	valid_0's l2: 0.106614
lgb now score is: [0.11464525315121991, 0.1001602219958722, 0.10375315940652878, 0.11523585238782036, 0.10661431579572031]
lgb_score_list: [0.11464525315121991, 0.1001602219958722, 0.10375315940652878, 0.11523585238782036, 0.10661431579572031]
lgb_score_mean: 0.1080817605474323
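`lgb_score_list` holds the five per-fold validation MSEs and `lgb_score_mean` (≈0.108) is their average, the cross-validation estimate for this LightGBM model. The repeated `Unknown parameter: tree_method / colsample_bylevel / silent` warnings indicate that XGBoost-style constructor arguments were passed into LightGBM, which simply ignores them. Below is a minimal sketch of a 5-fold training loop that produces output of this shape; it assumes the pre-4.0 LightGBM sklearn API, and the hyperparameters, the `random_state`, and the `X_train`/`y_train` names are assumptions rather than the author's exact code:

import numpy as np
import lightgbm as lgb
from sklearn.model_selection import KFold
from sklearn.metrics import mean_squared_error

# Assumed 5-fold CV loop; X_train / y_train are the DataFrames returned by
# get_training_data() earlier in this series.
lgb_score_list = []
kf = KFold(n_splits=5, shuffle=True, random_state=2021)  # random_state is an assumption
for train_idx, valid_idx in kf.split(X_train):
    X_tr, X_va = X_train.iloc[train_idx], X_train.iloc[valid_idx]
    y_tr, y_va = y_train.iloc[train_idx], y_train.iloc[valid_idx]
    reg = lgb.LGBMRegressor(learning_rate=0.05, n_estimators=1000)  # hyperparameters assumed
    # Pre-4.0 sklearn API; on LightGBM >= 4 use
    # callbacks=[lgb.early_stopping(10), lgb.log_evaluation(1)] instead.
    reg.fit(X_tr, y_tr,
            eval_set=[(X_va, y_va)],
            eval_metric="l2",
            early_stopping_rounds=10,  # matches "10 rounds" in the log above
            verbose=True)
    mse = mean_squared_error(y_va, reg.predict(X_va, num_iteration=reg.best_iteration_))
    lgb_score_list.append(mse)
    print("lgb now score is:", lgb_score_list)
print("lgb_score_list:", lgb_score_list)
print("lgb_score_mean:", np.mean(lgb_score_list))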
(2888, 38)
(2888, 39)
lgb_reg
False
lgb_reg
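The two shape tuples and the `lgb_reg` / `False` lines look like debug prints from the stacking stage: `(2888, 38)` and `(2888, 39)` appear to be the full training matrix before and after one extra column is added, and the boolean presumably reports whether a saved `lgb_reg` model already exists before retraining on all 2888 rows. This reading is inferred from the output, not from code shown here.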
[LightGBM] [Warning] Unknown parameter: tree_method
[LightGBM] [Warning] Unknown parameter: colsample_bylevel
[LightGBM] [Warning] Unknown parameter: silent
[LightGBM] [Info] Total Bins 9126
[LightGBM] [Info] Number of data points in the train set: 2888, number of used features: 39
[LightGBM] [Info] Start training from score 0.126353
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]	valid_0's l2: 0.920198
Training until validation scores don't improve for 10 rounds
[2]	valid_0's l2: 0.877501
...
[999]	valid_0's l2: 0.00392469
[1000]	valid_0's l2: 0.00391117
Did not meet early stopping. Best iteration is:
[1000]	valid_0's l2: 0.00391117
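In this final run the train set has 2888 rows and 39 features, i.e. the full training data (apparently with one stacked feature added), and the eval set is evidently the training data itself: the logged l2 falls to ≈0.0039 and early stopping never fires, so this number measures fit rather than generalization. A minimal sketch of this retrain-and-predict step, under the same assumptions as the CV sketch above (`X_all`, `y_all`, `X_test` are assumed names):

# Assumed final stage: retrain on the full training set, then predict the test set.
final_reg = lgb.LGBMRegressor(learning_rate=0.05, n_estimators=1000)  # params assumed
final_reg.fit(X_all, y_all,
              eval_set=[(X_all, y_all)],  # evaluating on the training data itself,
              eval_metric="l2",           # hence the very low logged l2 (~0.0039)
              early_stopping_rounds=10,
              verbose=True)
pred = final_reg.predict(X_test)  # X_test: the competition test rows
print("pred")
print(pred.reshape(-1, 1))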
pred
[[ 0.47436676]
 [ 0.37895368]
 [ 0.08911858]
 ...
 [-2.55487643]
 [-2.56092274]
 [-2.54996901]]
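The printed column vector is the final prediction for the test set. To submit it, a one-liner such as the following (filename assumed) writes the values in the plain-text, one-value-per-line format the competition expects:

np.savetxt("lgb_submission.txt", pred)  # filename is an assumption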