Comparing XGBoost's Native API with the scikit-learn-Style xgboost API

This post compares XGBoost's native API with the library's scikit-learn-style API on classification and regression tasks. In the classification task, the two interfaces produce different training and test error rates; in the regression task, each reaches its lowest root-mean-square error at a different number of boosting rounds. The comparison should help developers decide which interface better fits their use case.

1. XGBoost native API: classification

import numpy as np
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
import xgboost as xgb

data = load_iris()

x = data.data
y = data.target

x_train, x_test, y_train, y_test = train_test_split(x, y, random_state=0, test_size=0.2)

params = {
    'eta': 0.1,
    'max_depth': 2,
    'min_child_weight': 3,
    'gamma': 0,
    'subsample': 0.8,
    'objective': 'multi:softmax',  # multi-class classification
    'num_class': 3  # iris has 3 classes
}

dtrain = xgb.DMatrix(x_train, y_train)
dtest = xgb.DMatrix(x_test, y_test)

xgbclassifier = xgb.train(params=params, dtrain=dtrain, num_boost_round=100,
                          early_stopping_rounds=5,
                          evals=[(dtrain, 'train'), (dtest, 'test')])

y_pred = xgbclassifier.predict(xgb.DMatrix(x_test))

print(accuracy_score(y_test, y_pred))

[0] train-merror:0.033333 test-merror:0
Multiple eval metrics have been passed: 'test-merror' will be used for early stopping.

Will train until test-merror hasn't improved in 5 rounds.
[1] train-merror:0.041667 test-merror:0
[2] train-merror:0.041667 test-merror:0.033333
[3] train-merror:0.041667 test-merror:0.033333
[4] train-merror:0.041667 test-merror:0.033333
[5] train-merror:0.041667 test-merror:0.033333
Stopping. Best iteration:
[0] train-merror:0.033333 test-merror:0

0.9666666666666667

2. XGBoost scikit-learn-style API (XGBClassifier): classification

import numpy as np
from xgboost import XGBClassifier
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
import xgboost as xgb

data = load_iris()

x = data.data
y = data.target