Extreme Gradient Boosting: XGB (eXtreme Gradient Boosting)
Algorithm Principle
XGB evolved from GBDT, which was introduced in the previous post "Ensemble Algorithms: Gradient Boosting Trees (GBDT)". The key difference lies in how the final prediction is formed. In GBDT, the prediction is a weighted sum of the predictions of all weak learners, where each sample's prediction from a tree is the mean of the samples in the leaf node it falls into. In XGB, the prediction is the direct sum of the leaf weights across all weak learners.
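To make this concrete, XGB's prediction can be written in the notation of the XGBoost paper, where each tree $f_k$ maps a sample to a leaf via $q_k$ and returns that leaf's learned weight $w^{(k)}$:

$$\hat{y}_i = \sum_{k=1}^{K} f_k(x_i), \qquad f_k(x_i) = w^{(k)}_{q_k(x_i)}$$

So the model sums raw leaf weights directly, rather than averaging leaf values and weighting each tree's output as GBDT does.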
Calling the Model
from xgboost import XGBRegressor as XGBR, XGBClassifier as XGBC
from sklearn import datasets
from sklearn.model_selection import cross_val_score, train_test_split
# Call the classification model
iris = datasets.load_iris()    # load the iris dataset
iris_x = iris.data             # feature matrix
iris_y = iris.target           # class labels
# hold out 30% of the samples as a test set
x_train, x_test, y_train, y_test = train_test_split(iris_x, iris_y, test_size = 0.3)
cls = XGBC(n_estimators = 100, learning_rate = 0.8, max_depth = 10)
cls = cls.fit(x_train, y_train)    # train on the training split
result = cls.predict(x_test)       # predict labels for the test split
print(result)
[1 0 0 1 2 2 2 2 0 2 0 1 2 1 2 0 2 1 1 2 2 1 1 1 1 1 2 1 2 0 1 0 ...
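The XGBR and cross_val_score imports above are not exercised by the classification snippet. Below is a minimal regression sketch that uses them, assuming the sklearn diabetes dataset as a convenient stand-in; the dataset choice and hyperparameters are illustrative, not from the original post.

# Call the regression model
diabetes = datasets.load_diabetes()    # stand-in regression dataset (assumption)
data_x = diabetes.data
data_y = diabetes.target
reg = XGBR(n_estimators = 100, learning_rate = 0.1, max_depth = 5)
# 5-fold cross-validation; scoring defaults to the regressor's R^2 score
scores = cross_val_score(reg, data_x, data_y, cv = 5)
print(scores.mean())

Because XGBRegressor follows the sklearn estimator interface, it can be passed to cross_val_score directly, just like any sklearn regressor.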