XGBoost Tuning Guide
Method 1
Tune the parameters one at a time, in the order max_depth, min_child_weight, colsample_bytree, eta, keeping all other parameters fixed while each one is tuned.
Method 2: Preventing Overfitting
When you observe high training accuracy but low test accuracy, you have likely run into an overfitting problem.
There are, in general, two ways to control overfitting in XGBoost:
- The first way is to directly control model complexity. This includes max_depth, min_child_weight, and gamma.
- The second way is to add randomness to make training robust to noise. This includes subsample and colsample_bytree.
You can also reduce the step size eta, but remember to increase num_round when you do so.
XGBoost Parameters
Reference 1: official documentation; Reference 2: CSDN
XGBoost parameters fall into three categories:
- general parameters: decide which booster to use; the options are gbtree, dart, and gblinear
- booster parameters: each booster takes its own set of parameters