Using AppleClang
Building the GPU version
```shell
export CC=clang
export CXX=clang++
brew install boost   # Boost is required; the build fails without it
brew install libomp  # also required
mkdir build; cd build
cmake -DUSE_GPU=1 ..
make -j4             # 4 CPU cores; 4 is usually the default
```
Installing the Python package
```shell
cd ../python-package
python setup.py install --gpu
```
Test code, adapted from:
https://github.com/Microsoft/LightGBM/tree/master/examples/python-guide
Test script:
```python
import lightgbm as lgb
import time

dtrain = lgb.Dataset('/LightGBM/tests/data/categorical.data')
params = {
    'max_bin': 63,
    'num_leaves': 255,
    'learning_rate': 0.1,
    'tree_learner': 'serial',
    'task': 'train',
    'is_training_metric': 'false',
    'min_data_in_leaf': 1,
    'min_sum_hessian_in_leaf': 100,
    'ndcg_eval_at': [1, 3, 5, 10],
    'sparse_threshold': 1.0,
    'device': 'gpu',
    'gpu_platform_id': 1,
    'gpu_device_id': 0,
}

t0 = time.time()
gbm = lgb.train(params,
                train_set=dtrain,
                num_boost_round=10,
                valid_sets=None,
                valid_names=None,
                fobj=None,
                feval=None,
                init_model=None,
                feature_name='auto',
                categorical_feature='auto',
                early_stopping_rounds=None,
                evals_result=None,
                verbose_eval=True,
                keep_training_booster=False,
                callbacks=None)
t1 = time.time()
print('gpu version elapse time: {}'.format(t1 - t0))
```
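The `t0`/`t1` timing pattern above can be factored into a small reusable helper. This is just a standard-library sketch (the `timer` name is my own, not part of LightGBM):

```python
import time
from contextlib import contextmanager

@contextmanager
def timer(label):
    # Measures wall-clock time of the enclosed block, mirroring the
    # t0 = time.time() ... t1 = time.time() pattern used above.
    t0 = time.time()
    yield
    print('{} elapse time: {}'.format(label, time.time() - t0))

# Usage (assuming params/dtrain as above):
# with timer('gpu version'):
#     gbm = lgb.train(params, train_set=dtrain, num_boost_round=10)
```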
GPU version output
```
[LightGBM] [Info] This is the GPU trainer!!
[LightGBM] [Info] Total Bins 325
[LightGBM] [Info] Number of data: 7000, number of used features: 8
[LightGBM] [Info] Using GPU Device: GeForce GTX 1070, Vendor: NVIDIA
[LightGBM] [Info] Compiling OpenCL Kernel with 64 bins...
[LightGBM] [Info] GPU programs have been built
[LightGBM] [Info] Size of histogram bin entry: 12
[LightGBM] [Info] 8 dense feature groups (0.05 MB) transferred to GPU in 0.261116 secs. 0 sparse feature groups
[LightGBM] [Info] Start training from score 0.181286
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
gpu version elapse time: 8.268070697784424
```
CPU version
```python
params = {
    'max_bin': 63,
    'num_leaves': 255,
    'learning_rate': 0.1,
    'tree_learner': 'serial',
    'task': 'train',
    'is_training_metric': 'false',
    'min_data_in_leaf': 1,
    'min_sum_hessian_in_leaf': 100,
    'ndcg_eval_at': [1, 3, 5, 10],
    'sparse_threshold': 1.0,
    'device': 'cpu',
}
print("*****************************")
t0 = time.time()
gbm = lgb.train(params,
                train_set=dtrain,
                num_boost_round=10,
                valid_sets=None,
                valid_names=None,
                fobj=None,
                feval=None,
                init_model=None,
                feature_name='auto',
                categorical_feature='auto',
                early_stopping_rounds=None,
                evals_result=None,
                verbose_eval=True,
                keep_training_booster=False,
                callbacks=None)
t1 = time.time()
print('cpu version elapse time: {}'.format(t1 - t0))
```
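The GPU and CPU runs differ only in the device-related keys, so the shared part of `params` can be factored out. A sketch (the variable names are mine):

```python
# Keys shared by both benchmark runs.
base_params = {
    'max_bin': 63,
    'num_leaves': 255,
    'learning_rate': 0.1,
    'tree_learner': 'serial',
    'task': 'train',
    'is_training_metric': 'false',
    'min_data_in_leaf': 1,
    'min_sum_hessian_in_leaf': 100,
    'ndcg_eval_at': [1, 3, 5, 10],
    'sparse_threshold': 1.0,
}

# dict(base, **overrides) copies base_params, so the two configs are independent.
gpu_params = dict(base_params, device='gpu', gpu_platform_id=1, gpu_device_id=0)
cpu_params = dict(base_params, device='cpu')
```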
CPU version output
```
[LightGBM] [Info] Total Bins 325
[LightGBM] [Info] Number of data: 7000, number of used features: 8
[LightGBM] [Info] Start training from score 0.181286
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
cpu version elapse time: 0.0941917896270752
```
The GPU version actually takes longer than the CPU version. This is most likely because the dataset is too small: the fixed cost of transferring data to the GPU and compiling the OpenCL kernels (visible in the GPU log above) dominates such a short training run.
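A toy cost model (my own illustration with made-up numbers, not measurements) shows why small datasets favor the CPU: the GPU pays a fixed setup cost for data transfer and kernel compilation before its faster per-row processing can pay off.

```python
def gpu_crossover_rows(gpu_setup_s, gpu_per_row_s, cpu_per_row_s):
    """Smallest dataset size (rows) at which the GPU run becomes faster,
    under a toy linear cost model:
        cpu_time(n) = n * cpu_per_row_s
        gpu_time(n) = gpu_setup_s + n * gpu_per_row_s
    Returns None if the GPU is never faster in this model."""
    if cpu_per_row_s <= gpu_per_row_s:
        return None  # GPU has no per-row advantage, so setup cost never amortizes
    return gpu_setup_s / (cpu_per_row_s - gpu_per_row_s)

# Example (hypothetical numbers): 8 s of GPU setup, GPU 10x faster per row.
# The break-even point is around 900k rows -- far more than the 7000 rows here.
# print(gpu_crossover_rows(8.0, 1e-6, 1e-5))
```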
If the Boost library is missing, CMake reports:

```
-- Found OpenCL: /System/Library/Frameworks/OpenCL.framework (found version "1.2")
-- OpenCL include directory:/System/Library/Frameworks/OpenCL.framework
CMake Error at /usr/local/Cellar/cmake/3.14.1/share/cmake/Modules/FindBoost.cmake:2132 (message):
  Unable to find the requested Boost libraries.

  Unable to find the Boost header files.  Please set BOOST_ROOT to the root
  directory containing Boost or BOOST_INCLUDEDIR to the directory containing
  Boost's headers.
Call Stack (most recent call first):
  CMakeLists.txt:100 (find_package)

CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
Boost_INCLUDE_DIR (ADVANCED)
   used as include directory in directory /Users/yh/Downloads/LightGBM
   used as include directory in directory /Users/yh/Downloads/LightGBM
   used as include directory in directory /Users/yh/Downloads/LightGBM
   used as include directory in directory /Users/yh/Downloads/LightGBM
   used as include directory in directory /Users/yh/Downloads/LightGBM
   used as include directory in directory /Users/yh/Downloads/LightGBM
   used as include directory in directory /Users/yh/Downloads/LightGBM
   used as include directory in directory /Users/yh/Downloads/LightGBM

-- Configuring incomplete, errors occurred!
See also "/Users/yh/Downloads/LightGBM/build/CMakeFiles/CMakeOutput.log".
See also "/Users/yh/Downloads/LightGBM/build/CMakeFiles/CMakeError.log".
```

Installing Boost (`brew install boost`, as in the build steps above) or setting `BOOST_ROOT`, as the error message itself suggests, resolves this.
Reference links