machine learning
Keyboard Interrupt
github: https://github.com/ScottPanIE
Notes on Ensemble Learning

Contents: Ensemble Learning · Sequential vs. Parallel Methods · Random Forest · GBDT · Differences Between RF and GBDT · XGBoost (Extreme Gradient Boosting) · Pros and Cons of XGBoost · LightGBM · XGBoost Parameter Tuning · LightGBM Parameter Tuning

The core idea of ensemble learning is to take a weighted combination of weak learners (linear models, decision trees, etc.) to produce a strong learner with considerably better performance. RF, GBDT, XGBoost, and LightGBM are all ensemble methods, so the post starts with a brief introduction to ensemble learning. The goal of ensemble learning (Ensemble Learning) is to combine the predictions of multiple base learners in order to improve their generalization ability and…

Original · 2020-11-16 19:06:36 · 582 reads · 0 comments
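The weighted combination of weak learners described in the summary can be sketched in a few lines. This is a minimal illustration only: the "learners" below are hypothetical hand-written threshold rules, not trained models, and the weights are made up.

```python
# Minimal sketch of the ensemble idea: combine several weak classifiers
# (each returning +1 or -1) by a weighted vote; the sign of the weighted
# sum is the ensemble's prediction.

def ensemble_predict(x, learners, weights):
    """Return the sign of the weighted sum of weak-learner outputs."""
    score = sum(w * h(x) for h, w in zip(learners, weights))
    return 1 if score >= 0 else -1

# Three illustrative weak rules on a single numeric feature.
learners = [
    lambda x: 1 if x > 0 else -1,
    lambda x: 1 if x > 2 else -1,
    lambda x: 1 if x > -2 else -1,
]
weights = [0.5, 0.3, 0.2]

print(ensemble_predict(1.0, learners, weights))   # -> 1 (weighted vote is positive)
print(ensemble_predict(-3.0, learners, weights))  # -> -1 (all three rules vote negative)
```

Boosting methods such as GBDT and XGBoost learn the weights and the weak learners sequentially; bagging methods such as Random Forest train them in parallel and average, but the combination step has this same shape.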
pandas 1.0: Translation and Partial Notes

Contents: Abstract · Pandas 1.0.0 What's New? · New Deprecation Policy · Enhancements · Using Numba in rolling.apply and expanding.apply · Defining Custom Windows for Rolling Operations · Converting to Markdown · Experimental New Features · Experimental NA scalar to denote missing v…

Original · 2020-07-16 11:36:15 · 811 reads · 0 comments
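One of the experimental pandas 1.0 features listed in the contents, the `pd.NA` scalar, is easy to demonstrate. A small sketch with made-up values, assuming pandas >= 1.0:

```python
# pd.NA: a single experimental missing-value marker introduced in pandas 1.0,
# usable across dtypes (here with the nullable Int64 dtype).
import pandas as pd

s = pd.Series([1, None, 3], dtype="Int64")  # None becomes <NA>, dtype stays integer
print(s.isna().tolist())   # [False, True, False]

# pd.NA follows three-valued (Kleene) logic in boolean operations:
print(pd.NA | True)        # True  (result is known regardless of NA)
print(pd.NA & False)       # False (result is known regardless of NA)
```

With the older `np.nan` marker, an integer column holding a missing value was silently upcast to float; the nullable dtypes keep their integer type.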
L1 and L2 Regularization (Regularized Loss Functions)

In mathematics, statistics, and computer science, particularly in machine learning and inverse problems, regularization is the process of adding information in order to solve an ill-posed problem or to prevent overfitting.[1]

Original · 2020-07-09 16:54:43 · 1218 reads · 0 comments
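The "added information" in L1 and L2 regularization is a penalty term on the model weights, added to the training loss. A minimal sketch with illustrative weights and a made-up regularization strength `lam`:

```python
# The two penalty terms named in the title, as added to a loss function:
#   L1 (lasso):  lam * sum(|w_i|)   -- encourages sparse weights
#   L2 (ridge):  lam * sum(w_i^2)   -- shrinks all weights smoothly

def l1_penalty(w, lam):
    return lam * sum(abs(wi) for wi in w)

def l2_penalty(w, lam):
    return lam * sum(wi * wi for wi in w)

w = [0.5, -1.0, 2.0]   # illustrative weight vector
print(l1_penalty(w, 0.5))  # 1.75
print(l2_penalty(w, 0.5))  # 2.625
```

The regularized objective is then `loss(w) + penalty(w)`; larger `lam` trades training fit for simpler weights.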
A First Look at Shapley Values

Contents: Intro · Shapley Additive Explanations · Definition · Shapley Values · Definition · Example for the General Idea · Axioms of the Shapley Value (Symmetry, Dummy Players / Free Riders, Additivity) · Theorem of the Shapley Value (and Calculation) · Interpretation of the Formula

Intro: From a colleague…

Original · 2020-06-17 20:00:08 · 1068 reads · 0 comments
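The Shapley value mentioned in the contents can be computed exactly for toy games: each player's value is their marginal contribution averaged over all orders in which the players could join. A small sketch with a made-up two-player characteristic function `v`:

```python
# Exact Shapley values by enumerating all join orders (feasible only for
# tiny games; SHAP-style explanations approximate this for real models).
import math
from itertools import permutations

def shapley_values(players, v):
    """phi[p] = average of p's marginal contribution v(S + p) - v(S)
    over all orderings of the players."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = set()
        for p in order:
            before = v(frozenset(coalition))
            coalition.add(p)
            phi[p] += v(frozenset(coalition)) - before
    n_orders = math.factorial(len(players))
    return {p: total / n_orders for p, total in phi.items()}

# Illustrative game: the coalition is worth 1 only if both players join.
def v(coalition):
    return 1.0 if coalition == {"A", "B"} else 0.0

print(shapley_values(["A", "B"], v))  # {'A': 0.5, 'B': 0.5}
```

The symmetric split reflects the symmetry axiom from the post's contents: A and B contribute identically, so they receive equal credit.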