10 Hyperparameter Optimization Frameworks

Tune your Machine Learning models with open-source optimization libraries

Introduction

Hyper-parameters are the parameters used to control the behavior of an algorithm while building the model. These parameters cannot be learned from the regular training process; they need to be assigned before training the model.

Example: n_neighbors (KNN), kernel (SVC), max_depth and criterion (Decision Tree Classifier), etc.
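To make the distinction concrete, here is a minimal scikit-learn sketch (the specific values are illustrative assumptions): the constructor arguments below are hyper-parameters we fix up front, while the quantities learned inside fit() are ordinary model parameters.

from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Hyper-parameters are chosen before training starts (values are illustrative)
knn = KNeighborsClassifier(n_neighbors=5)
tree = DecisionTreeClassifier(max_depth=3, criterion="gini")
# The learned parameters (neighbor structure, tree splits) only appear in fit().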

Hyperparameter optimization, or tuning, in machine learning is the process of selecting the combination of hyper-parameters that delivers the best performance.

Various automatic optimization techniques exist, and each has its own strengths and drawbacks when applied to different types of problems.

Example: Grid Search, Random Search, Bayesian Search, etc.
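For reference, this is what a plain grid search looks like in scikit-learn before reaching for a dedicated framework; the model and parameter values here are illustrative assumptions.

from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Grid search exhaustively cross-validates every combination in the grid.
grid = GridSearchCV(SVC(), {"kernel": ["linear", "rbf"], "C": [0.1, 1, 10]}, cv=5)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)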

Scikit-learn is one of the frameworks we could use for hyperparameter optimization, but there are other frameworks that could perform even better:

  1. Ray-Tune
  2. Optuna
  3. Hyperopt
  4. mlmachine
  5. Polyaxon
  6. BayesianOptimization
  7. Talos
  8. SHERPA
  9. Scikit-Optimize
  10. GPyOpt

1. Ray-Tune

Tune is a Python library for experiment execution and hyperparameter tuning at any scale. [GitHub]

Key Features

  1. Launch a multi-node distributed hyperparameter sweep in less than ten lines of code (see the sketch after this list).

  2. Supports any machine learning framework, including PyTorch, XGBoost, MXNet, and Keras.

  3. Choose among state-of-the-art algorithms such as Population Based Training (PBT), BayesOptSearch, and HyperBand/ASHA.

  4. Tune's Search Algorithms are wrappers around open-source optimization libraries such as HyperOpt, SigOpt, Dragonfly, and Facebook Ax.

  5. Automatically visualize results with TensorBoard.
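As a taste of the first feature, here is a hedged sketch of Tune's core functional API (circa the ray[tune] 1.x releases this article targets); the toy objective and search space are illustrative assumptions, not from the original article.

from ray import tune

def objective(config):
    # Toy objective: minimize (x - 2)^2 over the sampled value of x.
    score = (config["x"] - 2) ** 2
    tune.report(loss=score)

analysis = tune.run(
    objective,
    config={"x": tune.uniform(-10, 10)},  # search space
    num_samples=20,                       # number of trials to run
)
print(analysis.get_best_config(metric="loss", mode="min"))

By default Tune logs each trial under ~/ray_results, so tensorboard --logdir ~/ray_results is enough to get the TensorBoard view mentioned above.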

Tune for Scikit-Learn

Installation: pip install ray[tune] tune-sklearn

# from sklearn.model_selection import GridSearchCV
from ray.tune.sklearn import TuneGridSearchCV
from sklearn.model_selection import train_test_split
from sklearn.linear_model import SGDClassifier
from sklearn.datasets import load_iris
import numpy as np

iris = load_iris()
X = iris.data
y = iris.target

x_train, x_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=14)

# Example parameters to tune from SGDClassifier
parameter_grid = {"alpha": [1e-4, 1e-1, 1], "epsilon": [0.01, 0.1]}

tune_search = TuneGridSearchCV(
    SGDClassifier(),
    parameter_grid,
    early_stopping=True,  # stop unpromising configurations early
    max_iters=10,
)
tune_search.fit(x_train, y_train)

pred = tune_search.predict(x_test)
accuracy = np.count_nonzero(np.array(pred) == np.array(y_test)) / len(pred)
print("Test accuracy:", accuracy)
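Note the commented-out GridSearchCV import at the top: tune-sklearn is designed as a drop-in replacement, so TuneGridSearchCV mirrors scikit-learn's GridSearchCV interface and switching between the two costs a single import line.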