Hyperparameter Tuning with Ray Tune
Hyperparameter tuning with Ray Tune is natively supported with Ray Train.
The Tuner takes in a Trainer and executes multiple training runs, each with a different hyperparameter configuration.
Key Concepts
There are a number of key concepts when doing hyperparameter optimization with a Tuner:
- A set of hyperparameters you want to tune in a search space.
- A search algorithm to effectively optimize your parameters, optionally paired with a scheduler to stop searches early and speed up your experiments.
- The search space, search algorithm, scheduler, and Trainer are passed to a Tuner, which runs the hyperparameter tuning workload by evaluating multiple hyperparameter configurations in parallel.
- Each individual hyperparameter evaluation run is called a trial.
- The Tuner returns its results as a ResultGrid.
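To make these concepts concrete, here is a minimal pure-Python sketch of what a Tuner does conceptually: sample configurations from a search space, run one trial per sample, and collect the results. The names here (search_space, run_trial, tune) are illustrative stand-ins, not Ray APIs; the actual Ray Tune usage is shown in the next section.

```python
import random

# Hypothetical search space: each entry maps a hyperparameter name to a
# sampling function, mimicking tune.loguniform / tune.choice.
search_space = {
    "learning_rate": lambda: 10 ** random.uniform(-4, -1),  # log-uniform
    "num_layers": lambda: random.choice([2, 4, 8]),
}

def run_trial(config):
    # Stand-in for a real training run; returns a fake "loss" so the
    # example is self-contained.
    return {"config": config, "loss": config["learning_rate"] * config["num_layers"]}

def tune(search_space, num_samples):
    # Each sampled configuration is one trial; the list of trial
    # results plays the role of a ResultGrid.
    results = []
    for _ in range(num_samples):
        config = {name: sample() for name, sample in search_space.items()}
        results.append(run_trial(config))
    return results

results = tune(search_space, num_samples=4)
best = min(results, key=lambda r: r["loss"])  # pick the best trial
```

A real Tuner additionally parallelizes trials across a Ray cluster and can use a search algorithm (rather than random sampling) and a scheduler to stop unpromising trials early.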
Note
Tuners can also be used to launch hyperparameter tuning without using Ray Train. See the Ray Tune documentation for more guides and examples.
Basic usage
You can take an existing Trainer and simply pass it into a Tuner.
import ray
from ray import tune
from ray.tune import Tuner
from ray.train.xgboost import XGBoostTrainer
dataset = ray.data.read_csv(