scikit-learn hyperparameter tuning methods (automatically finding the best model parameters):

- Grid Search:
  - Principle: Grid search exhaustively evaluates every combination in a predefined parameter grid and picks the combination with the best performance.
  - Implementation: use the `GridSearchCV` class.
  - Example code:

```python
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Candidate values for each hyperparameter
param_grid = {'C': [0.1, 1, 10], 'kernel': ['linear', 'rbf']}

# Exhaustively evaluate every combination with 5-fold cross-validation
grid_search = GridSearchCV(SVC(), param_grid, cv=5)
grid_search.fit(X_train, y_train)
print(grid_search.best_params_)
```
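  - Usage note: after fitting, `GridSearchCV` also exposes the best cross-validation score and a refitted best model. A minimal sketch, assuming a held-out `X_test`/`y_test` split (not shown above) is available for a final check:

```python
# Best mean cross-validation score found during the search
print(grid_search.best_score_)

# best_estimator_ is refitted on the whole training set with the best parameters;
# X_test / y_test are a hypothetical held-out split used only for illustration
print(grid_search.best_estimator_.score(X_test, y_test))
```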
- Randomized Search:
  - Principle: Randomized search samples parameter combinations at random from a predefined parameter space and evaluates them. It is usually faster than grid search, especially when the parameter space is large.
  - Implementation: use the `RandomizedSearchCV` class.
  - Example code:

```python
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC
from scipy.stats import uniform

# 'C' is drawn from a continuous uniform distribution (loc=0.1, scale=10),
# 'kernel' is drawn uniformly from the listed choices
param_dist = {'C': uniform(0.1, 10), 'kernel': ['linear', 'rbf']}

# Evaluate 10 randomly sampled combinations with 5-fold cross-validation
random_search = RandomizedSearchCV(SVC(), param_dist, n_iter=10, cv=5)
random_search.fit(X_train, y_train)
print(random_search.best_params_)
```
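  - Usage note: for a scale parameter such as `C`, sampling on a log scale usually covers several orders of magnitude more evenly than a plain uniform distribution. A minimal sketch, assuming SciPy's `loguniform` distribution (SciPy 1.4+) is available:

```python
from scipy.stats import loguniform
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

# Draw C log-uniformly between 1e-2 and 1e2
param_dist_log = {'C': loguniform(1e-2, 1e2), 'kernel': ['linear', 'rbf']}

random_search_log = RandomizedSearchCV(SVC(), param_dist_log, n_iter=10, cv=5, random_state=42)
random_search_log.fit(X_train, y_train)
print(random_search_log.best_params_)
```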
- Bayesian Optimization:
  - Principle: Bayesian optimization builds a surrogate model (such as a Gaussian process) to predict how different parameter combinations will perform, and evaluates the most promising combinations next.
  - Implementation: use the `BayesSearchCV` class from the `skopt` (scikit-optimize) library.
  - Example code:

```python
from skopt import BayesSearchCV
from sklearn.svm import SVC

# A (low, high) tuple defines a continuous range, a list defines categorical choices
param_space = {'C': (0.1, 10.0), 'kernel': ['linear', 'rbf']}

# Run 10 iterations of Bayesian optimization with 5-fold cross-validation
bayes_search = BayesSearchCV(SVC(), param_space, n_iter=10, cv=5)
bayes_search.fit(X_train, y_train)
print(bayes_search.best_params_)
```
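  - Usage note: the search space can also be declared with skopt's dimension objects, which make the sampling prior explicit. A minimal sketch, assuming `skopt.space.Real` and `skopt.space.Categorical` are available in the installed scikit-optimize version:

```python
from skopt import BayesSearchCV
from skopt.space import Real, Categorical
from sklearn.svm import SVC

# Search C on a log-uniform prior; kernel is an explicit categorical dimension
param_space_explicit = {
    'C': Real(0.1, 10.0, prior='log-uniform'),
    'kernel': Categorical(['linear', 'rbf']),
}

bayes_search_explicit = BayesSearchCV(SVC(), param_space_explicit, n_iter=10, cv=5)
bayes_search_explicit.fit(X_train, y_train)
print(bayes_search_explicit.best_params_)
```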
- Genetic Algorithms:
  - Principle: Genetic algorithms mimic natural selection and inheritance, searching the parameter space for good solutions through operations such as crossover and mutation.
  - Implementation: use the `deap` library or another genetic-algorithm library.
  - Example code:

```python
import random

from deap import base, creator, tools, algorithms
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# An individual is a list [C, kernel]; the fitness is the mean 5-fold CV score
def eval_params(individual):
    C, kernel = individual
    score = cross_val_score(SVC(C=C, kernel=kernel), X_train, y_train, cv=5).mean()
    return (score,)

# Mutation: jitter C with Gaussian noise (kept positive), occasionally re-draw the kernel
def mutate_params(individual, indpb=0.2):
    if random.random() < indpb:
        individual[0] = max(0.01, individual[0] + random.gauss(0, 1))
    if random.random() < indpb:
        individual[1] = random.choice(['linear', 'rbf'])
    return (individual,)

creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("attr_C", random.uniform, 0.1, 10)
toolbox.register("attr_kernel", random.choice, ['linear', 'rbf'])
toolbox.register("individual", tools.initCycle, creator.Individual,
                 (toolbox.attr_C, toolbox.attr_kernel), n=1)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", eval_params)
toolbox.register("mate", tools.cxTwoPoint)
toolbox.register("mutate", mutate_params)
toolbox.register("select", tools.selTournament, tournsize=3)

# Evolve a population of 10 individuals for 10 generations
population = toolbox.population(n=10)
algorithms.eaSimple(population, toolbox, cxpb=0.5, mutpb=0.2, ngen=10)

best_individual = tools.selBest(population, 1)[0]
print(best_individual)
```
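  - Usage note: the best individual is just a `[C, kernel]` list, so it can be turned back into a model and refitted. A minimal sketch, reusing the training data from the example above:

```python
# Refit an SVC with the best hyperparameters found by the genetic algorithm
best_C, best_kernel = best_individual
best_model = SVC(C=best_C, kernel=best_kernel).fit(X_train, y_train)
print(best_model)
```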