Particle Swarm Optimization (PSO) with Python and MATLAB Code

Particle Swarm Optimization (PSO) is a heuristic optimization algorithm that simulates the cooperation and competition among individuals in a flock of birds or a school of fish. By repeatedly adjusting the velocity and position of each "particle", the swarm converges toward the global optimum. The core idea of PSO is that each particle updates its position and velocity based on both its own experience and the swarm's experience, in order to search for better solutions.

The steps of the PSO algorithm are:

  1. Initialize the swarm, including each particle's position and velocity.
  2. Evaluate each particle's fitness and update its personal best position.
  3. Update the global best position.
  4. Update each particle's velocity and position.
  5. Repeat steps 2-4 until the stopping criterion is met.
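Steps 2-4 boil down to the standard PSO update rule: a new velocity is blended from the old velocity, a pull toward the particle's personal best, and a pull toward the swarm's global best, and the position then moves by that velocity. A minimal sketch of a single update for one particle (the coefficients w, c1, c2 and all the sample values here are illustrative, not taken from the implementation below):

```python
import random

w, c1, c2 = 0.7, 1.5, 1.5        # inertia weight and acceleration coefficients (typical values)
position = [1.0, -2.0]           # one particle's current position
velocity = [0.0, 0.0]            # its current velocity
best_position = [0.5, -1.0]      # its personal best so far
global_best = [0.0, 0.0]         # the swarm's global best so far

for i in range(len(position)):
    r1, r2 = random.random(), random.random()   # fresh randomness per dimension
    velocity[i] = (w * velocity[i]
                   + c1 * r1 * (best_position[i] - position[i])
                   + c2 * r2 * (global_best[i] - position[i]))
    position[i] += velocity[i]
```

Both attraction terms point from the current position toward a known good point, so with both r1 and r2 positive the particle always drifts somewhere between its own memory and the swarm's memory.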

Below is a simple Python implementation of PSO:

import random

class Particle:
    def __init__(self, dim):
        self.position = [random.uniform(-5, 5) for _ in range(dim)]
        self.velocity = [random.uniform(-1, 1) for _ in range(dim)]
        # Copy the list: a plain assignment would alias self.position,
        # so later in-place position updates would silently overwrite
        # the stored personal best.
        self.best_position = self.position[:]
        self.best_fitness = float('inf')

def fitness_function(x):
    # Sphere function: global minimum 0 at the origin.
    return sum(i**2 for i in x)

def pso(dim, num_particles, num_iterations):
    global_best_position = [0.0] * dim
    global_best_fitness = float('inf')
    particles = [Particle(dim) for _ in range(num_particles)]

    for _ in range(num_iterations):
        # Evaluate fitness and update personal and global bests.
        for particle in particles:
            fitness = fitness_function(particle.position)
            if fitness < particle.best_fitness:
                particle.best_fitness = fitness
                particle.best_position = particle.position[:]

            if fitness < global_best_fitness:
                global_best_fitness = fitness
                global_best_position = particle.position[:]

        # Update each particle's velocity, then move it.
        for particle in particles:
            for i in range(dim):
                particle.velocity[i] = (0.5 * particle.velocity[i]
                                        + 2 * random.random() * (particle.best_position[i] - particle.position[i])
                                        + 2 * random.random() * (global_best_position[i] - particle.position[i]))
                particle.position[i] += particle.velocity[i]

    return global_best_position, global_best_fitness

if __name__ == '__main__':
    dim = 2
    num_particles = 50
    num_iterations = 100
    best_position, best_fitness = pso(dim, num_particles, num_iterations)
    print(f'Global best position: {best_position}')
    print(f'Global best fitness: {best_fitness}')

In this code, the Particle class represents a particle, holding its position, velocity, and personal best position. fitness_function is the objective function, and pso implements the main steps of the algorithm: initializing the swarm, updating the personal and global best positions, and updating each particle's velocity and position. Finally, the if __name__ == '__main__' block runs PSO and prints the global best position and fitness.
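One subtlety worth calling out, because it is a common bug in hand-rolled PSO implementations: assigning best_position = position does not save a snapshot, it merely aliases the same list, so the in-place position updates in the main loop would corrupt the stored best. A copy is required, which is why the code uses position[:]:

```python
position = [1.0, 2.0]
alias = position            # plain assignment: both names refer to ONE list
position[0] = 99.0
print(alias[0])             # 99.0 -- the "saved" best moved too

position = [1.0, 2.0]
snapshot = position[:]      # shallow copy: an independent list
position[0] = 99.0
print(snapshot[0])          # 1.0 -- the snapshot is preserved
```

list(position) or copy.copy(position) would work equally well; for flat lists of floats a shallow copy is sufficient.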

Below is a Python example that uses Particle Swarm Optimization (PSO) to tune SVM hyperparameters:

from sklearn.svm import SVC
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
import numpy as np
import pyswarms as ps

# Load the iris dataset
data = load_iris()
X = data['data']
y = data['target']

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Objective function for pyswarms: it receives the whole swarm as an
# array of shape (n_particles, 2) and must return one cost per particle.
def objective_function(particles):
    costs = []
    for C, gamma in particles:
        model = SVC(C=C, gamma=gamma)
        model.fit(X_train, y_train)
        y_pred = model.predict(X_test)
        costs.append(-accuracy_score(y_test, y_pred))  # minimize negative accuracy
    return np.array(costs)

# Define the bounds for C and gamma
max_bound = np.array([100.0, 1.0])
min_bound = np.array([0.1, 0.0001])
bounds = (min_bound, max_bound)

# Initialize the swarm and optimize SVM hyperparameters using PSO
options = {'c1': 0.5, 'c2': 0.3, 'w': 0.9}
optimizer = ps.single.GlobalBestPSO(n_particles=10, dimensions=2, options=options, bounds=bounds)
# optimize returns a (best_cost, best_position) pair
best_cost, best_params = optimizer.optimize(objective_function, iters=100)

# Print the optimized hyperparameters
best_C, best_gamma = best_params
print("Optimized SVM hyperparameters:")
print("C =", best_C)
print("gamma =", best_gamma)

# Train the SVM model with the optimized hyperparameters
best_model = SVC(C=best_C, gamma=best_gamma)
best_model.fit(X_train, y_train)
y_pred = best_model.predict(X_test)
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy with optimized hyperparameters:", accuracy)
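The shape convention above is the part that most often trips people up: as I understand the pyswarms API, the cost function is called once per iteration with the entire swarm, an array of shape (n_particles, dimensions), and must return a 1-D array of length n_particles. A NumPy-only sketch of that convention, using the sphere function so no pyswarms or scikit-learn install is needed:

```python
import numpy as np

def sphere_cost(swarm):
    # swarm has shape (n_particles, dimensions);
    # return one cost per particle, shape (n_particles,)
    return np.sum(swarm ** 2, axis=1)

swarm = np.array([[1.0, 2.0],
                  [0.0, 0.0],
                  [-3.0, 4.0]])
costs = sphere_cost(swarm)
print(costs)        # [ 5.  0. 25.]
```

A per-point function like the SVM objective can either be vectorized this way or wrapped in a Python loop over the rows, as the SVM example does; the loop is slower but simpler when each evaluation trains a model.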

A MATLAB version of the same workflow:

% Load Fisher's iris dataset (ships with the Statistics and Machine Learning Toolbox)
load fisheriris
X = meas;
y = species;

% Split the data into training and testing sets
rng(42); % Set random seed for reproducibility
cv = cvpartition(y, 'HoldOut', 0.2);
X_train = X(cv.training,:);
y_train = y(cv.training);
X_test = X(cv.test,:);
y_test = y(cv.test);

% Objective function handle for particleswarm
objective_function = @(params) svm_objective(params, X_train, y_train, X_test, y_test);

% Define the bounds for C and gamma
lb = [0.1, 0.0001];
ub = [100, 1];

% Optimize SVM hyperparameters using PSO
options = optimoptions(@particleswarm, 'SwarmSize', 10, 'MaxIterations', 100);
[best_params, ~] = particleswarm(objective_function, 2, lb, ub, options);

% Print the optimized hyperparameters
best_C = best_params(1);
best_gamma = best_params(2);
fprintf('Optimized SVM hyperparameters:\nC = %f\ngamma = %f\n', best_C, best_gamma);

% Train the SVM model with the optimized hyperparameters.
% fitcsvm only supports two classes, so the three-class iris problem
% uses fitcecoc with an RBF-kernel SVM template. MATLAB parameterizes
% the RBF kernel by 'KernelScale' s in exp(-||x-z||^2/s^2), so
% s = 1/sqrt(gamma) matches sklearn's gamma convention.
t = templateSVM('BoxConstraint', best_C, 'KernelFunction', 'rbf', 'KernelScale', 1/sqrt(best_gamma));
best_model = fitcecoc(X_train, y_train, 'Learners', t);
y_pred = predict(best_model, X_test);
accuracy = sum(strcmp(y_pred, y_test)) / numel(y_test); % labels are strings, so compare with strcmp
fprintf('Accuracy with optimized hyperparameters: %f\n', accuracy);

% Objective: negative test accuracy for a given (C, gamma), so that
% particleswarm minimizes it.
function err = svm_objective(params, X_train, y_train, X_test, y_test)
    t = templateSVM('BoxConstraint', params(1), 'KernelFunction', 'rbf', 'KernelScale', 1/sqrt(params(2)));
    model = fitcecoc(X_train, y_train, 'Learners', t);
    y_pred = predict(model, X_test);
    err = -sum(strcmp(y_pred, y_test)) / numel(y_test);
end

These two examples show how to tune SVM hyperparameters with particle swarm optimization in Python and in MATLAB. In MATLAB, the built-in particleswarm function performs the optimization.
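A detail worth checking when porting between the two stacks: sklearn's SVC uses gamma in exp(-gamma * ||x - z||^2), while MATLAB's 'KernelScale' s (to the best of my knowledge of the fitcsvm/templateSVM convention) gives exp(-||x - z||^2 / s^2), so the two agree when s = 1/sqrt(gamma). A quick numeric sanity check of that mapping:

```python
import math

gamma = 0.25                      # sklearn-style RBF parameter
s = 1 / math.sqrt(gamma)          # equivalent MATLAB 'KernelScale'
d2 = 3.7                          # an arbitrary squared distance ||x - z||^2

# Both parameterizations yield the same kernel value:
k_sklearn = math.exp(-gamma * d2)
k_matlab = math.exp(-d2 / s**2)
print(s)                          # 2.0
```

Forgetting this conversion (or using 1/sqrt(2*gamma), a convention from yet another parameterization) makes the "optimized" gamma found in one environment behave differently in the other.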
