Binary Classification with SVM in sklearn

Theory

SVMs with different kernel functions, implemented with sklearn

SVMs with different kernels are applied to a binary classification problem, and the classification results are visualized.

# -*- coding: utf-8 -*-
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.svm import SVC

def bc():
    data = pd.read_table(r'./data/testSet.txt', header=None, delim_whitespace=True)
    print(data.info())
    print(data.head())
    X_train = data[[0, 1]].values
    y_train = np.array(data[2])
    y_train = np.where(y_train == 1, 1, -1)

    x_min = X_train[:, 0].min()
    x_max = X_train[:, 0].max()
    y_min = X_train[:, 1].min()
    y_max = X_train[:, 1].max()
    '''
    linear svm, poly svm, rbf svm
    '''
    plt.figure(figsize=(15, 15))
    for fig_num, kernel in enumerate(('linear', 'poly', 'rbf')):
        svm_ = SVC(kernel=kernel)
        svm_.fit(X_train, y_train)
        # support vectors
        # plt.figure(fig_num)
        # plt.clf()
        plt.subplot(222 + fig_num)
        plt.scatter(x = X_train[y_train == 1, 0], y = X_train[y_train == 1, 1],
                    s = 30, marker = 'o', color = 'yellow', zorder = 10)
        plt.scatter(x = X_train[y_train == -1, 0], y = X_train[y_train == -1, 1],
                    s = 30, marker = 'x', color = 'blue', zorder = 10)
        plt.scatter(x = [x[0] for x in svm_.support_vectors_], y = [x[1] for x in svm_.support_vectors_], s = 80, facecolors='none', zorder = 10)
        print(len(svm_.support_vectors_))
        plt.title(kernel)
        plt.xlabel('support vectors ' + str(len(svm_.support_vectors_)))
        plt.xticks([])
        plt.yticks([])
        plt.xlim(x_min, x_max)
        plt.ylim(y_min, y_max)
        XX, YY = np.mgrid[x_min:x_max:200j, y_min:y_max:200j]
        Z = svm_.decision_function(np.c_[XX.ravel(), YY.ravel()])
        Z = Z.reshape(XX.shape)
        plt.pcolormesh(XX, YY, Z > 0, cmap=plt.cm.Paired)
        plt.contour(XX, YY, Z, colors=['black', 'k', 'white'], linestyles=['--', '-', '--'], levels=[-.5, 0, .5])

    # plot data
    plt.subplot(221)
    plt.title('data')
    plt.scatter(x=X_train[y_train == 1, 0], y=X_train[y_train == 1, 1],
                s=30, marker='o', color='red', zorder=10)
    plt.scatter(x=X_train[y_train == -1, 0], y=X_train[y_train == -1, 1],
                s=30, marker='x', color='blue', zorder=10)
    plt.xticks([])
    plt.yticks([])
    plt.xlim(x_min, x_max)
    plt.ylim(y_min, y_max)

    plt.savefig(r'./data/svm.jpg')  # one figure containing all kernels
    plt.show()

if __name__ == '__main__':
    bc()

Results

The support vectors are circled with large hollow markers, and the number of support vectors is shown below each subplot.
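As a quick sanity check, the per-class support-vector counts are also exposed via `n_support_`, and their sum always equals `len(support_vectors_)`. A minimal sketch on synthetic data (hypothetical, not the blog's dataset):

```python
import numpy as np
from sklearn.svm import SVC

# two well-separated Gaussian blobs, labeled -1 and +1
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(20, 2) - 2, rng.randn(20, 2) + 2])
y = np.hstack([-np.ones(20), np.ones(20)])

clf = SVC(kernel='linear').fit(X, y)
print(clf.n_support_)  # number of support vectors for each class
assert clf.n_support_.sum() == len(clf.support_vectors_)
```

This is the same count that the script prints and writes into each subplot's xlabel.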


[Figure: the raw data plus the decision regions and boundaries for the linear, poly, and rbf kernels]

Experimental data

-0.017612   14.053064   0
-1.395634   4.662541    1
-0.752157   6.538620    0
-1.322371   7.152853    0
0.423363    11.054677   0
0.406704    7.067335    1
0.667394    12.741452   0
-2.460150   6.866805    1
0.569411    9.548755    0
-0.026632   10.427743   0
0.850433    6.920334    1
1.347183    13.175500   0
1.176813    3.167020    1
-1.781871   9.097953    0
-0.566606   5.749003    1
0.931635    1.589505    1
-0.024205   6.151823    1
-0.036453   2.690988    1
-0.196949   0.444165    1
1.014459    5.754399    1
1.985298    3.230619    1
-1.693453   -0.557540   1
-0.576525   11.778922   0
-0.346811   -1.678730   1
-2.124484   2.672471    1
1.217916    9.597015    0
-0.733928   9.098687    0
-3.642001   -1.618087   1
0.315985    3.523953    1
1.416614    9.619232    0
-0.386323   3.989286    1
0.556921    8.294984    1
1.224863    11.587360   0
-1.347803   -2.406051   1
1.196604    4.951851    1
0.275221    9.543647    0
0.470575    9.332488    0
-1.889567   9.542662    0
-1.527893   12.150579   0
-1.185247   11.309318   0
-0.445678   3.297303    1
1.042222    6.105155    1
-0.618787   10.320986   0
1.152083    0.548467    1
0.828534    2.676045    1
-1.237728   10.549033   0
-0.683565   -2.166125   1
0.229456    5.921938    1
-0.959885   11.555336   0
0.492911    10.993324   0
0.184992    8.721488    0
-0.355715   10.325976   0
-0.397822   8.058397    0
0.824839    13.730343   0
1.507278    5.027866    1
0.099671    6.835839    1
-0.344008   10.717485   0
1.785928    7.718645    1
-0.918801   11.560217   0
-0.364009   4.747300    1
-0.841722   4.119083    1
0.490426    1.960539    1
-0.007194   9.075792    0
0.356107    12.447863   0
0.342578    12.281162   0
-0.810823   -1.466018   1
2.530777    6.476801    1
1.296683    11.607559   0
0.475487    12.040035   0
-0.783277   11.009725   0
0.074798    11.023650   0
-1.337472   0.468339    1
-0.102781   13.763651   0
-0.147324   2.874846    1
0.518389    9.887035    0
1.015399    7.571882    0
-1.658086   -0.027255   1
1.319944    2.171228    1
2.056216    5.019981    1
-0.851633   4.375691    1
-1.510047   6.061992    0
-1.076637   -3.181888   1
1.821096    10.283990   0
3.010150    8.401766    1
-1.099458   1.688274    1
-0.834872   -1.733869   1
-0.846637   3.849075    1
1.400102    12.628781   0
1.752842    5.468166    1
0.078557    0.059736    1
0.089392    -0.715300   1
1.825662    12.693808   0
0.197445    9.744638    0
0.126117    0.922311    1
-0.679797   1.220530    1
0.677983    2.556666    1
0.761349    10.693862   0
-2.168791   0.143632    1
1.388610    9.341997    0
0.317029    14.739025   0
In sklearn, binary classification with an SVM follows these steps:

1. Import the required module:
```
from sklearn import svm
```
2. Create an SVM classifier instance:
```
clf = svm.SVC()
```
3. Prepare the training data and labels:
```
X_train = [[0, 0], [1, 1]]
y_train = [0, 1]
```
4. Fit the classifier on the training data:
```
clf.fit(X_train, y_train)
```
5. Predict on new data:
```
X_test = [[2, 2], [-1, -2]]
y_pred = clf.predict(X_test)
```
Those are the basic steps. Parameters can be tuned for the problem at hand, e.g. `clf.set_params(kernel='linear')` switches to a linear kernel. Note that the error `ValueError: n_classes * n_clusters_per_class must be smaller or equal 2 ** n_informative` comes from `make_classification` when generating synthetic data, not from the SVM itself. `SVC` also supports a sigmoid kernel via `kernel='sigmoid'`.

Reference: [SVM二分类与多分类尝试](https://blog.csdn.net/magicboom/article/details/88978198)
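The steps above can be sketched end-to-end as one runnable snippet (same toy data as the steps, not the blog's dataset):

```python
from sklearn import svm

# steps 2-3: build the classifier and prepare toy training data
clf = svm.SVC()
clf.set_params(kernel='linear')  # switch to a linear kernel
X_train = [[0, 0], [1, 1]]
y_train = [0, 1]

# step 4: fit on the training data
clf.fit(X_train, y_train)

# step 5: predict on new points
print(clf.predict([[2, 2], [-1, -2]]))  # -> [1 0]
```

With only these two training points, the linear boundary is the perpendicular bisector between them, so `[2, 2]` falls on the class-1 side and `[-1, -2]` on the class-0 side.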