OpenCV Machine Learning ---- Support Vector Machine (SVM)

This article presents a Java example of training and prediction with a support vector machine (SVM) in OpenCV. Using a small height/weight dataset to distinguish males from females, it shows how to create an SVM model, set its parameters, train it, and run predictions.


OpenCV3 Java machine learning usage roundup

import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.TermCriteria;
import org.opencv.ml.Ml;
import org.opencv.ml.SVM;
import org.opencv.ml.TrainData;

// Note: the class must not be named SVM, or it would shadow org.opencv.ml.SVM
public class SVMDemo {

    static {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    }

    public static void run() {
        // Training data: two dimensions per sample, height and weight
        float[] trainingData = { 186, 80, 185, 81, 160, 50, 161, 48 };
        // Label data: the first two samples are male (0), the last two female (1).
        // This series of articles uses several ML algorithms whose label inputs
        // differ slightly, so three label variants are shown; SVM uses labels2.
        float[] labels = { 0f, 0f, 1f, 1f };
        int[] labels2 = { 0, 0, 1, 1 };
        float[] labels3 = { 0, 0, 1, 1 };
        // Test data: first male, then female
        float[] test = { 184, 79, 159, 50 };

        Mat trainingDataMat = new Mat(4, 2, CvType.CV_32FC1);
        trainingDataMat.put(0, 0, trainingData);

        // SVM classification requires integer (CV_32SC1) class labels
        Mat labelsMat2 = new Mat(4, 1, CvType.CV_32SC1);
        labelsMat2.put(0, 0, labels2);

        Mat sampleMat = new Mat(2, 2, CvType.CV_32FC1);
        sampleMat.put(0, 0, test);

        mySvm(trainingDataMat, labelsMat2, sampleMat);
    }

    // SVM: support vector machine
    public static Mat mySvm(Mat trainingData, Mat labels, Mat testData) {

        SVM svm = SVM.create();
        // Configure the SVM trainer parameters
        TermCriteria criteria = new TermCriteria(TermCriteria.EPS + TermCriteria.MAX_ITER, 1000, 0);
        svm.setTermCriteria(criteria); // iteration termination criteria
        svm.setKernel(SVM.LINEAR);     // use one of the predefined kernels
        svm.setType(SVM.C_SVC);        // SVM type; the default is SVM.C_SVC
        svm.setGamma(0.5);             // kernel parameter (unused by the linear kernel)
        svm.setNu(0.5);                // optimization-problem parameter nu
        svm.setC(1);                   // optimization-problem parameter C

        TrainData td = TrainData.create(trainingData, Ml.ROW_SAMPLE, labels); // wrap the training data
        boolean success = svm.train(td.getSamples(), Ml.ROW_SAMPLE, td.getResponses()); // train the statistical model
        System.out.println("Svm training result: " + success);
        // svm.save(filename); // save the model

        // Predict on the test data
        Mat responseMat = new Mat();
        svm.predict(testData, responseMat, 0);
        System.out.println("SVM responseMat:\n" + responseMat.dump());
        for (int i = 0; i < responseMat.height(); i++) {
            if (responseMat.get(i, 0)[0] == 0)
                System.out.println("Boy\n");
            if (responseMat.get(i, 0)[0] == 1)
                System.out.println("Girl\n");
        }
        return responseMat;
    }

    public static void main(String[] args) {
        run();
    }
}

Result:

Svm training result: true
SVM responseMat:
[0;
 1]
Boy

Girl
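For intuition about what the trained linear SVM is doing, its decision rule has the form sign(w·x + b). The sketch below uses hand-picked, hypothetical values of w and b (not weights extracted from the OpenCV model above; they merely separate the two clusters in this toy dataset) to classify the same two test points:

```java
public class LinearDecisionDemo {

    // Decision rule of a generic linear classifier: sign(w·x + b)
    public static String predict(double[] w, double b, double[] x) {
        double score = w[0] * x[0] + w[1] * x[1] + b;
        return score >= 0 ? "Boy" : "Girl";
    }

    public static void main(String[] args) {
        // Hypothetical, hand-picked weights and bias -- NOT the trained
        // SVM solution; chosen so the boundary sits between the clusters.
        double[] w = { 1.0, 1.0 };
        double b = -236.0;
        System.out.println(predict(w, b, new double[] { 184, 79 })); // Boy
        System.out.println(predict(w, b, new double[] { 159, 50 })); // Girl
    }
}
```

The real SVM picks w and b to maximize the margin between the two classes, but the form of the decision function is the same.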

Methods:

SVM.create()

A static method that creates an empty model. Train the model with the train method.

class TermCriteria: officially described as the class defining the termination criteria for iterative algorithms.

TermCriteria(int type, int maxCount, double epsilon)

  • type - the type of termination criteria: COUNT, EPS, or COUNT + EPS.
  • maxCount - the maximum number of iterations/elements
  • epsilon - the desired accuracy

Enumerator:
  • TermCriteria.COUNT - the maximum number of iterations or elements to compute
  • TermCriteria.MAX_ITER - the same as COUNT
  • TermCriteria.EPS - the desired accuracy or parameter change at which the iterative algorithm stops

Default: TermCriteria(TermCriteria.MAX_ITER + TermCriteria.EPS, 1000, FLT_EPSILON)

Note: the API documentation gives FLT_EPSILON as the default epsilon, but that constant exists only in C/C++; it is the smallest precision difference a single-precision float can distinguish.
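Java has no FLT_EPSILON constant, but the same value (2⁻²³ ≈ 1.1920929e-7, the gap between 1.0f and the next representable float) can be obtained with Math.ulp, as this small check illustrates:

```java
public class FltEpsilonDemo {
    public static void main(String[] args) {
        // FLT_EPSILON is the gap between 1.0f and the next larger float: 2^-23
        float fltEpsilon = Math.ulp(1.0f);
        System.out.println(fltEpsilon); // 1.1920929E-7
    }
}
```

This value could be passed as the epsilon argument of TermCriteria to mirror the C++ default.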

void setKernel(int kernelType)

Initializes the SVM with one of the predefined kernels.

Enumerator:
  • CUSTOM - returned by SVM::getKernelType when a custom kernel has been set.
  • LINEAR - linear kernel. No mapping is done; linear discrimination (or regression) is performed in the original feature space. It is the fastest option. K(x_i, x_j) = x_i^T x_j.
  • POLY - polynomial kernel: K(x_i, x_j) = (γ x_i^T x_j + coef0)^degree, γ > 0.
  • RBF - radial basis function (RBF), a good choice in most cases: K(x_i, x_j) = e^(−γ ||x_i − x_j||²), γ > 0.
  • SIGMOID - sigmoid kernel: K(x_i, x_j) = tanh(γ x_i^T x_j + coef0).
  • CHI2 - exponential chi-squared kernel, similar to the RBF kernel: K(x_i, x_j) = e^(−γ χ²(x_i, x_j)), where χ²(x_i, x_j) = (x_i − x_j)² / (x_i + x_j), γ > 0.
  • INTER - histogram intersection kernel; a fast kernel. K(x_i, x_j) = min(x_i, x_j).
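To make the kernel formulas above concrete, here is a minimal plain-Java sketch (independent of OpenCV) that evaluates three of them on a pair of toy 2-D vectors; the vectors and gamma value are arbitrary illustrative choices:

```java
public class KernelDemo {

    // Linear kernel: K(xi, xj) = xi^T xj
    public static double linear(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += a[i] * b[i];
        return s;
    }

    // RBF kernel: K(xi, xj) = exp(-gamma * ||xi - xj||^2)
    public static double rbf(double[] a, double[] b, double gamma) {
        double d2 = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            d2 += d * d;
        }
        return Math.exp(-gamma * d2);
    }

    // Histogram intersection kernel: K(xi, xj) = sum of min(xi, xj)
    public static double inter(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += Math.min(a[i], b[i]);
        return s;
    }

    public static void main(String[] args) {
        double[] x = { 1, 2 }, y = { 3, 4 };
        System.out.println(linear(x, y));   // 1*3 + 2*4 = 11.0
        System.out.println(rbf(x, y, 0.5)); // exp(-0.5 * 8) = exp(-4)
        System.out.println(inter(x, y));    // min(1,3) + min(2,4) = 3.0
    }
}
```

A kernel is just a similarity score between two samples; the SVM never needs the mapped feature vectors themselves, only these pairwise values.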

void setType(int val)

The type of the SVM; the default is SVM.C_SVC.

Enumerator:
  • C_SVC - C-Support Vector Classification. n-class classification (n ≥ 2); allows imperfect separation of classes with the penalty multiplier C for outliers.
  • NU_SVC - ν-Support Vector Classification. n-class classification with possible imperfect separation. The parameter ν (in the range 0..1; the larger the value, the smoother the decision boundary) is used instead of C.
  • ONE_CLASS - Distribution Estimation (One-class SVM). All the training data are from the same class; the SVM builds a boundary that separates that class from the rest of the feature space.
  • EPS_SVR - ϵ-Support Vector Regression. The distance between feature vectors from the training set and the fitted hyper-plane must be less than p. For outliers the penalty multiplier C is used.
  • NU_SVR - ν-Support Vector Regression. ν is used instead of p. See [31] for details.