libsvm Java multi-class classification: multi-class classification in libsvm

I'm working with libsvm and I need to implement multi-class classification using the one-versus-all approach.

How can I do it?

Does the 2011 version of libsvm use this approach?

I think that my question is not very clear.

If libsvm doesn't use one-versus-all automatically, I will train one SVM for every class; otherwise, how can I set this up through the parameters of the svmtrain function?

I have read the README of libsvm.

Solution

According to the official libsvm documentation (Section 7):

LIBSVM implements the "one-against-one" approach for multi-class classification. If k is the number of classes, then k(k-1)/2 classifiers are constructed and each one trains data from two classes.

In classification we use a voting strategy: each binary classification is considered to be a voting where votes can be cast for all data points x - in the end a point is designated to be in a class with the maximum number of votes.
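In other words, libsvm handles the multi-class case internally: you pass the labels (1..k) straight to svmtrain, and svmpredict resolves the vote among the k(k-1)/2 binary classifiers for you. A minimal sketch, assuming the libsvm MATLAB interface (the compiled svmtrain/svmpredict MEX files) is on your path; training and predicting on the same data here is only for illustration:

%# built-in one-against-one: just pass the multi-class labels directly
load fisheriris
[~,~,labels] = unique(species);                  %# labels: 1/2/3
data  = zscore(meas);                            %# scale features
model = svmtrain(labels, data, '-c 1 -g 0.2');   %# k(k-1)/2 binary SVMs trained internally
pred  = svmpredict(labels, data, model);         %# class chosen by majority vote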

In the one-against-all approach, we build as many binary classifiers as there are classes, each trained to separate one class from the rest. To predict a new instance, we choose the classifier with the largest decision function value.

As mentioned above, the idea is to train k SVM models, each separating one class from the rest. Once we have those binary classifiers, we use their probability outputs (the -b 1 option) to predict new instances by picking the class with the highest probability.

Consider the following example:

%# Fisher Iris dataset
load fisheriris
[~,~,labels] = unique(species);   %# labels: 1/2/3
data = zscore(meas);              %# scale features
numInst = size(data,1);
numLabels = max(labels);

%# split training/testing
idx = randperm(numInst);
numTrain = 100; numTest = numInst - numTrain;
trainData = data(idx(1:numTrain),:);  testData = data(idx(numTrain+1:end),:);
trainLabel = labels(idx(1:numTrain)); testLabel = labels(idx(numTrain+1:end));
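One small caveat with this split: randperm produces a different permutation on every run, so the accuracy below will vary between runs. If you want reproducible results, you can fix the random seed before calling randperm (an optional addition; the seed value is arbitrary):

%# optional: fix the seed so the train/test split is reproducible
rng(1);                    %# any fixed seed works
idx = randperm(numInst);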

Here is my implementation for the one-against-all approach for multi-class SVM:

%# train one-against-all models
model = cell(numLabels,1);
for k=1:numLabels
    model{k} = svmtrain(double(trainLabel==k), trainData, '-c 1 -g 0.2 -b 1');
end

%# get probability estimates of test instances using each model
prob = zeros(numTest,numLabels);
for k=1:numLabels
    [~,~,p] = svmpredict(double(testLabel==k), testData, model{k}, '-b 1');
    prob(:,k) = p(:,model{k}.Label==1);   %# probability of class==k
end

%# predict the class with the highest probability
[~,pred] = max(prob,[],2);
acc = sum(pred == testLabel) ./ numel(testLabel)   %# accuracy
C = confusionmat(testLabel, pred)                  %# confusion matrix
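If you prefer not to rely on the Platt-scaled probability estimates (-b 1 adds an internal cross-validation step during training), a common variant of one-against-all is to pick the class whose binary model returns the largest decision value instead. Below is a sketch that reuses the trainData/trainLabel/testData/testLabel variables prepared above; the sign handling plays the same role as the model{k}.Label==1 indexing in the probability version, and note that decision values from independently trained models are not calibrated against each other, so this is a heuristic:

%# one-against-all using decision values instead of probabilities
model = cell(numLabels,1);
for k=1:numLabels
    model{k} = svmtrain(double(trainLabel==k), trainData, '-c 1 -g 0.2');
end

dec = zeros(numTest,numLabels);
for k=1:numLabels
    [~,~,d] = svmpredict(double(testLabel==k), testData, model{k});
    if model{k}.Label(1) == 0   %# decision values are oriented toward the first training label
        d = -d;                 %# flip so that larger always means "more likely class k"
    end
    dec(:,k) = d;
end

%# predict the class with the largest decision value
[~,pred] = max(dec,[],2);
acc = sum(pred == testLabel) ./ numel(testLabel)   %# accuracy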
