Applying SVM

SVM stands for Support Vector Machine. There are plenty of tutorials on the SVM algorithm online, so I won't repeat the theory here and will jump straight into applying it:

step1: In a terminal, run pip install scikit-learn (a machine-learning library for Python; note that the package is named scikit-learn even though it is imported as sklearn) and also pip install numpy; otherwise the code below will fail with a "No module named ..." error;
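A quick sanity check that the installation worked (just a sketch; it only imports the two libraries the script needs and prints their versions):

import sklearn
import numpy

# If either import fails, the corresponding pip install did not succeed.
print(sklearn.__version__)
print(numpy.__version__)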

step2: Prepare the data. I found a dataset online to use for testing, 200 samples in total, and split it into two halves: the first 100 samples serve as training data and the last 100 as test data;
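Taking the first 100 rows for training only works because this file appears to mix the two classes throughout; if the file were sorted by label, a random split would be needed. As an aside, scikit-learn can do such a split itself; a minimal sketch (the X and y arrays here are hypothetical stand-ins for the loaded data):

import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for the 200 loaded samples: 2 features, labels +1/-1.
X = np.random.randn(200, 2)
y = np.where(X[:, 0] > 0, 1.0, -1.0)

# Shuffle and split into 100 training / 100 test samples.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=42)
print(X_train.shape, X_test.shape)  # (100, 2) (100, 2)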

step3: Write the code:

# coding: utf-8
from sklearn import svm
import numpy as np

## step 1: load data
print("step 1: load data...")
dataSet = []
labels = []
fileIn = open('C:\\Users\\Administrator\\Desktop\\Code\\SVM\\data.txt')
for line in fileIn.readlines():
    lineArr = line.strip().split(',')
    dataSet.append([float(lineArr[0]), float(lineArr[1])])  # two features
    labels.append(float(lineArr[2]))  # class label: +1.0 or -1.0
fileIn.close()
dataSet = np.array(dataSet)  # convert the nested list to a NumPy array
labels = np.array(labels)
train_x = dataSet[0:100, :]  # first 100 samples for training
train_y = labels[0:100]
test_x = dataSet[100:200, :]  # last 100 samples for testing
test_y = labels[100:200]

## step 2: train the classifier
clf = svm.SVC()
clf.fit(train_x, train_y)

## step 3: classify the test data
for i in range(100):
    print('%d: %s' % (i, clf.predict(test_x[i].reshape(1, -1))))

## step 4: accuracy
count = 0
for j in range(100):
    if test_y[j] == clf.predict(test_x[j].reshape(1, -1))[0]:
        count = count + 1
print('accuracy: %d%%' % count)  # 100 test samples, so count equals the percentage
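As a side note, scikit-learn can score the whole test set in one call; continuing from the script above (with clf, test_x and test_y as defined there), steps 3 and 4 collapse to:

# Batch prediction for all 100 test samples at once.
pred = clf.predict(test_x)

# score() returns the fraction of correctly classified samples.
acc = clf.score(test_x, test_y)
print('accuracy: %.0f%%' % (100 * acc))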

Data:

-0.214824,0.662756,-1.000000
-0.061569,-0.091875,1.000000
0.406933,0.648055,-1.000000
0.223650,0.130142,1.000000
0.231317,0.766906,-1.000000
-0.748800,-0.531637,-1.000000
-0.557789,0.375797,-1.000000
0.207123,-0.019463,1.000000
0.286462,0.719470,-1.000000
0.195300,-0.179039,1.000000
-0.152696,-0.153030,1.000000
0.384471,0.653336,-1.000000
-0.117280,-0.153217,1.000000
-0.238076,0.000583,1.000000
-0.413576,0.145681,1.000000
0.490767,-0.680029,-1.000000
0.199894,-0.199381,1.000000
-0.356048,0.537960,-1.000000
-0.392868,-0.125261,1.000000
0.353588,-0.070617,1.000000
0.020984,0.925720,-1.000000
-0.475167,-0.346247,-1.000000
0.074952,0.042783,1.000000
0.394164,-0.058217,1.000000
0.663418,0.436525,-1.000000
0.402158,0.577744,-1.000000
-0.449349,-0.038074,1.000000
0.619080,-0.088188,-1.000000
0.268066,-0.071621,1.000000
-0.015165,0.359326,1.000000
0.539368,-0.374972,-1.000000
-0.319153,0.629673,-1.000000
0.694424,0.641180,-1.000000
0.079522,0.193198,1.000000
0.253289,-0.285861,1.000000
-0.035558,-0.010086,1.000000
-0.403483,0.474466,-1.000000
-0.034312,0.995685,-1.000000
-0.590657,0.438051,-1.000000
-0.098871,-0.023953,1.000000
-0.250001,0.141621,1.000000
-0.012998,0.525985,-1.000000
0.153738,0.491531,-1.000000
0.388215,-0.656567,-1.000000
0.049008,0.013499,1.000000
0.068286,0.392741,1.000000
0.747800,-0.066630,-1.000000
0.004621,-0.042932,1.000000
-0.701600,0.190983,-1.000000
0.055413,-0.024380,1.000000
0.035398,-0.333682,1.000000
0.211795,0.024689,1.000000
-0.045677,0.172907,1.000000
0.595222,0.209570,-1.000000
0.229465,0.250409,1.000000
-0.089293,0.068198,1.000000
0.384300,-0.176570,1.000000
0.834912,-0.110321,-1.000000
-0.307768,0.503038,-1.000000
-0.777063,-0.348066,-1.000000
0.017390,0.152441,1.000000
-0.293382,-0.139778,1.000000
-0.203272,0.286855,1.000000
0.957812,-0.152444,-1.000000
0.004609,-0.070617,1.000000
-0.755431,0.096711,-1.000000
-0.526487,0.547282,-1.000000
-0.246873,0.833713,-1.000000
0.185639,-0.066162,1.000000
0.851934,0.456603,-1.000000
-0.827912,0.117122,-1.000000
0.233512,-0.106274,1.000000
0.583671,-0.709033,-1.000000
-0.487023,0.625140,-1.000000
-0.448939,0.176725,1.000000
0.155907,-0.166371,1.000000
0.334204,0.381237,-1.000000
0.081536,-0.106212,1.000000
0.227222,0.527437,-1.000000
0.759290,0.330720,-1.000000
0.204177,-0.023516,1.000000
0.577939,0.403784,-1.000000
-0.568534,0.442948,-1.000000
-0.011520,0.021165,1.000000
0.875720,0.422476,-1.000000
0.297885,-0.632874,-1.000000
-0.015821,0.031226,1.000000
0.541359,-0.205969,-1.000000
-0.689946,-0.508674,-1.000000
-0.343049,0.841653,-1.000000
0.523902,-0.436156,-1.000000
0.249281,-0.711840,-1.000000
0.193449,0.574598,-1.000000
-0.257542,-0.753885,-1.000000
-0.021605,0.158080,1.000000
0.601559,-0.727041,-1.000000
-0.791603,0.095651,-1.000000
-0.908298,-0.053376,-1.000000
0.122020,0.850966,-1.000000
-0.725568,-0.292022,-1.000000
0.676771,-0.486687,-1.000000
0.008473,0.186070,1.000000
-0.727789,0.594062,-1.000000
0.112367,0.287852,1.000000
0.383633,-0.038068,1.000000
-0.927138,-0.032633,-1.000000
-0.842803,-0.423115,-1.000000
-0.003677,-0.367338,1.000000
0.443211,-0.698469,-1.000000
-0.473835,0.005233,1.000000
0.616741,0.590841,-1.000000
0.557463,-0.373461,-1.000000
-0.498535,-0.223231,-1.000000
-0.246744,0.276413,1.000000
-0.761980,-0.244188,-1.000000
0.641594,-0.479861,-1.000000
-0.659140,0.529830,-1.000000
-0.054873,-0.238900,1.000000
-0.089644,-0.244683,1.000000
-0.431576,-0.481538,-1.000000
-0.099535,0.728679,-1.000000
-0.188428,0.156443,1.000000
0.267051,0.318101,1.000000
0.222114,-0.528887,-1.000000
0.030369,0.113317,1.000000
0.392321,0.026089,1.000000
0.298871,-0.915427,-1.000000
-0.034581,-0.133887,1.000000
0.405956,0.206980,1.000000
0.144902,-0.605762,-1.000000
0.274362,-0.401338,1.000000
0.397998,-0.780144,-1.000000
0.037863,0.155137,1.000000
-0.010363,-0.004170,1.000000
0.506519,0.486619,-1.000000
0.000082,-0.020625,1.000000
0.057761,-0.155140,1.000000
0.027748,-0.553763,-1.000000
-0.413363,-0.746830,-1.000000
0.081500,-0.014264,1.000000
0.047137,-0.491271,1.000000
-0.267459,0.024770,1.000000
-0.148288,-0.532471,-1.000000
-0.225559,-0.201622,1.000000
0.772360,-0.518986,-1.000000
-0.440670,0.688739,-1.000000
0.329064,-0.095349,1.000000
0.970170,-0.010671,-1.000000
-0.689447,-0.318722,-1.000000
-0.465493,-0.227468,-1.000000
-0.049370,0.405711,1.000000
-0.166117,0.274807,1.000000
0.054483,0.012643,1.000000
0.021389,0.076125,1.000000
-0.104404,-0.914042,-1.000000
0.294487,0.440886,-1.000000
0.107915,-0.493703,-1.000000
0.076311,0.438860,1.000000
0.370593,-0.728737,-1.000000
0.409890,0.306851,-1.000000
0.285445,0.474399,-1.000000
-0.870134,-0.161685,-1.000000
-0.654144,-0.675129,-1.000000
0.285278,-0.767310,-1.000000
0.049548,-0.000907,1.000000
0.030014,-0.093265,1.000000
-0.128859,0.278865,1.000000
0.307463,0.085667,1.000000
0.023440,0.298638,1.000000
0.053920,0.235344,1.000000
0.059675,0.533339,-1.000000
0.817125,0.016536,-1.000000
-0.108771,0.477254,1.000000
-0.118106,0.017284,1.000000
0.288339,0.195457,1.000000
0.567309,-0.200203,-1.000000
-0.202446,0.409387,1.000000
-0.330769,-0.240797,1.000000
-0.422377,0.480683,-1.000000
-0.295269,0.326017,1.000000
0.261132,0.046478,1.000000
-0.492244,-0.319998,-1.000000
-0.384419,0.099170,1.000000
0.101882,-0.781145,-1.000000
0.234592,-0.383446,1.000000
-0.020478,-0.901833,-1.000000
0.328449,0.186633,1.000000
-0.150059,-0.409158,1.000000
-0.155876,-0.843413,-1.000000
-0.098134,-0.136786,1.000000
0.110575,-0.197205,1.000000
0.219021,0.054347,1.000000
0.030152,0.251682,1.000000
0.033447,-0.122824,1.000000
-0.686225,-0.020779,-1.000000
-0.911211,-0.262011,-1.000000
0.572557,0.377526,-1.000000
-0.073647,-0.519163,-1.000000
-0.281830,-0.797236,-1.000000
-0.555263,0.126232,-1.000000

Run result: (screenshot of the console output omitted)
Code explanation:

The first 100 samples are used to train the classifier; the trained model then classifies the remaining 100 samples, and the predicted labels are compared with the known labels to compute the accuracy.

The accuracy depends not only on the algorithm but also on the chosen features and the training data. For example, to tell an elephant from an ant, body size is a feature with excellent separation, so the learned classifier will be highly accurate; to tell a cheetah from a tiger, body size is a poor feature, the learning will go badly, and the accuracy will be low.
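A toy illustration of this point (synthetic data, not the dataset above): the same SVC trained on a single "body size" feature scores near 100% when the class means are far apart, and near chance when they overlap.

import numpy as np
from sklearn import svm

rng = np.random.default_rng(0)

def accuracy_for_gap(gap):
    # One "body size" feature per sample; the two class means differ by `gap`.
    X = np.concatenate([rng.normal(0.0, 1.0, 100),
                        rng.normal(gap, 1.0, 100)]).reshape(-1, 1)
    y = np.array([-1.0] * 100 + [1.0] * 100)
    idx = rng.permutation(200)  # shuffle before splitting
    X, y = X[idx], y[idx]
    clf = svm.SVC().fit(X[:100], y[:100])
    return clf.score(X[100:], y[100:])

print('well separated (elephant vs. ant): %.0f%%' % (100 * accuracy_for_gap(8.0)))
print('overlapping (cheetah vs. tiger):   %.0f%%' % (100 * accuracy_for_gap(0.5)))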

That is my hands-on application of the SVM algorithm. If you run into problems, feel free to discuss.
