1. A Few Words Up Front
After finishing a Python course last year I barely touched the language again; I have been stuck on software projects and was getting thoroughly tired of them. Now that I finally have a bit of time for my own interests, I am starting my machine learning journey (really I just want to do data-related work, data analysis and the like, which led me to machine learning; my undergraduate major was mathematics, after all). Enough preamble; on to the code.
2.1 Understanding the KNN Algorithm
Simply put, KNN computes the Euclidean distance between each test sample and every point in the training set, then classifies by proximity. Its pseudocode is:
- Compute the Euclidean distance between every point in the labeled dataset and the current point
- Sort the distances in increasing order
- Take the k points closest to the current point
- Count how often each class appears among those k points
- Return the most frequent class among the k points as the predicted class of the current point
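The distance-and-sort steps above can be sketched in a few lines of NumPy (a minimal illustration; the array values are made up for demonstration):

```python
import numpy as np

# Hypothetical training set: four 2-D points, and a query point
train = np.array([[1.0, 1.1], [1.0, 1.0], [0.0, 0.0], [0.0, 0.1]])
query = np.array([0.0, 0.0])

# Broadcasting subtracts the query from every row at once
dists = np.sqrt(((train - query) ** 2).sum(axis=1))
order = np.argsort(dists)  # indices sorted by increasing distance
print(dists)
print(order)               # the nearest point comes first
```

Here `order` starts with the index of the training point closest to the query; the first k entries are the neighbors that get to vote.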
2.2 The Code
We test on the dating-site dataset to see whether KNN classification performs well.
import numpy as np
import operator

# Build a small test dataset
def createDataSet():
    group = np.array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
    labels = ['A', 'A', 'B', 'B']
    return group, labels
# Classification function (the KNN algorithm itself)
def classify(inX, dataSet, labels, k):
    dataSetSize = dataSet.shape[0]
    # np.tile() repeats inX dataSetSize times along the row axis,
    # so it can be subtracted from every training point at once
    diffMat = np.tile(inX, (dataSetSize, 1)) - dataSet
    sqDiffMat = diffMat ** 2
    sqDistance = sqDiffMat.sum(axis=1)
    distances = sqDistance ** 0.5
    # Indices of the training points sorted by increasing distance
    sortedDistIndicies = np.argsort(distances)
    classCount = {}
    for i in range(k):
        voteLabel = labels[sortedDistIndicies[i]]
        classCount[voteLabel] = classCount.get(voteLabel, 0) + 1
    # Sort the vote counts in decreasing order and return the winner
    sortedClassCount = sorted(classCount.items(), key=operator.itemgetter(1), reverse=True)
    return sortedClassCount[0][0]
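To see the classifier in action on the toy data, here is a compact, self-contained re-sketch of the same logic (using collections.Counter for the vote count instead of a manual dictionary; the behavior is equivalent):

```python
import numpy as np
from collections import Counter

def knn_classify(inX, dataSet, labels, k):
    # Euclidean distance from inX to every training point
    dists = np.sqrt(((dataSet - np.asarray(inX)) ** 2).sum(axis=1))
    # Labels of the k nearest neighbors
    nearest = [labels[i] for i in np.argsort(dists)[:k]]
    # Majority vote
    return Counter(nearest).most_common(1)[0][0]

group = np.array([[1.0, 1.1], [1.0, 1.0], [0.0, 0.0], [0.0, 0.1]])
labels = ['A', 'A', 'B', 'B']
print(knn_classify([0.0, 0.0], group, labels, 3))  # two B votes, one A vote -> 'B'
```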
# Load the dataset from a tab-separated text file
def file2matrix(filename):
    with open(filename) as fr:
        # Read all lines (the with-statement closes the file afterwards)
        arrayOLines = fr.readlines()
    numberOfLines = len(arrayOLines)
    # Allocate a numberOfLines-by-3 feature matrix
    returnMat = np.zeros((numberOfLines, 3))
    classLabelVector = []
    for index in range(numberOfLines):
        listFromLine = arrayOLines[index].strip().split('\t')
        returnMat[index, :] = listFromLine[0:3]
        classLabelVector.append(int(listFromLine[-1]))
    return returnMat, classLabelVector
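The parsing step can be exercised without the data file by running the same logic over a few in-memory lines (the sample rows below are invented, but follow the same three-features-plus-label format as datingTestSet2.txt):

```python
import numpy as np

# Hypothetical rows: three numeric features, then an integer class label
sample = ["40920\t8.32\t0.95\t3",
          "14488\t7.15\t1.67\t2"]

mat = np.zeros((len(sample), 3))
labels = []
for i, line in enumerate(sample):
    parts = line.strip().split('\t')
    mat[i, :] = parts[0:3]       # NumPy converts the strings to floats
    labels.append(int(parts[-1]))

print(mat)
print(labels)  # [3, 2]
```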
# Min-max normalization of the features
def autoNorm(dataSet):
    # Column-wise minima and maxima (1-D arrays)
    minVals = dataSet.min(0)
    maxVals = dataSet.max(0)
    # Column-wise ranges
    ranges = maxVals - minVals
    m = dataSet.shape[0]
    normDataSet = dataSet - np.tile(minVals, (m, 1))
    normDataSet = normDataSet / np.tile(ranges, (m, 1))
    return normDataSet, ranges, minVals
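Since NumPy broadcasts a 1-D array across the rows of a 2-D array automatically, the np.tile calls are not strictly necessary; a minimal equivalent sketch (with made-up data):

```python
import numpy as np

data = np.array([[10.0, 0.0], [20.0, 5.0], [30.0, 10.0]])
min_vals = data.min(axis=0)           # column-wise minimum
ranges = data.max(axis=0) - min_vals  # column-wise range
norm = (data - min_vals) / ranges     # broadcasting replaces np.tile
print(norm)                           # every column now spans [0, 1]
```

Normalization matters here because the raw features are on very different scales, and an unscaled feature with a large range would dominate the Euclidean distance.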
# Hold-out test: train on 95% of the data, test on the remaining 5%
def datingClassTest():
    hoRatio = 0.05
    datingDataMat, datingLabels = file2matrix('datingTestSet2.txt')
    normMat, ranges, minVals = autoNorm(datingDataMat)
    m = normMat.shape[0]
    numTestVecs = int(m * hoRatio)
    errorCount = 0.0
    for i in range(numTestVecs):
        classifierResult = classify(normMat[i, :], normMat[numTestVecs:m, :], datingLabels[numTestVecs:m], 3)
        print('The classifier came back with: %d, the real answer is: %d' % (classifierResult, datingLabels[i]))
        if classifierResult != datingLabels[i]:
            errorCount += 1.0
    print('The total error rate is: %f' % (errorCount / float(numTestVecs)))

datingClassTest()
We hold out 5% of the data for testing and train on the remaining 95%; the error rate comes out at 2%, which is quite satisfactory. The output is shown below.
runfile('/Volumes/文档/ML/KNN.py', wdir='/Volumes/文档/ML')
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 3, the real answer is: 2
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 2, the real answer is: 2
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 3, the real answer is: 3
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 1, the real answer is: 1
The classifier came back with: 2, the real answer is: 2
The total error rate is: 0.020000