In the logistic regression chapter of *Machine Learning in Action*, one line of the gradient ascent code is hard to understand:
weights = weights + alpha * dataMatrix.transpose() * error
The derivation of this update is as follows:
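A sketch of the standard derivation, consistent with the code below (this is a reconstruction, not the original author's figure). For labels $y_i \in \{0,1\}$ and predictions $h_i = \sigma(x_i^\top w)$, where $\sigma(z) = 1/(1+e^{-z})$, the log-likelihood of the data is

$$\ell(w) = \sum_{i=1}^{m} \big[\, y_i \log h_i + (1 - y_i)\log(1 - h_i) \,\big].$$

Using the identity $\sigma'(z) = \sigma(z)\,(1 - \sigma(z))$, the gradient simplifies term by term to

$$\nabla_w \ell(w) = \sum_{i=1}^{m} (y_i - h_i)\, x_i = X^\top (y - h).$$

Gradient *ascent* on $\ell$ with step size $\alpha$ is therefore

$$w \leftarrow w + \alpha\, X^\top (y - h),$$

which is exactly `weights + alpha * dataMatrix.transpose() * error`, since `error = labelMat - h`.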
Appendix: the logistic regression (gradient ascent) code
from numpy import mat, shape, ones, exp

def sigmoid(inX):
    return 1.0 / (1 + exp(-inX))

def gradAscent(dataMatIn, classLabels):
    dataMatrix = mat(dataMatIn)              # m x n feature matrix
    labelMat = mat(classLabels).transpose()  # m x 1 label column vector
    m, n = shape(dataMatrix)
    alpha = 0.001                            # learning rate
    maxCycles = 500                          # number of iterations
    weights = ones((n, 1))                   # n x 1 weight vector
    for k in range(maxCycles):
        h = sigmoid(dataMatrix * weights)    # m x 1 predicted probabilities
        error = labelMat - h                 # m x 1 residuals (y - h)
        # ascend the gradient of the log-likelihood: X^T (y - h)
        weights = weights + alpha * dataMatrix.transpose() * error
    return weights
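As a quick sanity check, the function can be run on a tiny linearly separable dataset (the data here is invented purely for illustration; the bias is modeled as a constant first column, as the book does):

```python
from numpy import mat, shape, ones, exp

def sigmoid(inX):
    return 1.0 / (1 + exp(-inX))

def gradAscent(dataMatIn, classLabels):
    dataMatrix = mat(dataMatIn)
    labelMat = mat(classLabels).transpose()
    m, n = shape(dataMatrix)
    alpha = 0.001
    maxCycles = 500
    weights = ones((n, 1))
    for k in range(maxCycles):
        h = sigmoid(dataMatrix * weights)
        error = labelMat - h
        weights = weights + alpha * dataMatrix.transpose() * error
    return weights

# Toy data: column 0 is a constant bias term, column 1 is the feature.
# Points with a positive feature are labeled 1, negative ones 0.
data = [[1.0, 2.0], [1.0, 3.0], [1.0, -2.0], [1.0, -3.0]]
labels = [1, 1, 0, 0]

w = gradAscent(data, labels)          # 2 x 1 learned weight vector
preds = sigmoid(mat(data) * w)        # 4 x 1 predicted probabilities
```

After 500 iterations the learned feature weight is positive and the probabilities fall on the correct side of 0.5 for all four points, confirming that the update indeed ascends the log-likelihood.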