Understanding the gradient-ascent update formula in the logistic regression algorithm from Machine Learning in Action (《机器学习实战》):
weights = weights + alpha * dataMatrix.transpose() * error

Appendix: the logistic regression code (Python):

```python
from numpy import exp, mat, shape, ones

def sigmoid(inX):
    # Logistic function: maps any real input into (0, 1)
    return 1.0 / (1 + exp(-inX))

def gradAscent(dataMatIn, classLabels):
    dataMatrix = mat(dataMatIn)              # m x n feature matrix
    labelMat = mat(classLabels).transpose()  # m x 1 column vector of labels
    m, n = shape(dataMatrix)
    alpha = 0.001                            # learning rate (step size)
    maxCycles = 500                          # number of gradient-ascent iterations
    weights = ones((n, 1))                   # n x 1 weight vector, initialized to ones
    for k in range(maxCycles):
        h = sigmoid(dataMatrix * weights)    # m x 1 vector of predicted probabilities
        error = labelMat - h                 # m x 1 vector of prediction errors
        # Batch update: move the weights along the gradient of the log-likelihood
        weights = weights + alpha * dataMatrix.transpose() * error
    return weights
```
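In matrix form this update reads w ← w + α·Xᵀ(y − h), where X is the m × n data matrix, y the m × 1 label vector, and h = σ(Xw) the vector of predicted probabilities; each iteration moves the weights along the gradient of the log-likelihood. As a quick sanity check, here is a minimal usage sketch. The four data points and their labels are made-up values for illustration only, with a constant 1.0 prepended to each row as a bias term (the convention the book's loadDataSet follows):

```python
# Hypothetical toy dataset: bias term 1.0, then two features per point.
dataMatIn = [[1.0, 0.5, 1.2],
             [1.0, 1.5, 0.3],
             [1.0, 3.1, 2.8],
             [1.0, 2.2, 3.0]]
classLabels = [0, 0, 1, 1]   # class-1 points sit up and to the right

weights = gradAscent(dataMatIn, classLabels)
print(weights)   # 3 x 1 matrix: bias weight plus one weight per feature
```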
The derivation of this formula is as follows: