Preface
Gradient descent, one of the most commonly used optimization algorithms in machine learning, comes in three variants:
- BGD: Batch Gradient Descent
- MBGD: Mini-batch Gradient Descent
- SGD: Stochastic Gradient Descent
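The three variants differ only in how many samples feed each parameter update: BGD uses the full training set, SGD a single random sample, and MBGD a small random subset. A minimal sketch of the three update rules for a one-parameter least-squares model (the function names, step counts, and toy data here are illustrative, not from the original post):

```python
import numpy as np

def gradient(theta, xb, yb):
    # Average least-squares gradient of (theta * x - y)^2 / 2 over the samples (xb, yb)
    return np.dot(xb, np.dot(xb, theta) - yb) / len(xb)

def step_bgd(theta, x, y, alpha):
    # Batch GD: one update computed from ALL samples
    return theta - alpha * gradient(theta, x, y)

def step_sgd(theta, x, y, alpha, rng):
    # Stochastic GD: one update computed from a SINGLE random sample
    i = rng.integers(len(x))
    return theta - alpha * gradient(theta, x[i:i+1], y[i:i+1])

def step_mbgd(theta, x, y, alpha, rng, batch=4):
    # Mini-batch GD: one update computed from a small random subset
    idx = rng.choice(len(x), size=batch, replace=False)
    return theta - alpha * gradient(theta, x[idx], y[idx])

rng = np.random.default_rng(0)
x = np.linspace(0.1, 2.0, 20)
y = 1.5 * x                      # ground-truth slope 1.5, no noise
theta = 0.0
for _ in range(500):
    theta = step_mbgd(theta, x, y, 0.1, rng)
print(theta)                     # converges close to 1.5
```

All three steps call the same gradient function; only the slice of data they see changes, which is why SGD and MBGD are cheaper per update but noisier than BGD.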
Batch Gradient Descent (BGD)
Code:
import numpy as np
import matplotlib.pyplot as plt

def BGD(x, y, alpha):
    # Fit a single weight theta for the model y ≈ theta * x by batch gradient descent
    theta = 0.0
    while True:
        # Predictions over the whole batch
        hypothesis = np.dot(x, theta)
        # Per-sample error
        loss = hypothesis - y
        # Gradient averaged over the full batch
        gradient = np.dot(x.transpose(), loss) / len(x)
        theta = theta - alpha * gradient
        # Stop once the gradient is effectively zero
        if abs(gradient) < 0.0001:
            break
    return theta
# Made-up sample data
x = np.array([0.2, 0.3, 0.5, 0.68, 0.8, 1.0, 1.15, 1.3, 1.7, 1.8, 1.5, 1.75, 1.7, 2.0])
y = np.array([0.7, 0.4, 1.0, 0.9, 1.4, 1.1, 1.25, 1.9, 2.2, 2.5, 1.7, 2.0, 2.6, 2.8])
# Learning rate
alpha = 0.04
# Batch gradient descent
weight = BGD(x, y, alpha)
print(weight)
# Plot all data points
plt.plot(x, y, 'ro')
# Plot the fitted line
plt.plot(x, x * weight)
# Display the figure
plt.show()
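For comparison, the same fit could be done with stochastic gradient descent, updating theta from one randomly chosen sample per step. A hedged sketch reusing the data above (the function name `SGD`, the fixed step count, and the seed are assumptions; a fixed number of steps replaces the convergence test because single-sample gradients are too noisy to drive below a small threshold):

```python
import numpy as np

def SGD(x, y, alpha, n_steps=2000, seed=0):
    # Stochastic gradient descent: one random sample per update
    rng = np.random.default_rng(seed)
    theta = 0.0
    for _ in range(n_steps):
        i = rng.integers(len(x))
        # Gradient of the squared error on sample i alone
        gradient = x[i] * (x[i] * theta - y[i])
        theta = theta - alpha * gradient
    return theta

x = np.array([0.2, 0.3, 0.5, 0.68, 0.8, 1.0, 1.15, 1.3, 1.7, 1.8, 1.5, 1.75, 1.7, 2.0])
y = np.array([0.7, 0.4, 1.0, 0.9, 1.4, 1.1, 1.25, 1.9, 2.2, 2.5, 1.7, 2.0, 2.6, 2.8])
w_sgd = SGD(x, y, 0.04)
print(w_sgd)
```

Because each update sees only one sample, the result fluctuates around the batch solution rather than settling exactly on it; that noise is the usual trade-off for SGD's cheaper updates.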