1 What is PCA
Principal Component Analysis:
●An unsupervised machine learning algorithm
●Mainly used for dimensionality reduction; reducing the dimensions can expose features that are easier for humans to interpret
●Other applications: visualization; denoising
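As a quick illustration of that dimensionality-reduction use (a minimal sketch assuming scikit-learn is installed; `PCA` and `explained_variance_ratio_` are names from `sklearn.decomposition`, not from this article's own code):

```python
import numpy as np
from sklearn.decomposition import PCA

# toy 2-D data with one dominant linear direction plus noise
rng = np.random.default_rng(0)
x0 = rng.uniform(0, 100, size=(100, 1))
data = np.hstack([x0, 0.75 * x0 + rng.normal(0, 10, size=(100, 1))])

pca = PCA(n_components=1)          # keep only the strongest direction
reduced = pca.fit_transform(data)  # project each 2-D point onto it

print(reduced.shape)                      # (100, 1): two features reduced to one
print(pca.explained_variance_ratio_[0])   # most of the variance is preserved
```

Because the second feature is mostly a linear function of the first, a single component captures nearly all of the variance.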
2 Solving the PCA problem with gradient ascent
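In symbols (matching the `f` and `df_math` functions in the code below, where X is the demeaned data matrix and m the number of samples): find the unit vector w that maximizes the variance of the data projected onto w.

```latex
% Objective implemented by f(w, x), maximized subject to ||w|| = 1:
\max_{\|w\|=1} f(w) = \frac{1}{m}\sum_{i=1}^{m}\bigl(X^{(i)} \cdot w\bigr)^{2}
% Its gradient, implemented by df_math(w, x):
\qquad
\nabla f(w) = \frac{2}{m}\,X^{\mathsf T}(Xw)
```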
3 Finding the principal component of the data
Finding the principal component with gradient ascent:
import numpy as np
import matplotlib.pyplot as plt

# generate 100 2-D points with a linear trend plus Gaussian noise
x = np.empty((100, 2))
x[:, 0] = np.random.uniform(0, 100, size=100)
x[:, 1] = x[:, 0] * 0.75 + 3 + np.random.normal(0, 10, size=100)
plt.scatter(x[:, 0], x[:, 1])
plt.show()
Output: (scatter plot of the generated points)
demean (center the data by subtracting the per-column mean):
def demean(x):
    return x - np.mean(x, axis=0)
x_demean = demean(x)
plt.scatter(x_demean[:, 0], x_demean[:, 1])
plt.show()
Output: (scatter plot of the demeaned points, now centered at the origin)
# check that the column means are (numerically) 0
print(np.mean(x_demean[:, 0]))
print(np.mean(x_demean[:, 1]))
>>>2.4655832930875478e-14
>>>-7.460698725481052e-16
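Rather than eyeballing those tiny printed values, the same check can be automated with `np.allclose` (a small self-contained sketch of the step above):

```python
import numpy as np

def demean(x):
    return x - np.mean(x, axis=0)

x = np.random.uniform(0, 100, size=(100, 2))
x_demean = demean(x)

# after demeaning, each column mean should be 0 up to floating-point error
print(np.allclose(np.mean(x_demean, axis=0), 0))  # True
```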
Gradient ascent:
def f(w, x):
    # objective: variance of the data projected onto w
    return np.sum((x.dot(w)) ** 2) / len(x)

def df_math(w, x):
    # gradient of f with respect to w
    return x.T.dot(x.dot(w)) * 2 / len(x)

def direction(w):
    return w / np.linalg.norm(w)  # normalize w to a unit vector

def gradient_ascent(x, initial_w, eta, n_iter=1e4, epsilon=1e-8):
    w = direction(initial_w)
    cur_iter = 0
    while cur_iter < n_iter:
        gradient = df_math(w, x)   # compute the gradient
        last_w = w
        w = w + eta * gradient     # step along the gradient
        w = direction(w)           # keep w a unit vector
        if abs(f(w, x) - f(last_w, x)) < epsilon:
            break
        cur_iter += 1
    return w
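Before running the ascent, one common sanity check (not part of the original text; `df_debug` is a name introduced here for illustration) is to compare `df_math` against a numerical gradient built from central differences:

```python
import numpy as np

def f(w, x):
    return np.sum((x.dot(w)) ** 2) / len(x)

def df_math(w, x):
    return x.T.dot(x.dot(w)) * 2 / len(x)

def df_debug(w, x, epsilon=1e-6):
    # approximate each partial derivative with a central difference
    res = np.empty(len(w))
    for i in range(len(w)):
        w_plus = w.copy()
        w_plus[i] += epsilon
        w_minus = w.copy()
        w_minus[i] -= epsilon
        res[i] = (f(w_plus, x) - f(w_minus, x)) / (2 * epsilon)
    return res

x = np.random.uniform(0, 100, size=(100, 2))
x = x - np.mean(x, axis=0)
w = np.random.random(x.shape[1])
print(np.allclose(df_math(w, x), df_debug(w, x)))  # True
```

`df_debug` works for any objective but is far slower, so it is only used to validate the analytic gradient.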
initial_w = np.random.random(x.shape[1]) # must not start from the zero vector (the gradient there is 0)
eta = 0.01
gradient_ascent(x_demean, initial_w, eta)
>>>array([0.76763209, 0.64089077])
w = gradient_ascent(x_demean, initial_w, eta)
plt.scatter(x_demean[:, 0], x_demean[:, 1])
plt.plot([0, w[0] * 50], [0, w[1] * 50], color = 'r') # draw a line from (0, 0) to (w[0] * 50, w[1] * 50)
plt.show()
Output: (the demeaned scatter with the first principal component drawn as a red line)
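The gradient-ascent result can also be cross-checked against a closed-form answer: the first principal component is the eigenvector of (1/m)·XᵀX with the largest eigenvalue. A sketch using `np.linalg.eigh` (eigenvectors are only defined up to sign, so the sign is fixed before comparing):

```python
import numpy as np

# rebuild the same kind of demeaned data as above
x = np.empty((100, 2))
x[:, 0] = np.random.uniform(0, 100, size=100)
x[:, 1] = x[:, 0] * 0.75 + 3 + np.random.normal(0, 10, size=100)
x_demean = x - np.mean(x, axis=0)

# eigh returns eigenvalues in ascending order for a symmetric matrix
vals, vecs = np.linalg.eigh(x_demean.T.dot(x_demean) / len(x_demean))
w_eig = vecs[:, -1]      # eigenvector of the largest eigenvalue
if w_eig[0] < 0:         # fix the sign for easier comparison
    w_eig = -w_eig
print(w_eig)             # should agree with the gradient-ascent w
```

Up to sampling noise in the random data, this unit vector matches the `w` found by `gradient_ascent`.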