At its core, PCA is a reconstruction of the original feature space: an orthogonal transformation converts a set of possibly linearly correlated variables into a set of linearly uncorrelated ones.
Two basic requirements characterize it: maximum projection variance (the chosen projection directions maximize the variance of the projected data) and minimum reconstruction cost (the dimensionality-reduced data loses as little information as possible relative to the original).
$$
\begin{gathered}
X=\begin{pmatrix} x_{1} & x_{2} & \cdots & x_{N} \end{pmatrix}^{T}=\begin{pmatrix} x_{1}^{T} \\ x_{2}^{T} \\ \vdots \\ x_{N}^{T} \end{pmatrix}=\begin{pmatrix} x_{11} & x_{12} & \cdots & x_{1p} \\ x_{21} & x_{22} & \cdots & x_{2p} \\ \vdots & \vdots & & \vdots \\ x_{N1} & x_{N2} & \cdots & x_{Np} \end{pmatrix}_{N \times p}\\
x_{i}\in \mathbb{R}^{p},\quad i=1,2,\cdots ,N\\
\text{Denote } 1_{N}=\begin{pmatrix}1 \\ 1 \\ \vdots \\ 1\end{pmatrix}_{N \times 1},\quad \mathbb{H}=I_{N}-\frac{1}{N}1_{N}1_{N}^{T}\\
\bar{x}=\frac{1}{N}X^{T}1_{N},\quad S=\frac{1}{N}X^{T}\mathbb{H}X
\end{gathered}
$$

Here $\mathbb{H}$ is the centering matrix: left-multiplying $X$ by it subtracts the column means, so $S$ is the (biased) sample covariance matrix.
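A minimal NumPy sketch of these definitions (the toy data and variable names are illustrative, not from the original text), checking that the centering-matrix formulas for $\bar{x}$ and $S$ agree with NumPy's built-in mean and biased covariance:

```python
import numpy as np

# Toy data matrix X of shape N x p: N samples, each a row vector in R^p.
N, p = 5, 3
rng = np.random.default_rng(0)
X = rng.normal(size=(N, p))

ones = np.ones((N, 1))                 # the 1_N vector
H = np.eye(N) - ones @ ones.T / N      # centering matrix H = I_N - (1/N) 1_N 1_N^T

x_bar = (X.T @ ones / N).ravel()       # sample mean:        x_bar = (1/N) X^T 1_N
S = X.T @ H @ X / N                    # sample covariance:  S = (1/N) X^T H X

# Sanity checks against NumPy's built-ins (bias=True gives the 1/N normalization).
assert np.allclose(x_bar, X.mean(axis=0))
assert np.allclose(S, np.cov(X, rowvar=False, bias=True))
```

Note that $\mathbb{H}$ is idempotent ($\mathbb{H}^2 = \mathbb{H}$) and symmetric, which is what makes the compact expression $S = \frac{1}{N}X^{T}\mathbb{H}X$ work.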