# Principal Component Analysis (PCA) and Kernel PCA

Suppose x_1, x_2, …, x_n are training samples with zero mean. The goal of PCA is to find a set of directions in the sample space that capture the maximum amount of variance in the data.

The projection of each sample x_j onto a normalized direction v (with ‖v‖ = 1) is v^T x_j.

The variance of the projections is (1/n) Σ_j (v^T x_j)^2 = v^T C v, where C = (1/n) Σ_j x_j x_j^T is the sample covariance matrix. Maximizing v^T C v subject to ‖v‖ = 1 shows that the optimal v is the eigenvector of C with the largest eigenvalue.

Projections of the data on the principal axes are called principal components, also known as PC scores; these can be seen as new, transformed variables. Let X be the data matrix whose rows are the samples and V the matrix whose columns are the eigenvectors of C. The j-th principal component is given by the j-th column of XV, and the coordinates of the i-th data point in the new PC space are given by the i-th row of XV.
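The eigendecomposition view above can be sketched in a few lines of NumPy. The data matrix below is synthetic (chosen only for illustration); the point is that the columns of XV are the PC scores and their variances equal the eigenvalues of C:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
X -= X.mean(axis=0)                 # center the data: PCA assumes zero mean

C = X.T @ X / len(X)                # sample covariance matrix C
eigvals, V = np.linalg.eigh(C)      # eigendecomposition of C
order = np.argsort(eigvals)[::-1]   # sort eigenpairs by decreasing variance
eigvals, V = eigvals[order], V[:, order]

scores = X @ V                      # PC scores: i-th row = i-th sample in PC space
```

The variance of each column of `scores` matches the corresponding eigenvalue, which is exactly the v^T C v = λ relation derived above.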

## Kernel PCA

Now let Φ be a feature map and let X be the matrix whose rows are the mapped samples Φ(x_j)^T, so the covariance matrix in feature space is C = (1/n) X^T X. Then C and X^T X have the same eigenvectors (they differ only by the factor 1/n). The problem is that Φ is implicit and unknown, so we need a way to solve the eigenvalue problem of X^T X with the help of the kernel function K.

The eigenvalue problem of K = X X^T is (X X^T) u = λ u. What we need is X^T X, so we left-multiply both sides by X^T to construct the eigenvalue problem we want:

X^T (X X^T) u = λ X^T u

(X^T X)(X^T u) = λ (X^T u)

That is, X^T u is an eigenvector of X^T X with the same eigenvalue λ, even though we never form X^T X explicitly.
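The identity above is easy to verify numerically. A minimal check on a random matrix (standing in for the mapped data, since Φ is implicit in practice): take the top eigenpair of XX^T and confirm that X^T u is an eigenvector of X^T X with the same eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 8))      # 5 samples in an 8-dimensional feature space

K = X @ X.T                      # K = XX^T, the 5x5 "kernel" matrix
lam, U = np.linalg.eigh(K)
u, l = U[:, -1], lam[-1]         # top eigenpair of XX^T

v = X.T @ u                      # candidate eigenvector of X^T X
# (X^T X) v = X^T (X X^T) u = λ X^T u = λ v
print(np.allclose(X.T @ X @ v, l * v))   # prints True
```

Note that K is only 5×5 here while X^T X would be 8×8; in kernel PCA the feature dimension can be huge or infinite, which is why working with K is the only option.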

Solve the following eigenvalue problem, which involves only the kernel matrix K with entries K_ij = Φ(x_i)^T Φ(x_j) = k(x_i, x_j):

K u = λ u

The projection of a test sample Φ(x) on the i-th eigenvector v_i = X^T u_i / √λ_i (the factor 1/√λ_i normalizes it, since ‖X^T u_i‖^2 = u_i^T K u_i = λ_i) can be computed by

v_i^T Φ(x) = (1/√λ_i) Σ_l u_i^(l) k(x_l, x)

where u_i^(l) denotes the l-th component of u_i. Everything is expressed through the kernel function, so Φ is never needed explicitly.
