Graph Laplacian Matrix and Locality Preserving Projection (LPP)

A while ago I tried to find Chinese-language material on the graph Laplacian matrix online, but couldn't find a single article, so I had to write one myself.

The underlying idea is actually not hard.

First, some basic concepts. PCA looks for the projection direction that maximizes the variance of the projected data:

$$\max_{w}\; w^T C w \quad \text{s.t.}\quad w^T w = 1,$$

where $C$ is the sample covariance matrix of the data; the solutions are the eigenvectors of $C$ with the largest eigenvalues.
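As a minimal sketch of this (the function name pca is my own; it computes the covariance matrix and keeps its top eigenvectors):

```python
import numpy as np

def pca(X, n_components=2):
    """Project rows of X onto the directions of largest variance."""
    Xc = X - X.mean(axis=0)                  # center the data
    C = Xc.T @ Xc / len(X)                   # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)     # ascending eigenvalues
    W = eigvecs[:, ::-1][:, :n_components]   # top eigenvectors of C
    return Xc @ W
```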
Then LDA, which maximizes between-class scatter relative to within-class scatter (the Fisher criterion):

$$\max_{w}\; \frac{w^T S_B w}{w^T S_W w},$$

which reduces to the generalized eigenvalue problem $S_B w = \lambda S_W w$.
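And a matching sketch for LDA (lda and the ridge term reg are my own names; the ridge guards against a singular within-class scatter matrix):

```python
import numpy as np
from scipy.linalg import eigh

def lda(X, labels, n_components=1, reg=1e-6):
    """Fisher LDA: solve S_B w = lambda S_W w, keep the largest eigenvalues."""
    d = X.shape[1]
    mean = X.mean(axis=0)
    Sw = np.zeros((d, d))                        # within-class scatter
    Sb = np.zeros((d, d))                        # between-class scatter
    for c in np.unique(labels):
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    vals, vecs = eigh(Sb, Sw + reg * np.eye(d))  # ascending eigenvalues
    return vecs[:, ::-1][:, :n_components]       # largest-ratio directions
```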
Then comes the construction of the optimization problem behind the graph Laplacian.

The graph Laplacian is about finding a low-dimensional manifold for high-dimensional vectors, one that preserves the distribution of the data. Concretely: points that are close in the high-dimensional space (a point here is just a high-dimensional vector; thinking of it as a point is more intuitive) should stay close in the low-dimensional space, and points that are far apart in the high-dimensional space should stay far apart in the low-dimensional space.

In this paper, the concrete way of preserving the data's structure is Locality Preserving Projection (LPP). It simplifies the problem by considering only each point's k nearest neighbors (so the whole dataset turns into a graph, and only the edges at each point matter), asking that those neighbors keep their nearby placement in the low-dimensional space while all other points are ignored.
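As a minimal sketch of this graph construction (the function name knn_weight_matrix and the parameters k and t are my own; the heat-kernel weights $S_{ij} = e^{-\|x_i - x_j\|^2 / t}$ follow one of the weighting schemes used in the paper):

```python
import numpy as np

def knn_weight_matrix(X, k=5, t=1.0):
    """Symmetric k-nearest-neighbor weight matrix S.

    X : (n, d) array, one data point per row.
    k : number of nearest neighbors kept per point.
    t : heat-kernel bandwidth, S_ij = exp(-||xi - xj||^2 / t).
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    d2 = np.maximum(d2, 0.0)               # clip float-rounding negatives
    np.fill_diagonal(d2, np.inf)           # a point is not its own neighbor

    S = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[:k]       # indices of the k closest points
        S[i, nbrs] = np.exp(-d2[i, nbrs] / t)
    # Symmetrize: keep an edge if either endpoint picked the other.
    return np.maximum(S, S.T)
```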

Mapping two high-dimensional points $x_i$ and $x_j$ down to one dimension gives $y_i$ and $y_j$; LPP wants to find a vector $w$ (with $y_i = w^T x_i$) that minimizes the following objective:

$$\min_{w}\; \sum_{ij} (y_i - y_j)^2 S_{ij},$$

where $S_{ij}$ is the weight of the edge between $x_i$ and $x_j$ (zero when they are not neighbors).
Then comes a stretch of algebra:

$$\frac{1}{2}\sum_{ij}(y_i - y_j)^2 S_{ij}
= \frac{1}{2}\sum_{ij}\left(w^T x_i - w^T x_j\right)^2 S_{ij}
= \sum_{i} w^T x_i D_{ii} x_i^T w - \sum_{ij} w^T x_i S_{ij} x_j^T w
= w^T X (D - S) X^T w
= w^T X L X^T w,$$

where $X = [x_1, \dots, x_n]$ and $D$ is the diagonal matrix with $D_{ii} = \sum_j S_{ij}$.
This $L = D - S$ is the graph Laplacian matrix we are after. The only thing to watch in the derivation is the subscript of $D_{ii}$; it is not all that complicated.
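As a quick numerical check of this identity (a sketch reusing the hypothetical knn_weight_matrix helper from above; rows of X are points here, so $w^T X L X^T w$ becomes y @ L @ y):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))        # 50 points in 10 dimensions
S = knn_weight_matrix(X, k=5, t=1.0)

D = np.diag(S.sum(axis=1))           # degree matrix, D_ii = sum_j S_ij
L = D - S                            # the graph Laplacian

w = rng.normal(size=10)
y = X @ w                            # one-dimensional projection, y_i = w' x_i

lhs = 0.5 * np.sum(S * (y[:, None] - y[None, :]) ** 2)
rhs = y @ L @ y
print(np.allclose(lhs, rhs))         # True
```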

Finally, adding the scale constraint $w^T X D X^T w = 1$ (otherwise $w$ could be shrunk arbitrarily), the optimization problem turns into a generalized eigenvalue problem:

$$X L X^T w = \lambda\, X D X^T w,$$

and the desired projection directions are the eigenvectors belonging to the smallest eigenvalues.
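A sketch of this final step using scipy (the function name lpp and the ridge term reg are my own; the ridge is there only because the matrix on the right can be singular in practice):

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, S, n_components=2, reg=1e-6):
    """Solve X L X' w = lambda X D X' w for the smallest eigenvalues.

    X : (n, d) array, rows are points (so X here is the paper's X transposed).
    S : (n, n) symmetric weight matrix.
    """
    D = np.diag(S.sum(axis=1))
    L = D - S
    A = X.T @ L @ X
    B = X.T @ D @ X + reg * np.eye(X.shape[1])   # regularized for stability
    eigvals, eigvecs = eigh(A, B)                # ascending eigenvalues
    W = eigvecs[:, :n_components]                # smallest eigenvalues first
    return X @ W                                 # the low-dimensional embedding

# Usage: Y = lpp(X, knn_weight_matrix(X, k=5, t=1.0), n_components=2)
```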
The rest of the paper also analyzes how LPP relates to LDA and PCA, and evaluates its performance on face recognition. I won't go into those details here; I wrote this post only to fill the blank on the graph Laplacian matrix.

References:

He, Xiaofei, et al. "Face Recognition Using Laplacianfaces." IEEE Transactions on Pattern Analysis and Machine Intelligence 27.3 (2005): 328-340.

