SINGULAR VALUE DECOMPOSITION


Singular Value Decomposition (SVD) is considered the final and best factorization of a matrix, as Professor Gilbert Strang of the Massachusetts Institute of Technology describes it. Here is what SVD says:

Any matrix A whatsoever can be decomposed into

$$A = U \Sigma V^T,$$

where $\Sigma$ is a diagonal matrix and $U$ and $V$ are orthogonal matrices.
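As a quick sanity check of this statement (my own addition, not part of the original note), here is a minimal NumPy sketch that computes the three factors with `np.linalg.svd` and confirms that they multiply back to A. The matrix `A` below is just an arbitrary example.

```python
import numpy as np

# An arbitrary 4-by-3 example matrix (any A works).
A = np.array([[3.0, 1.0, 2.0],
              [0.0, 2.0, 1.0],
              [1.0, 0.0, 4.0],
              [2.0, 2.0, 0.0]])

# Full SVD: U is 4x4 orthogonal, Vt is 3x3 orthogonal, s holds the singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Build the 4x3 "diagonal" matrix Sigma from the singular values.
Sigma = np.zeros_like(A)
np.fill_diagonal(Sigma, s)

# A is recovered as U Sigma V^T, and U, V are orthogonal.
print(np.allclose(A, U @ Sigma @ Vt))       # True
print(np.allclose(U.T @ U, np.eye(4)))      # True
print(np.allclose(Vt @ Vt.T, np.eye(3)))    # True
```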

The question is: how could this be possible? After some thorough thought, I will give a natural motivation for this perfect decomposition of a matrix in this paper.


Figure 1. Four fundamental subspaces of matrix X

Proof of SVD

Suppose we are given an m-by-n matrix X whose rank is r. Since any matrix gives rise to four fundamental subspaces, we can depict them for X as in Figure 1.

Through Gram-Schmidt orthogonalization, we can always obtain an orthonormal basis for each of the four subspaces of X (a numerical sketch follows this list):

$v_1, \dots, v_r$ form an orthonormal basis of the row space of X,

$v_{r+1}, \dots, v_n$ form an orthonormal basis of the null space of X,

$u_1, \dots, u_r$ form an orthonormal basis of the column space of X,

$u_{r+1}, \dots, u_m$ form an orthonormal basis of the left null space of X.
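To make the picture concrete, here is a small sketch of mine that computes orthonormal bases for all four subspaces of a sample X. It leans on SciPy's `orth` and `null_space` helpers instead of spelling out the Gram-Schmidt loop; the sample matrix and variable names are my own choices.

```python
import numpy as np
from scipy.linalg import orth, null_space

# A sample 4-by-3 matrix of rank 2 (its rows are deliberately dependent).
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0],
              [0.0, 2.0, 2.0]])

V_row  = orth(X.T)        # columns: orthonormal basis of the row space       (v_1 .. v_r)
V_null = null_space(X)    # columns: orthonormal basis of the null space      (v_{r+1} .. v_n)
U_col  = orth(X)          # columns: orthonormal basis of the column space    (u_1 .. u_r)
U_left = null_space(X.T)  # columns: orthonormal basis of the left null space (u_{r+1} .. u_m)

# Dimensions come out as r, n-r, r, m-r, here 2, 1, 2, 2.
print(V_row.shape[1], V_null.shape[1], U_col.shape[1], U_left.shape[1])
```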

Here comes a challenging problem which will motivate SVD naturally. We are asked to find a linear transform, given by an m-by-n matrix A, that makes a subtle connection between the row space and the column space, i.e.,

$$A v_i = \sigma_i u_i, \qquad i = 1, \dots, r,$$

where

$$\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_r > 0$$

are positive scaling factors.
Furthermore, taking the null space and the left null space of X into consideration, we can express this problem explicitly as the following matrix equation:

$$A \begin{bmatrix} v_1 & \cdots & v_r & v_{r+1} & \cdots & v_n \end{bmatrix}
= \begin{bmatrix} u_1 & \cdots & u_r & u_{r+1} & \cdots & u_m \end{bmatrix}
\begin{bmatrix} \sigma_1 & & & \\ & \ddots & & \\ & & \sigma_r & \\ & & & O \end{bmatrix},$$

where the rightmost factor is m-by-n, with $\sigma_1, \dots, \sigma_r$ on its main diagonal and zeros everywhere else (the basis vectors of the null space are sent to zero, and the basis vectors of the left null space never appear as images).
Letting

$$V = \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix}, \qquad
U = \begin{bmatrix} u_1 & \cdots & u_m \end{bmatrix}, \qquad
\Sigma = \begin{bmatrix} \sigma_1 & & & \\ & \ddots & & \\ & & \sigma_r & \\ & & & O \end{bmatrix}_{m \times n}$$

yields

$$AV = U\Sigma.$$
Since V is an orthogonal matrix, V is invertible and $V^{-1} = V^T$. Multiplying on the right by $V^T$, we obtain Equation (*):

$$A = U \Sigma V^T. \qquad (*)$$
Thus, if we can find appropriate matrices U, V, and $\Sigma$, the proof of SVD is done. Now let's find them.

Using Equation (*), we have

$$A^T A = (U \Sigma V^T)^T (U \Sigma V^T) = V \Sigma^T U^T U \Sigma V^T = V (\Sigma^T \Sigma) V^T.$$
Note that $A^T A$ is a symmetric matrix. Moreover, a symmetric matrix S of order n has n independent eigenvectors and can be decomposed as

$$S = Q \Lambda Q^T,$$

where the columns of Q are the eigenvectors of S and $\Lambda$ is a diagonal matrix carrying the eigenvalues of S. We can therefore conclude that the columns of V are just the eigenvectors of $A^T A$, and that the entries on the main diagonal of the diagonal matrix $\Sigma^T \Sigma$ between V and $V^T$ are just the eigenvalues of $A^T A$; in other words, $\sigma_i^2 = \lambda_i$, so each singular value is the square root of an eigenvalue of $A^T A$.

Likewise, we can find U in exactly the same way:

$$A A^T = (U \Sigma V^T)(U \Sigma V^T)^T = U \Sigma V^T V \Sigma^T U^T = U (\Sigma \Sigma^T) U^T.$$

Therefore, the columns of U are just the eigenvectors of $A A^T$, and the entries on the main diagonal of the diagonal matrix $\Sigma \Sigma^T$ between U and $U^T$ are just the eigenvalues of $A A^T$.
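Both conclusions can be checked numerically. The sketch below (again my own illustration, assuming NumPy and reusing the earlier example matrix) verifies that the eigenvalues of $A^T A$ and $A A^T$ reproduce the squared singular values, which is exactly what $\Sigma^T \Sigma$ and $\Sigma \Sigma^T$ carry, and that each column of V satisfies the eigenvector equation for $A^T A$.

```python
import numpy as np

# Same arbitrary example matrix as before (any A works).
A = np.array([[3.0, 1.0, 2.0],
              [0.0, 2.0, 1.0],
              [1.0, 0.0, 4.0],
              [2.0, 2.0, 0.0]])

U, s, Vt = np.linalg.svd(A)

# Eigenvalues of the symmetric matrices A^T A and A A^T, largest first.
evals_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
evals_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]

# Both eigenvalue lists reproduce the squared singular values
# (A A^T carries extra zero eigenvalues when m > n).
print(np.allclose(evals_AtA, s**2))            # True
print(np.allclose(evals_AAt[:len(s)], s**2))   # True

# Each column v_i of V satisfies (A^T A) v_i = sigma_i^2 v_i.
for i in range(len(s)):
    v_i = Vt[i]                                       # i-th row of V^T = i-th column of V
    print(np.allclose(A.T @ A @ v_i, s[i]**2 * v_i))  # True for every i
```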

Since U, V, and $\Sigma$ have all been nailed down, it is time to declare victory in proving the Singular Value Decomposition: any matrix A whatsoever can be decomposed into

$$A = U \Sigma V^T.$$

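To close the loop, here is a sketch of mine that builds the factors exactly as the proof prescribes: take V and the eigenvalues $\lambda_i$ from $A^T A$, set $\sigma_i = \sqrt{\lambda_i}$, and recover the columns of U from $u_i = A v_i / \sigma_i$. The helper `svd_from_eigen` is hypothetical and, for simplicity, assumes A has full column rank, so the reduced (thin) factors already reconstruct A.

```python
import numpy as np

def svd_from_eigen(A):
    """Reduced SVD built from the eigendecomposition of A^T A.

    Assumes A (m x n, m >= n) has full column rank, so every sigma_i is
    positive and u_i = A v_i / sigma_i gives orthonormal columns of U.
    """
    # Symmetric eigendecomposition of A^T A (eigenvalues come back ascending).
    lam, V = np.linalg.eigh(A.T @ A)
    # Reorder so the largest eigenvalue, hence the largest sigma, comes first.
    order = np.argsort(lam)[::-1]
    lam, V = lam[order], V[:, order]
    sigma = np.sqrt(lam)
    # Columns of U from u_i = A v_i / sigma_i (column-wise broadcasting).
    U = (A @ V) / sigma
    return U, sigma, V

A = np.array([[3.0, 1.0, 2.0],
              [0.0, 2.0, 1.0],
              [1.0, 0.0, 4.0],
              [2.0, 2.0, 0.0]])

U, sigma, V = svd_from_eigen(A)
print(np.allclose(A, U @ np.diag(sigma) @ V.T))   # True: A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(3)))            # True: columns of U are orthonormal
```

In the general rank-deficient case one would keep only the positive $\sigma_i$ and extend $u_1, \dots, u_r$ to a full orthonormal basis of $\mathbb{R}^m$, exactly as the proof does with the left null space.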

Postscript

Laymen tend to look upon a matrix as a simple box of numbers, while mathematicians like us are able to see what lies inside this magical box, namely the four fundamental subspaces of a matrix.
