
Matrix Factorization (rank decomposition): A Roundup of Articles and Code


This post collects nearly all existing matrix factorization algorithms and applications. Original source: https://sites.google.com/site/igorcarron2/matrixfactorizations

Matrix decompositions have a long history and generally center on a set of well-known factorizations such as LU, QR, SVD, and eigendecompositions. More recent factorizations came to light with work that began with the advent of NMF, k-means, and related algorithms [1]. However, with the arrival of new methods based on random projections and convex optimization, which started in part in the compressive sensing literature, we are seeing another surge of very diverse algorithms dedicated to many different kinds of matrix factorizations, with new constraints based on rank and/or positivity and/or sparsity. As a result of this large increase in interest, I have decided to keep a list of them here, following the success of the big picture in compressive sensing.

The sources for this list include the following most excellent sites: Stephen Becker's page, Raghunandan H. Keshavan's page, Nuclear Norm and Matrix Recovery through SDP by Christoph Helmberg, and Arvind Ganesh's Low-Rank Matrix Recovery and Completion via Convex Optimization, all of which provide more in-depth information. Additional codes were also featured on Nuit Blanche. The following people provided additional input: Olivier Grisel and Matthieu Puigt.

Most of the algorithms listed below rely on the nuclear norm as a proxy for the rank functional, which may not be optimal. Currently, CVX (Michael Grant and Stephen Boyd) allows one to explore other proxies for the rank functional, such as the log-det heuristic of Maryam Fazel, Haitham Hindi, and Stephen Boyd. ** marks algorithms that use a heuristic other than the nuclear norm.

In terms of notation, A refers to a matrix, L to a low-rank matrix, S to a sparse one, and N to a noisy one. This page lists the different codes that implement the following matrix factorizations: Matrix Completion, Robust PCA, Noisy Robust PCA, Sparse PCA, NMF, Dictionary Learning, MMV, Randomized Algorithms, and other factorizations. Some of these toolboxes implement several of these decompositions and are listed accordingly. Before I list an algorithm here, I generally feature it on Nuit Blanche under the MF tag: http://nuit-blanche.blogspot.com/search/label/MF. You can also subscribe to the Nuit Blanche feed.


Matrix Completion: A = H.*L with H a known mask and L unknown; solve for the L of lowest possible rank.

The idea of this approach is to complete the unknown coefficients of a matrix based on the fact that the matrix is low rank:
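
For concreteness, here is a minimal SoftImpute-style sketch in NumPy. It is not one of the packaged solvers from the original list; the threshold tau and the iteration count are illustrative placeholders rather than tuned values.

```python
import numpy as np

def complete_matrix(A, mask, tau=1.0, n_iters=200):
    """Fill the unobserved entries of A (mask == False) with a low-rank
    estimate via iterative singular value soft-thresholding, a
    SoftImpute-style heuristic for nuclear-norm minimization."""
    L = np.where(mask, A, 0.0)            # start from the observed entries
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(L, full_matrices=False)
        s = np.maximum(s - tau, 0.0)      # soft-threshold singular values
        L = (U * s) @ Vt                  # low-rank reconstruction
        L = np.where(mask, A, L)          # keep observed entries fixed
    return L

# toy example: a rank-2 matrix with roughly half the entries observed
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
mask = rng.random(X.shape) < 0.5
X_hat = complete_matrix(np.where(mask, X, 0.0), mask)
print(np.linalg.norm((X_hat - X)[~mask]) / np.linalg.norm(X[~mask]))
```

The soft-thresholding of singular values is the proximal operator of the nuclear norm, which is exactly how the nuclear-norm proxy for rank discussed above enters the computation.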

Noisy Robust PCA: A = L + S + N with L, S, N unknown; solve for L low rank, S sparse, and N the noise.

Robust PCA: A = L + S with L and S unknown; solve for L low rank and S sparse.
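
For intuition, below is a minimal ADMM sketch of principal component pursuit in NumPy, solving min ||L||_* + lam*||S||_1 subject to L + S = A. The default lam = 1/sqrt(max(m, n)) follows the standard PCP recommendation; mu and the iteration count are illustrative assumptions. Relaxing the equality constraint (or stopping early) gives the noisy variant A = L + S + N above.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the prox of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def shrink(X, tau):
    """Elementwise soft-thresholding: the prox of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def robust_pca(A, lam=None, mu=None, n_iters=200):
    """Split A into low-rank L and sparse S by ADMM on
    min ||L||_* + lam*||S||_1  s.t.  L + S = A."""
    m, n = A.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else (m * n) / (4.0 * np.abs(A).sum())
    L, S, Y = (np.zeros_like(A) for _ in range(3))
    for _ in range(n_iters):
        L = svt(A - S + Y / mu, 1.0 / mu)       # low-rank update
        S = shrink(A - L + Y / mu, lam / mu)    # sparse update
        Y = Y + mu * (A - L - S)                # dual ascent on L + S = A
    return L, S

# toy example: a rank-4 matrix corrupted by 5% large sparse errors
rng = np.random.default_rng(0)
L0 = rng.standard_normal((60, 4)) @ rng.standard_normal((4, 60))
S0 = (rng.random((60, 60)) < 0.05) * rng.standard_normal((60, 60)) * 10
L, S = robust_pca(L0 + S0)
print(np.linalg.norm(L - L0) / np.linalg.norm(L0))
```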

Sparse PCA: A = DX with unknown D and X; solve for sparse D (a quick scikit-learn sketch follows the list below).

Sparse PCA on wikipedia

  • R. Jenatton, G. Obozinski, F. Bach. Structured Sparse Principal Component Analysis. International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf] [code]
  • SPAMs
  • DSPCA: Sparse PCA using SDP. Code is here.
  • PathPCA: A fast greedy algorithm for Sparse PCA. The code is here.
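
For a quick experiment without installing the packages above, here is a sketch using scikit-learn's SparsePCA (an assumption of convenience, not one of the codes listed). scikit-learn factors the data as codes @ components_ with a sparse components_ matrix, which matches A = DX with a sparse factor up to transposition.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 30))

# alpha controls the sparsity of the loadings: larger alpha, more zeros
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
codes = spca.fit_transform(X)          # X is approximated by codes @ components_
print(spca.components_.shape)          # (5, 30)
print(np.mean(spca.components_ == 0))  # fraction of zero loadings
```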

Dictionary Learning: A = DX with unknown D and X; solve for sparse X.

Some implementations of dictionary learning also implement NMF.
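
A minimal sketch with scikit-learn's MiniBatchDictionaryLearning (again an assumption of convenience rather than one of the dedicated packages): it learns a dictionary and then computes sparse codes by OMP, matching A = DX with sparse X up to transposition.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 64))       # 200 signals of dimension 64

# learn a 100-atom dictionary; codes are constrained to 5 nonzeros each
dl = MiniBatchDictionaryLearning(n_components=100,
                                 transform_algorithm='omp',
                                 transform_n_nonzero_coefs=5,
                                 random_state=0)
codes = dl.fit(A).transform(A)           # sparse codes, one row per signal
D = dl.components_                       # learned dictionary, shape (100, 64)
print(codes.shape, D.shape, np.count_nonzero(codes[0]))
```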

NMF: A = DX with unknown D and X; solve for D and X with all elements nonnegative.

Non-negative Matrix Factorization (NMF) on wikipedia
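
A short sketch with scikit-learn's NMF, where W plays the role of D and H the role of X in the notation above; n_components and max_iter are illustrative choices.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
A = np.abs(rng.standard_normal((100, 40)))   # NMF requires nonnegative input

nmf = NMF(n_components=10, init='nndsvda', max_iter=500, random_state=0)
W = nmf.fit_transform(A)                     # W >= 0
H = nmf.components_                          # H >= 0
print(np.linalg.norm(A - W @ H) / np.linalg.norm(A))   # relative error
```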

Multiple Measurement Vector (MMV): Y = AX with X unknown and row-sparse (its columns share a common sparse support).
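
For illustration, here is a minimal simultaneous-OMP (SOMP) sketch in NumPy; somp is a small helper written for this post, not a reference implementation, and it assumes the number of nonzero rows k is known in advance.

```python
import numpy as np

def somp(A, Y, k):
    """Simultaneous OMP: recover a row-sparse X from Y = A @ X by greedily
    picking the atom most correlated with the residual across all columns."""
    n = A.shape[1]
    support, R = [], Y.copy()
    for _ in range(k):
        scores = np.abs(A.T @ R).sum(axis=1)   # shared-support criterion
        scores[support] = -np.inf              # do not re-pick atoms
        support.append(int(np.argmax(scores)))
        X_s, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ X_s            # residual on current support
    X = np.zeros((n, Y.shape[1]))
    X[support] = X_s
    return X

# toy MMV problem: 3 shared nonzero rows across 5 measurement vectors
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
X_true = np.zeros((50, 5))
X_true[[3, 17, 41]] = rng.standard_normal((3, 5))
X_hat = somp(A, A @ X_true, k=3)
print(np.nonzero(X_hat.any(axis=1))[0])        # recovered support
```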

Blind Source Separation (BSS): Y = AX with A and X unknown, assuming statistical independence between the columns of X or between subspaces of columns of X.

This includes Independent Component Analysis (ICA), Independent Subspace Analysis (ISA), and Sparse Component Analysis (SCA). There are many available codes for ICA and some for SCA. Here is a non-exhaustive list of some famous ones (which are not limited to linear instantaneous mixtures). TBC

ICA:
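
In lieu of the link list, a minimal FastICA sketch using scikit-learn on a toy two-source mixture; the sources and mixing matrix are made up for illustration, and the recovered sources come back only up to permutation and scale.

```python
import numpy as np
from sklearn.decomposition import FastICA

# two independent non-Gaussian sources, linearly mixed
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]   # source signals
A_mix = np.array([[1.0, 0.5], [0.4, 1.0]])         # mixing matrix
Y = S @ A_mix.T                                    # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(Y)      # recovered sources (up to order/scale)
print(ica.mixing_.shape)          # estimated mixing matrix, shape (2, 2)
```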

SCA:

Randomized Algorithms

These algorithms generally use random projections to shrink very large problems into smaller ones that are amenable to traditional matrix factorization methods.
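
A minimal NumPy sketch of a randomized range-finder SVD in the spirit of Halko, Martinsson, and Tropp; the oversampling p and the power-iteration count q are illustrative defaults.

```python
import numpy as np

def randomized_svd(A, k, p=10, q=2):
    """Approximate rank-k SVD: sketch the range of A with a random test
    matrix, then compute an exact SVD of the much smaller projection."""
    n = A.shape[1]
    G = np.random.default_rng(0).standard_normal((n, k + p))
    Y = A @ G                           # random sketch of the range of A
    for _ in range(q):
        Y = A @ (A.T @ Y)               # power iterations sharpen the sketch
    Q, _ = np.linalg.qr(Y)              # orthonormal basis for the range
    B = Q.T @ A                         # small (k + p) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]

# toy example: a 2000 x 1000 matrix of exact rank 50
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 50)) @ rng.standard_normal((50, 1000))
U, s, Vt = randomized_svd(A, k=50)
print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))  # near zero
```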

Resources
  • Randomized algorithms for matrices and data by Michael W. Mahoney
  • Randomized Algorithms for Low-Rank Matrix Decomposition

Other factorizations

D(T(.)) = L + E with L, E, and the transformation T unknown; solve for the transformation T, the low-rank L, and the noise E.

Frameworks featuring advanced Matrix factorizations

For the time being, few have integrated the most recent factorizations.

GraphLab / Hadoop

Books

Example of use

Sources

Arvind Ganesh‘s Low-Rank Matrix Recovery and Completion via Convex Optimization

Relevant links

Reference:

[1] Ajit P. Singh and Geoffrey J. Gordon, A Unified View of Matrix Factorization Models.

