Rachel Zhang
1. Brief Introduction
1. Why use Group Sparsity?
An observation that features or data items within a group are expected to share the same sparsity pattern in their latent factor (low-rank) representation.
2. Difference from NMF.
As a variation of traditional NMF, Group NMF incorporates group sparsity regularization into the NMF formulation.
3. Application.
Dimension reduction, noise removal in text mining, bioinformatics, blind source separation, computer vision. Group NMF enables a natural interpretation of the discovered latent factors.
4. What is a group?
Different types of features, such as in CV: pixel values, gradient features, 3D pose features, etc. Features of the same type form one group; see the sketch below.
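A hypothetical sketch (array shapes, variable names, and feature types here are illustrative assumptions, not from these notes) of how the rows of the data matrix X can be partitioned into feature-type groups:

import numpy as np

# Illustrative only: stack different feature types as row blocks of X,
# so that each block of rows forms one group.
n_items = 100                                    # data items (columns of X)
pixel_feats    = np.random.rand(256, n_items)    # group 1: pixel-value features
gradient_feats = np.random.rand(128, n_items)    # group 2: gradient features
pose_feats     = np.random.rand(48,  n_items)    # group 3: 3D pose features

X = np.vstack([pixel_feats, gradient_feats, pose_feats])

# Record which rows belong to which group (consumed later by a group-sparse penalty).
group_sizes = [256, 128, 48]
group_index = np.repeat(np.arange(len(group_sizes)), group_sizes)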
2. Related work
5. Related work on Group Sparsity
5.1. Lasso (The Least Absolute Shrinkage and Selection Operator)
l1-norm penalized linear regression
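A minimal sketch of one common Lasso formulation (the symbols y, X, beta, and lambda are assumed notation, not from these notes):

\min_{\beta} \; \tfrac{1}{2} \| y - X\beta \|_2^2 + \lambda \|\beta\|_1 ,

where y is the response vector, X the design matrix, and lambda >= 0 controls how many entries of beta are driven to zero.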
5.2. Group Lasso
Group sparsity using l1,2-norm regularization, where the sqrt(p_l) terms in the penalty (see the sketch below) account for the varying group sizes.
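A sketch of the standard Group Lasso objective (Yuan and Lin), assuming the coefficients are partitioned into L groups G_1, ..., G_L with sizes p_1, ..., p_L:

\min_{\beta} \; \tfrac{1}{2} \| y - X\beta \|_2^2 + \lambda \sum_{l=1}^{L} \sqrt{p_l} \, \| \beta_{G_l} \|_2 .

The l2 norm inside each group combined with the l1-style sum across groups gives the l1,2 regularization, so entire groups of coefficients are set to zero together.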
5.3. Sparse group lasso
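A sketch of the usual sparse group lasso objective, which adds an element-wise l1 term to the Group Lasso penalty so that sparsity is encouraged both across groups and within each selected group (the weights lambda_1 and lambda_2 are assumed names):

\min_{\beta} \; \tfrac{1}{2} \| y - X\beta \|_2^2 + \lambda_1 \|\beta\|_1 + \lambda_2 \sum_{l=1}^{L} \sqrt{p_l} \, \| \beta_{G_l} \|_2 .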
5.4. Hierarchical regularization with tree structure, 2010
R. Jenatton, J. Mairal, G. Obozinski, and F. Bach. "Proximal methods for sparse hierarchical dictionary learning". ICML 2010.
5.5. There are other works that focus on group sparsity for PCA.
6. Related work on NMF
6.1. Affine NMF: extending NMF with an offset vector. Affine NMF simultaneously factorizes the data and fits an offset that captures a baseline shared by all data items; see the sketch below.
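A sketch of the affine NMF model as it is commonly written (the symbol w_0 is an assumed name for the offset vector):

X \approx W H + w_0 \mathbf{1}^T, \qquad W \ge 0, \; H \ge 0,

where w_0 \in R^m is replicated across all n columns, so the factors W and H only need to explain the variation around this common baseline.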
3. Problem to Solve
Consider a matrix X ∈ R^{m×n}. Assume that the rows of X represent features and the columns of X represent data items.
7. In standard NMF, we are interested in discovering two low-rank factor matrices W and H by minimizing an objective function:
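Presumably the standard Frobenius-norm objective, sketched here assuming a target rank k ≪ min(m, n) and the convention that the columns of X are data items:

\min_{W \ge 0, \, H \ge 0} \; \tfrac{1}{2} \| X - W H \|_F^2, \qquad W \in \mathbb{R}_+^{m \times k}, \; H \in \mathbb{R}_+^{k \times n}.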