Comprehensions on Group NMF


I have recently been reading about group sparsity and group structure, and this post summarizes one particular case: the group sparsity obtained by applying these ideas to NMF. The theoretical paper itself has not been cited much, but a paper that applied Group NMF to EEG has been fairly influential; see the references for details. In general, group sparsity (or plain sparsity) is easy to interpret when the data carry physical meaning: we usually expect the basis (or, say, the features) underlying one kind of data to differ from the basis underlying another, so groups can be formed over samples or over features. Group-structured methods are by now fairly well explored, from the original group Lasso to group PCA, and now to the Group NMF discussed in this post. Whether the low citation count means few people care or the method simply does not work well, I cannot say. I cannot help complaining about today's performance-chasing research environment: if applying a method does not improve the numbers, nobody can publish a paper about it... Enough digression; let's look at Group NMF!






Rachel Zhang

 

1.      Brief Introduction

1.      Why use Group Sparsity?

The key observation is that features or data items within a group are expected to share the same sparsity pattern in their latent factor representation.

In other words, the group sparsity assumption is that data items (or features) belonging to the same group use a similar subset of latent components in the low-rank representation (a toy illustration is given at the end of this section).

2.      Difference from NMF

As a variation of traditional NMF, Group NMF considers group sparsity regularization methods for NMF.

3.      Application.

Dimension reduction, noise removal in text mining, bioinformatics, blind source separation, computer vision. Group NMF enables a natural interpretation of the discovered latent factors.

4.      What is a group?

Different types of features, such as in computer vision: pixel values, gradient features, 3D pose features, etc. Features of the same type form one group.
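
A toy illustration of the shared-sparsity-pattern assumption from item 1: in the hypothetical coefficient matrix H below, columns belonging to the same group have zeros in the same latent components (all numbers invented for illustration).

```python
import numpy as np

# Hypothetical coefficient matrix H (4 latent components x 6 data items).
# Columns 0-2 form group 1, columns 3-5 form group 2.
H = np.array([
    [0.9, 0.7, 0.8, 0.0, 0.0, 0.0],   # component used only by group 1
    [0.4, 0.5, 0.3, 0.0, 0.0, 0.0],   # component used only by group 1
    [0.0, 0.0, 0.0, 0.6, 0.9, 0.7],   # component used only by group 2
    [0.2, 0.1, 0.3, 0.5, 0.4, 0.6],   # component shared by both groups
])

groups = {"group 1": slice(0, 3), "group 2": slice(3, 6)}
for name, cols in groups.items():
    # Rows that are entirely zero for this group: the shared sparsity pattern.
    inactive = np.where(np.all(H[:, cols] == 0, axis=1))[0]
    print(name, "does not use components", inactive)
```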


2.      Related work

5.      Related work on Group Sparsity

5.1.      Lasso (The Least Absolute Shrinkage and Selection Operator)

l1-norm penalized linear regression

\min_{\beta}\ \frac{1}{2}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1        (1)
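
For illustration, a minimal scikit-learn sketch of (1) on synthetic data (the data are random and purely illustrative; note that sklearn's Lasso scales the squared loss by 1/(2n), so alpha corresponds to λ only up to that factor):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
X = rng.randn(100, 20)                     # design matrix
beta_true = np.zeros(20)
beta_true[:3] = [2.0, -1.5, 1.0]           # only 3 informative coefficients
y = X @ beta_true + 0.1 * rng.randn(100)

# The l1 penalty (alpha ~ lambda) drives most coefficients exactly to zero.
model = Lasso(alpha=0.1).fit(X, y)
print("non-zero coefficients:", np.flatnonzero(model.coef_))
```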



5.2.      Group Lasso

Group sparsity using l1,2-norm regularization

\min_{\beta}\ \frac{1}{2}\|y - X\beta\|_2^2 + \lambda\sum_{l=1}^{L}\sqrt{p_l}\,\|\beta_l\|_2        (2)

where \beta_l is the sub-vector of coefficients in group l, p_l is the size of group l, and the \sqrt{p_l} terms account for the varying group sizes.

5.3.      Sparse group lasso

\min_{\beta}\ \frac{1}{2}\|y - X\beta\|_2^2 + \lambda_1\sum_{l=1}^{L}\sqrt{p_l}\,\|\beta_l\|_2 + \lambda_2\|\beta\|_1        (3)

which combines the group-wise penalty of (2) with the element-wise l1 penalty of (1), so that sparsity is induced both across groups and within groups.
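
A small numpy sketch of how the penalty terms in (2) and (3) are evaluated for a given coefficient vector; the coefficients and the group partition below are made up for illustration:

```python
import numpy as np

beta = np.array([0.0, 0.0, 0.0, 1.2, -0.7, 0.0, 0.3, 0.0])
# Hypothetical partition of the 8 coefficients into 3 groups.
groups = [np.array([0, 1, 2]), np.array([3, 4, 5]), np.array([6, 7])]

# Group-lasso penalty from (2): sum_l sqrt(p_l) * ||beta_l||_2
group_penalty = sum(np.sqrt(len(g)) * np.linalg.norm(beta[g]) for g in groups)

# Extra element-wise l1 term that the sparse group lasso (3) adds on top.
l1_penalty = np.abs(beta).sum()

lambda1, lambda2 = 0.5, 0.1
sparse_group_penalty = lambda1 * group_penalty + lambda2 * l1_penalty
print(group_penalty, sparse_group_penalty)
```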

5.4.      Hierarchical regularization with tree structure, 2010

          R. Jenatton, J. Mairal, G. Obozinski, and F. Bach. “Proximal methods for sparse hierarchical dictionary learning”. ICML 2010.

5.5.      There are also some works that focus on group sparsity for PCA.



6.      Related work on NMF

6.1      Affine NMF: extends NMF with an offset vector, so that the factorization and a common offset (baseline) are estimated simultaneously.
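
If I read this correctly, the affine model reconstructs the data as X ≈ WH + w0·1^T, where w0 is the offset shared by all columns; a toy numpy sketch of that reconstruction (all dimensions invented):

```python
import numpy as np

m, n, k = 5, 8, 2
rng = np.random.RandomState(0)
W  = rng.rand(m, k)        # basis matrix
H  = rng.rand(k, n)        # coefficient matrix
w0 = rng.rand(m, 1)        # offset vector, one entry per feature

# Affine NMF reconstruction: every data item gets the same additive offset.
X_hat = W @ H + w0 @ np.ones((1, n))
print(X_hat.shape)         # (5, 8)
```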


3.       Problem to Solve

Consider a matrix X ∈ R^{m×n}. Assume that the rows of X represent features and the columns of X represent data items.

7.       In standard NMF, we are interested in discovering two low-rank factor matrices W and H by minimizing an objective function:

\min_{W \ge 0,\ H \ge 0}\ \|X - WH\|_F^2

where W ∈ R^{m×k}, H ∈ R^{k×n}, and k ≪ min(m, n), with both factors constrained to be nonnegative.
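
For concreteness, a minimal sketch of minimizing this plain NMF objective with multiplicative updates (this is just standard NMF, not the group-sparse variant the paper develops; dimensions, iteration count, and initialization are arbitrary):

```python
import numpy as np

def nmf(X, k, n_iter=200, eps=1e-9):
    """Minimize ||X - WH||_F^2 over W >= 0, H >= 0 via multiplicative updates."""
    m, n = X.shape
    rng = np.random.RandomState(0)
    W = rng.rand(m, k)
    H = rng.rand(k, n)
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H

X = np.abs(np.random.RandomState(1).randn(30, 20))  # any nonnegative matrix
W, H = nmf(X, k=5)
print(np.linalg.norm(X - W @ H))                    # reconstruction error
```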
