Must-read materials on LDA

An introduction written by a well-known researcher (seems to require a proxy to access from mainland China):

http://tedunderwood.wordpress.com/2012/04/07/topic-modeling-made-just-simple-enough/

David M. Blei's publications page: http://www.cs.princeton.edu/~blei/publications.html, which includes his recent article "Introduction to Probabilistic Topic Models".

The material below was collected from the web; the original author could no longer be traced, so unfortunately the original source cannot be cited.

On parallelizing LDA: if you implement it with MapReduce, what kind of approximation works well? Stanford's ScalaNLP project is worth a look: http://nlp.stanford.edu/javanlp/scala/scaladoc/scalanlp/cluster/DistributedGibbsLDA$object.html. See also the NIPS 2007 paper "Distributed Inference for Latent Dirichlet Allocation" (http://books.nips.cc/papers/files/nips20/NIPS2007_0672) and the ICML 2008 paper "Fully Distributed EM for Very Large Datasets" (http://www.cs.berkeley.edu/~jawolfe/pubs/08-icml-em).
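For orientation, here is a minimal single-process sketch of the approximate distributed scheme (AD-LDA) described in the NIPS 2007 paper: documents are partitioned into shards, each "worker" runs a collapsed Gibbs sweep against its own copy of the global topic-word counts, and the count deltas are merged afterwards. All function and variable names below are illustrative; this is not the code from any of the projects linked above.

```python
# Minimal single-process sketch of approximate distributed LDA (AD-LDA).
# Documents are split into shards; each "worker" does a collapsed Gibbs sweep
# on a private copy of the topic-word counts, and the count deltas are merged.
import numpy as np

def local_gibbs_sweep(shard, docs, z, nwk, ndk, nk, alpha, beta, V):
    """One collapsed-Gibbs sweep over the documents in one shard (in place)."""
    K = nwk.shape[1]
    for d in shard:
        for i, w in enumerate(docs[d]):
            k = z[d][i]
            # remove the current assignment from the count tables
            nwk[w, k] -= 1; ndk[d, k] -= 1; nk[k] -= 1
            # collapsed full conditional p(z_i = k | rest), up to a constant
            p = (nwk[w] + beta) / (nk + V * beta) * (ndk[d] + alpha)
            k = np.random.choice(K, p=p / p.sum())
            # add the new assignment back
            nwk[w, k] += 1; ndk[d, k] += 1; nk[k] += 1
            z[d][i] = k

def adlda(docs, K, V, alpha=0.1, beta=0.01, workers=4, iters=50):
    """docs: list of documents, each a list of word ids in [0, V)."""
    # random initialization of topic assignments and count tables
    z = [[np.random.randint(K) for _ in doc] for doc in docs]
    nwk = np.zeros((V, K)); ndk = np.zeros((len(docs), K)); nk = np.zeros(K)
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            nwk[w, z[d][i]] += 1; ndk[d, z[d][i]] += 1; nk[z[d][i]] += 1

    shards = np.array_split(np.arange(len(docs)), workers)
    for _ in range(iters):
        updates = []
        for shard in shards:                    # "map": each worker sweeps its shard
            local_nwk, local_nk = nwk.copy(), nk.copy()
            local_gibbs_sweep(shard, docs, z, local_nwk, ndk, local_nk,
                              alpha, beta, V)
            updates.append((local_nwk - nwk, local_nk - nk))
        for dwk, dk in updates:                 # "reduce": merge the count deltas
            nwk += dwk; nk += dk
    return nwk, ndk
```

In a real MapReduce job each shard's sweep would run as a separate mapper and the delta merge as the reducer; the approximation is exactly that workers sample against slightly stale global counts within an iteration.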

 

LDA and HLDA:
(1) D. M. Blei, et al., "Latent Dirichlet Allocation," Journal of Machine Learning Research, vol. 3, pp. 993-1022, 2003.
(2) T. L. Griffiths and M. Steyvers, "Finding Scientific Topics," Proceedings of the National Academy of Sciences, vol. 101, pp. 5228-5235, 2004.
(3) D. M. Blei, et al., "Hierarchical Topic Models and the Nested Chinese Restaurant Process," NIPS, 2003.
(4) Blei's LDA video lectures: http://videolectures.net/mlss09uk_blei_tm/
(5) Teh's video lectures on Dirichlet Processes: http://videolectures.net/mlss07_teh_dp/
(6) Blei's PhD thesis: http://www.cs.princeton.edu/~blei/papers/Blei2004.pdf
(7) Jordan's talk: http://www.icms.org.uk/downloads/mixtures/jordan_talk.pdf
(8) G. Heinrich, "Parameter Estimation for Text Analysis," http://www.arbylon.net/publications/text-est.pdf (the collapsed Gibbs update derived here and in (2) is summarized after these lists)

Fundamentals:
(1) P. Johnson and M. Beverlin, "Beta Distribution," http://pj.freefaculty.org/ps707/Distributions/Beta.pdf
(2) M. Beverlin and P. Johnson, "The Dirichlet Family," http://pj.freefaculty.org/stat/Distributions/Dirichlet.pdf
(3) P. Johnson, "Conjugate Prior and Mixture Distributions," http://pj.freefaculty.org/stat/TimeSeries/ConjugateDistributions.pdf
(4) P. J. Green, "Colouring and Breaking Sticks: Random Distributions and Heterogeneous Clustering," http://www.maths.bris.ac.uk/~mapjg/papers/GreenCDP.pdf
(5) Y. W. Teh, "Dirichlet Process," http://www.gatsby.ucl.ac.uk/~ywteh/research/npbayes/dp.pdf
(6) Y. W. Teh and M. I. Jordan, "Hierarchical Bayesian Nonparametric Models with Applications," http://www.stat.berkeley.edu/tech-reports/770.pdf
(7) T. P. Minka, "Estimating a Dirichlet Distribution," http://research.microsoft.com/en-us/um/people/minka/papers/dirichlet/minka-dirichlet.pdf
(8) An LDA primer on the BYR (BUPT) forum: "[Primer] Latent Dirichlet Allocation, an important paper in text processing and image annotation," http://bbs.byr.edu.cn/article/PR_AI/2530?p=1
(9) Zhou Li's LDA note: http://lsa-lda.googlecode.com/files/Latent Dirichlet Allocation note.pdf
(10) C. M. Bishop, "Pattern Recognition and Machine Learning," Springer, 2006.

Code:
(1) Blei's LDA code (C): http://www.cs.princeton.edu/~blei/lda-c/index.html
(2) Blei's HLDA code (C): http://www.cs.princeton.edu/~blei/downloads/hlda-c.tgz
(3) Gibbs LDA (C++): http://gibbslda.sourceforge.net/
(4) Delta LDA (Python): http://pages.cs.wisc.edu/~andrzeje/research/deltaLDA.tgz
(5) Griffiths and Steyvers's Topic Modeling Toolbox: http://psiexp.ss.uci.edu/research/programs_data/toolbox.htm
(6) LDA (Java): http://www.arbylon.net/projects/
(7) Mochihashi's LDA (C, Matlab): http://chasen.org/~daiti-m/dist/lda/
(8) Chua's LDA (C#): http://www.mysmu.edu/phdis2009/freddy.chua.2009/programs/lda.zip
(9) Chua's HLDA (C#): http://www.mysmu.edu/phdis2009/freddy.chua.2009/programs/hlda.zip

Others:
(1) S. Geman and D. Geman, "Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-6, pp. 721-741, 1984.
(2) B. C. Russell, et al., "Using Multiple Segmentations to Discover Objects and their Extent in Image Collections," IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2006, pp. 1605-1614.
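As a quick orientation to the Gibbs-sampling implementations in the code list: the core update derived in references (2) and (8) of the first list (with symmetric priors α and β, vocabulary size V, and K topics) is the following collapsed conditional, together with the point estimates recovered from the final counts.

```latex
% Collapsed Gibbs update for LDA (Griffiths & Steyvers 2004; Heinrich's note).
% n^{-i} denotes counts with the current token i excluded.
p(z_i = k \mid \mathbf{z}_{-i}, \mathbf{w})
  \;\propto\;
  \frac{n^{-i}_{w_i,k} + \beta}{n^{-i}_{k} + V\beta}
  \,\bigl(n^{-i}_{d_i,k} + \alpha\bigr)

% Point estimates of the topic-word and document-topic distributions:
\hat{\phi}_{k,w} = \frac{n_{w,k} + \beta}{n_{k} + V\beta},
\qquad
\hat{\theta}_{d,k} = \frac{n_{d,k} + \alpha}{n_{d} + K\alpha}
```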