Deep Learning in NLP (Part 1): Word Vectors and Language Models
http://licstar.net/archives/328
A curated collection of Deep Learning papers
http://www.douban.com/note/382064119/
【1】 word2vec Project Home
First-hand material. Code: http://word2vec.googlecode.com/svn/trunk/. Papers:
[1] Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient Estimation of Word Representations in Vector Space. In Proceedings of Workshop at ICLR, 2013.
[2] Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean. Distributed Representations of Words and Phrases and their Compositionality. In Proceedings of NIPS, 2013.
[3] Tomas Mikolov, Wen-tau Yih, and Geoffrey Zweig. Linguistic Regularities in Continuous Space Word Representations. In Proceedings of NAACL HLT, 2013.
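The "linguistic regularities" paper [3] observes that analogies can be solved by vector arithmetic: the word closest to vector("king") - vector("man") + vector("woman") tends to be "queen". A minimal sketch of that lookup, using hand-picked toy vectors (real word2vec embeddings are learned and typically 100-300 dimensions; the words and values here are illustrative only):

```python
import math

# Toy 3-d embeddings chosen by hand so the analogy holds;
# real embeddings come from training word2vec on a corpus.
emb = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.1, 0.9],
    "apple": [0.5, 0.5, 0.5],  # distractor word
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def analogy(a, b, c, emb):
    """Return the word whose vector is closest (by cosine) to b - a + c."""
    target = [bi - ai + ci for ai, bi, ci in zip(emb[a], emb[b], emb[c])]
    # Exclude the query words themselves, as in the paper's evaluation.
    candidates = {w: v for w, v in emb.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "king", "woman", emb))  # prints "queen"
```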
【2】 Deep Learning in NLP (一)词向量和语言模型
licstar's classic piece, explaining the main neural-network-based language models.
Deep Learning in Practice: word2vec
An explanatory document on word2vec written by several people at Youdao. It works through the progression from basic word vectors and statistical language models -> NNLM -> Log-Linear/Log-Bilinear -> hierarchical Log-Bilinear, then to the CBOW and Skip-gram models, and finally to word2vec's various tricks, with formula derivations and code side by side. It is essentially a comprehensive compilation of the word2vec material available online; recommended for anyone interested in word2vec. (@王晓伟alex)
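The key difference between the two models in that progression is how they frame the prediction task: CBOW predicts the center word from its surrounding context, while Skip-gram predicts each context word from the center word. A minimal sketch of how the two models generate training pairs from a sentence (window size and helper names are illustrative, not word2vec's actual code):

```python
def cbow_pairs(tokens, window=2):
    """CBOW: each example is (context words, center word)."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        context = [tokens[j] for j in range(lo, hi) if j != i]
        pairs.append((context, center))
    return pairs

def skipgram_pairs(tokens, window=2):
    """Skip-gram: each example is (center word, one context word)."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sent = "the quick brown fox jumps".split()
print(cbow_pairs(sent)[2])       # (['the', 'quick', 'fox', 'jumps'], 'brown')
print(skipgram_pairs(sent)[:2])  # [('the', 'quick'), ('the', 'brown')]
```

Skip-gram produces many more (and simpler) training pairs per sentence, which is one reason it tends to do better on rare words.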
A word2vec dissection for beginners
http://xiaoquanzi.net/?p=1561
A survey of applying word2vec to session data in event mining
http://blog.csdn.net/shuishiman/article/details/20769437