Papers on Word Embeddings

First, a word on what an "embedding" is. Take a map as an example: a map is an embedding of real-world geography. The information in actual terrain goes far beyond three dimensions, yet a map uses colors, contour lines, and similar devices to convey as much of that geographic information as possible. An embedding, likewise, uses a fixed number of dimensions to represent the original information as fully as possible. In Chinese, "embedding" is commonly translated as 向量 (vector) or 表示 (representation).
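As a concrete illustration, here is a minimal sketch of learning fixed-dimensional word embeddings with gensim's word2vec implementation. This snippet is my own illustration rather than code from any of the papers below; the toy corpus and the choice of 16 dimensions are arbitrary, and gensim >= 4.0 is assumed (for the vector_size argument).

```python
# A minimal, illustrative sketch: compress each word into a fixed-dimensional
# vector with word2vec (skip-gram). Assumes gensim >= 4.0 is installed.
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens. Real corpora are far larger.
corpus = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["man", "walks", "in", "the", "city"],
    ["woman", "walks", "in", "the", "city"],
]

# Every word gets the same fixed number of dimensions (here 16), no matter
# how much context it occurs in; that is the "embedding" idea above.
model = Word2Vec(corpus, vector_size=16, window=2, min_count=1, sg=1, epochs=200)

print(model.wv["king"].shape)                # (16,): a fixed-size vector
print(model.wv.similarity("king", "queen"))  # cosine similarity of two embeddings
```

Words that appear in similar contexts (such as "king" and "queen" in this toy corpus) tend to end up with similar vectors, which is exactly the phenomenon several of the papers below try to explain.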

 

1. Hashimoto, Tatsunori B., David Alvarez-Melis, and Tommi S. Jaakkola. "Word embeddings as metric recovery in semantic spaces." Transactions of the Association for Computational Linguistics 4 (2016): 273-286.

2. Arora, Sanjeev, et al. "Random walks on context spaces: Towards an explanation of the mysteries of semantic word embeddings." arXiv preprint arXiv:1502.03520 (2015).

3. Li, Shaohua, et al. "Generative topic embedding: a continuous representation of documents." Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL 2016). 2016.

4. Levy, Omer, and Yoav Goldberg. "Neural word embedding as implicit matrix factorization." Advances in Neural Information Processing Systems. 2014.

5. Levy, Omer, and Yoav Goldberg. "Linguistic Regularities in Sparse and Explicit Word Representations." CoNLL. 2014.

6. Goldberg, Yoav, and Omer Levy. "word2vec Explained: deriving Mikolov et al.'s negative-sampling word-embedding method." arXiv preprint arXiv:1402.3722 (2014).

7. Shi, Tianze, and Zhiyuan Liu. "Linking GloVe with word2vec." arXiv preprint arXiv:1411.5595 (2014).

8. Li, Yitan, et al. "Word embedding revisited: A new representation learning and explicit matrix factorization perspective." Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence (IJCAI). 2015.

Reposted from: https://www.cnblogs.com/huangshiyu13/p/6159624.html
