Notes and experiments on graph embedding papers

Papers and results log

Graph embedding survey
https://cloud.tencent.com/developer/article/1427335

Graph Embedding Techniques, Applications, and Performance: A Survey

The survey groups existing methods into three broad families: matrix factorization, random walks, and deep learning.

Graph Factorization: Distributed Large-scale Natural Graph Factorization



ALS: Large-scale Parallel Collaborative Filtering for the Netflix Prize
https://www.jianshu.com/p/fc250e969dcc
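The core of ALS is simple to sketch: alternately fix the item factors and solve a small ridge regression per user, then fix the user factors and solve per item. The toy NumPy version below (function name, parameters, and the dense-matrix-plus-mask representation are my own choices, not from the paper) illustrates the idea on a fully in-memory matrix:

```python
import numpy as np

def als(R, mask, k=2, n_iters=50, reg=0.05, seed=0):
    """Alternating Least Squares on a partially observed rating matrix.

    R    : (n_users, n_items) ratings, arbitrary values where unobserved.
    mask : same shape, 1 where R is observed, 0 elsewhere.
    Returns factors U, V with R ~= U @ V.T on observed entries.
    """
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    U = rng.normal(scale=0.1, size=(n_users, k))
    V = rng.normal(scale=0.1, size=(n_items, k))
    I = np.eye(k)
    for _ in range(n_iters):
        # Fix V: each user's row is an independent ridge regression
        # over the items that user has rated.
        for u in range(n_users):
            idx = mask[u] > 0
            Vu = V[idx]
            U[u] = np.linalg.solve(Vu.T @ Vu + reg * I, Vu.T @ R[u, idx])
        # Fix U: symmetric step for each item over its observed users.
        for i in range(n_items):
            idx = mask[:, i] > 0
            Ui = U[idx]
            V[i] = np.linalg.solve(Ui.T @ Ui + reg * I, Ui.T @ R[idx, i])
    return U, V
```

Because each row solve is independent, this is what makes the algorithm embarrassingly parallel across users (and then across items), which is the point of the Netflix paper.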

HOPE: Asymmetric Transitivity Preserving Graph Embedding
Two versions: the first runs in O(n^3) time, which was too slow; I implemented the second version myself, and it can complete one embedding iteration in about 8 hours. Combined with replacing the unit-weight adjacency matrix with a PMI matrix, the results were decent.
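The HOPE idea can be sketched with Katz proximity as the asymmetric similarity measure: factor the proximity matrix S with a truncated SVD into separate source and target embeddings. Note the actual paper avoids materializing S (that is where the O(n^3) cost comes from) by using a generalized SVD; this toy version (my own naming and defaults) computes S directly, which is only feasible for small graphs:

```python
import numpy as np

def hope_embed(A, d, beta=0.1):
    """Toy HOPE-style embedding via truncated SVD of Katz proximity.

    Katz: S = sum_{k>=1} beta^k A^k = (I - beta*A)^{-1} (beta*A),
    valid when beta < 1 / spectral_radius(A).
    Returns (src, tgt) with S ~= src @ tgt.T.
    """
    n = A.shape[0]
    S = np.linalg.solve(np.eye(n) - beta * A, beta * A)
    U, s, Vt = np.linalg.svd(S)
    src = U[:, :d] * np.sqrt(s[:d])   # source-role embeddings
    tgt = Vt[:d].T * np.sqrt(s[:d])   # target-role embeddings
    return src, tgt
```

Keeping two embeddings per node (source and target roles) is what lets the factorization preserve asymmetric transitivity in a directed graph.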

SGNS: Neural Word Embedding as Implicit Matrix Factorization

Consider for example a pair of relatively frequent words (high P(w) and P(c)) that occur only once together. There is strong evidence that the words tend not to appear together, resulting in a negative PMI value, and hence a negative matrix entry. On the other hand, a pair of frequent words (same P(w) and P(c)) that is never observed occurring together in the corpus will receive a value of 0. Hence PPMI (positive PMI) is used instead.
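The PPMI construction described above can be written directly from a word-context co-occurrence count matrix. A minimal sketch (the helper name is my own; the clipping of unseen pairs and negative entries to 0 is exactly the PPMI definition):

```python
import numpy as np

def ppmi(C):
    """Positive PMI from a word-context co-occurrence count matrix C.

    PMI(w, c) = log( P(w, c) / (P(w) * P(c)) ); PPMI clips negatives
    (and the -inf of never-co-occurring pairs) to 0.
    """
    total = C.sum()
    pw = C.sum(axis=1, keepdims=True) / total   # P(w), column vector
    pc = C.sum(axis=0, keepdims=True) / total   # P(c), row vector
    pwc = C / total                             # P(w, c)
    with np.errstate(divide="ignore"):
        pmi = np.log(pwc / (pw * pc))
    pmi[~np.isfinite(pmi)] = 0.0                # unseen pairs -> 0
    return np.maximum(pmi, 0.0)                 # clip negatives -> 0
```

With counts [[1, 50], [50, 1]], the (w0, c0) pair is exactly the case in the quote: both words are frequent but co-occur only once, so its PMI is negative and its PPMI entry is 0, while the frequently co-occurring (w0, c1) pair gets a positive entry.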


The paper shows that factorizing the (shifted) PMI matrix with SVD can substitute for training word2vec.
http://www.readventurer.com/?p=755
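The SVD substitution amounts to a truncated SVD of the (P)PMI matrix, with word vectors taken as W = U_d * sqrt(Sigma_d), the symmetric weighting Levy & Goldberg report works well in practice. A minimal sketch (function name is my own):

```python
import numpy as np

def svd_word_vectors(M, d):
    """Factor a (P)PMI matrix M with truncated SVD.

    Returns word vectors W and context vectors C such that
    W @ C.T approximates M, splitting sqrt(Sigma) between the two sides.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    W = U[:, :d] * np.sqrt(s[:d])
    C = Vt[:d].T * np.sqrt(s[:d])
    return W, C
```

Unlike SGNS, this is a deterministic, closed-form factorization: no sampling, no training loop, just one linear-algebra pass over the co-occurrence statistics.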
