Paper notes: DeepWalk: Online Learning of Social Representations

Graph embedding

Random walk + word2vec (Skip-gram)

Truncated random walks generate (sample) node sequences that serve as sequential training samples for node embedding. This sampling avoids the sparsity problem of the raw sequence sample set.
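
A minimal sketch of how such a walk corpus could be sampled; the adjacency-list format, function names, and parameter values here are illustrative assumptions, not the paper's actual implementation.

```python
import random

def random_walk(adj, start, walk_length):
    """Sample one truncated random walk of fixed maximum length from `start`."""
    walk = [start]
    while len(walk) < walk_length:
        neighbors = adj[walk[-1]]
        if not neighbors:          # dead end: stop the walk early
            break
        walk.append(random.choice(neighbors))
    return walk

def build_corpus(adj, num_walks, walk_length):
    """Run `num_walks` passes over all nodes; each walk acts as a 'sentence'."""
    nodes = list(adj)
    corpus = []
    for _ in range(num_walks):
        random.shuffle(nodes)      # DeepWalk shuffles the node order each pass
        for v in nodes:
            corpus.append(random_walk(adj, v, walk_length))
    return corpus

# Toy graph as an adjacency list
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
walks = build_corpus(adj, num_walks=10, walk_length=5)
```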

Word2vec (Skip-gram) then learns the word (node) embeddings; it is good at capturing community (local) structure information.
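
A hedged sketch of the Skip-gram step using gensim's `Word2Vec` (4.x API assumed). DeepWalk trains Skip-gram with hierarchical softmax, which `hs=1` mirrors, but gensim itself and the parameter values below are just an illustration, not the paper's code.

```python
from gensim.models import Word2Vec

# Walks from the sampling step, with node IDs cast to strings
# (gensim treats each walk as a sentence of string "words").
walks = [["0", "1", "2", "3"], ["2", "0", "1", "2"], ["3", "2", "1", "0"]]

model = Word2Vec(
    walks,
    vector_size=64,  # embedding dimension (illustrative value)
    window=5,        # context window over each walk
    sg=1,            # Skip-gram
    hs=1,            # hierarchical softmax, as used in DeepWalk
    min_count=0,
    workers=4,
)

node_vec = model.wv["2"]  # learned embedding for node 2
```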

Alibaba applies a similar idea to commodity embedding for its e-commerce recommendation task.
Alibaba uses traces of users' purchases, views, and other behaviors to construct the initial activity graph.
The method in this paper is used to build the BGE (Base Graph Embedding).
Then, to tackle the cold-start problem, side information about commodities is used to construct content embeddings. The graph embedding enhanced with both users' behavior information and item content information is called GES (Graph Embedding with Side information). Whereas GES average-pools the behavior embedding and the side-information embeddings, EGES (Enhanced GES) takes a weighted sum over these embeddings to respect the different significance of different information sources.
@See “Billion-scale Commodity Embedding for E-commerce Recommendation in Alibaba”.
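
A rough sketch (not the EGES paper's training code) of the difference between GES average pooling and EGES weighted aggregation. The embeddings and weights below are made-up placeholders; in EGES the per-source weights are learned jointly with the embeddings.

```python
import numpy as np

dim = 8  # illustrative embedding dimension
rng = np.random.default_rng(0)

# One item's base (behavior-graph) embedding plus two side-information
# embeddings (e.g. category, brand); values are random placeholders.
base_emb = rng.normal(size=dim)
category_emb = rng.normal(size=dim)
brand_emb = rng.normal(size=dim)
embs = np.stack([base_emb, category_emb, brand_emb])  # (num_sources, dim)

# GES: average-pool all sources equally.
ges = embs.mean(axis=0)

# EGES: per-item weights, softmax-normalized, so each source contributes
# according to its importance (these weight values are invented here).
a = np.array([1.5, 0.3, 0.2])
w = np.exp(a) / np.exp(a).sum()
eges = w @ embs
```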

DeepWalk produces good representations, especially in data-sparse settings.
