Graph embedding
Random walk + word2vec (skip-gram)
Random walks generate (sample) node sequences that serve as sequential training samples for node embedding; this alleviates the sparsity of the raw sequence sample set.
Word2vec (skip-gram) then learns the word (node) embeddings, and is good at capturing community (local) structure information. A sketch of the whole pipeline follows.
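A minimal DeepWalk-style sketch of this pipeline, assuming an undirected `networkx` graph and gensim's `Word2Vec`; `num_walks`, `walk_length`, the embedding dimension, and the karate-club toy graph are illustrative choices, not from the notes above.

```python
import random

import networkx as nx
from gensim.models import Word2Vec

def random_walk(graph, start, walk_length, rng):
    """Sample one uniform random walk starting from `start`."""
    walk = [start]
    for _ in range(walk_length - 1):
        neighbors = list(graph.neighbors(walk[-1]))
        if not neighbors:  # dead end: stop the walk early
            break
        walk.append(rng.choice(neighbors))
    return walk

def deepwalk_embeddings(graph, num_walks=10, walk_length=40,
                        dim=64, window=5, seed=0):
    rng = random.Random(seed)
    # Start `num_walks` walks from every node; each walk acts as a "sentence".
    walks = [
        [str(n) for n in random_walk(graph, node, walk_length, rng)]
        for _ in range(num_walks)
        for node in graph.nodes()
    ]
    # Skip-gram (sg=1) over the walk corpus learns the node embeddings.
    model = Word2Vec(walks, vector_size=dim, window=window,
                     min_count=0, sg=1, workers=1, seed=seed)
    return {node: model.wv[str(node)] for node in graph.nodes()}

# Toy usage: embed a small built-in graph.
g = nx.karate_club_graph()
emb = deepwalk_embeddings(g)
print(len(emb), emb[0].shape)  # 34 nodes, 64-dim vectors
```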
Alibaba builds commodity embeddings for its e-commerce recommendation task with similar ideas.
Alibaba uses users' behavior traces (purchases, views, and others) to construct the initial item graph, e.g. as sketched below.
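The notes don't spell out the graph construction; a plausible minimal version, following the paper's idea of counting consecutive item co-occurrences within behavior sessions, might look like this. The `sessions` data is made up for illustration.

```python
from collections import defaultdict

import networkx as nx

def build_item_graph(sessions):
    """Build a weighted directed item graph from behavior sessions.

    Each session is an ordered list of item ids; each consecutive pair
    adds weight 1 to the directed edge between the two items.
    """
    weights = defaultdict(int)
    for session in sessions:
        for a, b in zip(session, session[1:]):
            if a != b:  # skip self-loops from repeated items
                weights[(a, b)] += 1
    g = nx.DiGraph()
    for (a, b), w in weights.items():
        g.add_edge(a, b, weight=w)
    return g

# Hypothetical behavior sessions (item ids), for illustration only.
sessions = [["D", "A", "B"], ["B", "E"], ["D", "E", "F"], ["D", "A", "B"]]
g = build_item_graph(sessions)
print(g["D"]["A"]["weight"])  # -> 2
```

On such a weighted graph the random walk would pick the next node with probability proportional to edge weight rather than uniformly.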
The random-walk + skip-gram method above is used in the paper to build BGE (Base Graph Embedding).
Then, to tackle the cold-start problem, side information of commodities is used to construct content embeddings. The enhanced graph embedding that combines users' behavior information with item content information is called GES (Graph Embedding with Side information). While GES average-pools the behavior-information embedding and the side-information embeddings, EGES (Enhanced GES) takes a weighted sum over these embeddings to respect the different significance of the different information sources.
@See “Billion-scale Commodity Embedding for E-commerce Recommendation in Alibaba”.
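To make the GES/EGES distinction concrete, a small numpy sketch: GES averages the embeddings, while EGES takes a softmax-weighted sum. In the paper the per-source weights a_j are learned per item; here they are fixed constants, and all embedding values are toy numbers.

```python
import numpy as np

def ges_pool(embeddings):
    """GES: average-pool the item embedding and its side-information embeddings."""
    return np.mean(embeddings, axis=0)

def eges_pool(embeddings, weights):
    """EGES: softmax-normalized weighted sum, so informative sources count more.

    `weights` holds the per-source scalars a_j (learned in the paper, fixed here);
    exponentiating them matches the paper's softmax-style normalization.
    """
    a = np.exp(weights)
    a = a / a.sum()
    return (a[:, None] * embeddings).sum(axis=0)

# Toy example: behavior embedding plus two side-information embeddings (dim 4).
embs = np.array([[1.0, 0.0, 0.0, 0.0],   # behavior (item id) embedding
                 [0.0, 1.0, 0.0, 0.0],   # e.g. category embedding
                 [0.0, 0.0, 1.0, 0.0]])  # e.g. brand embedding
print(ges_pool(embs))                                    # uniform mix
print(eges_pool(embs, weights=np.array([2.0, 0.5, 0.5])))  # behavior dominates
```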
DeepWalk (this random-walk + skip-gram combination) produces good representation results, especially in data-sparse settings.