Overall Architecture
Obtain the node embedding matrix H (from a GNN encoder)
Pooling
- Graph Multiset Pooling (GMPool)
- Self-Attention Block (SelfAtt)
- GMPool with k = 1
The following sections explain each of these components in turn.
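The overall pipeline (embed → GMPool → SelfAtt → GMPool with k = 1) might be sketched roughly as below. This is my own minimal PyTorch sketch: class and parameter names (`Pool`, `GMTPooling`, `d`, `heads`, `k`) are mine, and it uses plain `nn.MultiheadAttention` instead of the paper's GNN-parameterized attention.

```python
import torch
import torch.nn as nn

class Pool(nn.Module):
    """One attention-pooling step: k learnable seed vectors attend
    over the input set, compressing n inputs into k outputs."""
    def __init__(self, d, heads, k):
        super().__init__()
        self.S = nn.Parameter(torch.randn(1, k, d))  # seed matrix (queries)
        self.att = nn.MultiheadAttention(d, heads, batch_first=True)

    def forward(self, X):
        # X: (1, n, d) node embeddings; returns (1, k, d)
        out, _ = self.att(self.S, X, X)
        return out

class GMTPooling(nn.Module):
    """GMPool_k -> SelfAtt -> GMPool_1 over node embeddings H."""
    def __init__(self, d, heads, k):
        super().__init__()
        self.pool_k = Pool(d, heads, k)
        self.self_att = nn.MultiheadAttention(d, heads, batch_first=True)
        self.pool_1 = Pool(d, heads, 1)

    def forward(self, H):
        Z = self.pool_k(H)              # (1, k, d)
        Z, _ = self.self_att(Z, Z, Z)   # inter-node relationships
        return self.pool_1(Z).squeeze(1)  # (1, d) graph representation
```

With `k` seeds the graph is first compressed to `k` vectors, and the final `GMPool` with k = 1 produces a single graph-level representation.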
Graph Multi-head Attention (MH)
standard multi-head attention, except that a GNN is used to generate the Key and Value matrices, which preserves the graph structure
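A sketch of this idea, assuming a simple mean-aggregation GCN layer for the keys and values (the specific GNN and all names here, e.g. `GCNLayer`, `GraphMultiheadAttention`, are my choices, not the paper's exact formulation):

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """Minimal GCN layer: mean-aggregates neighbors, then a linear map."""
    def __init__(self, d_in, d_out):
        super().__init__()
        self.lin = nn.Linear(d_in, d_out)

    def forward(self, H, A):
        # A: (n, n) adjacency with self-loops
        deg = A.sum(-1, keepdim=True).clamp(min=1)
        return self.lin(A @ H / deg)

class GraphMultiheadAttention(nn.Module):
    """Multi-head attention where Key and Value are produced by GNNs,
    so attention scores reflect the graph structure."""
    def __init__(self, d_model, n_heads):
        super().__init__()
        assert d_model % n_heads == 0
        self.h, self.dk = n_heads, d_model // n_heads
        self.q = nn.Linear(d_model, d_model)
        self.k = GCNLayer(d_model, d_model)  # GNN -> Keys
        self.v = GCNLayer(d_model, d_model)  # GNN -> Values
        self.out = nn.Linear(d_model, d_model)

    def forward(self, Q, H, A):
        # Q: (k, d) queries, H: (n, d) node embeddings, A: (n, n)
        k_, n = Q.size(0), H.size(0)
        q = self.q(Q).view(k_, self.h, self.dk).transpose(0, 1)    # (h, k, dk)
        K = self.k(H, A).view(n, self.h, self.dk).transpose(0, 1)  # (h, n, dk)
        V = self.v(H, A).view(n, self.h, self.dk).transpose(0, 1)  # (h, n, dk)
        att = torch.softmax(q @ K.transpose(-1, -2) / self.dk**0.5, dim=-1)
        return self.out((att @ V).transpose(0, 1).reshape(k_, -1))
```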
Graph Multiset Pooling with Graph Multi-head Attention (GMPool)
a parameterized (learnable) seed matrix S serves as the Query
attention models interactions between the k seed vectors (queries) in S and the n nodes (keys) in H, compressing n nodes into k vectors
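A minimal sketch of the seed-as-query mechanism, using plain `nn.MultiheadAttention` for brevity (in the paper the keys/values come from the GNN-based attention above; the class name `GMPool` and its arguments here are my own):

```python
import torch
import torch.nn as nn

class GMPool(nn.Module):
    """Pools n node embeddings into k vectors: a learnable seed matrix S
    is the Query, the nodes are Keys/Values. k = 1 gives one graph vector."""
    def __init__(self, d_model, n_heads, k):
        super().__init__()
        self.S = nn.Parameter(torch.randn(1, k, d_model))
        self.att = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, H):
        # H: (1, n, d_model) batched node embeddings -> (1, k, d_model)
        out, _ = self.att(self.S, H, H)
        return out
```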
Self-Attention for Inter-node Relationship (SelfAtt)
considers relationships between nodes (the ablation study shows this component is very important)
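This block is ordinary self-attention with Q = K = V, applied to the pooled vectors so their pairwise relationships are modeled between pooling steps; a sketch (naming is mine):

```python
import torch
import torch.nn as nn

class SelfAtt(nn.Module):
    """Self-attention over the k pooled vectors: Q = K = V = Z,
    so every pooled vector attends to every other one."""
    def __init__(self, d_model, n_heads):
        super().__init__()
        self.att = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, Z):
        # Z: (1, k, d_model) -> (1, k, d_model)
        out, _ = self.att(Z, Z, Z)
        return out
```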