Reading Notes: Deep Attention Diffusion Graph Neural Networks for Text Classification (EMNLP 2021)

1 Main Contributions

  • Build a Deep Attention Diffusion Graph Neural Network (DADGNN) that achieves broad receptive fields via a diffusion mechanism.

  • Propose to decouple the propagation and transformation processes of GNNs, yielding a GNN layer that can be stacked much deeper.

  • Extensive experimental results demonstrate the effectiveness of the proposed model.

2 Method

Decouple propagation and transformation

After decoupling, the message-passing process of DADGNN can be formulated as follows:

For a clear comparison, the formulation of traditional GNNs is presented below:

This shows that DADGNN transforms the feature dimension at an early stage and removes it from the propagation process.
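The contrast between the two formulations can be sketched in NumPy as below. This is a minimal illustration, not the paper's implementation: the graph, feature sizes, and weight initializations are all made up, and a plain ReLU MLP with one weight matrix stands in for the transformation step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes, 8-dim input features, 3-dim output (illustrative sizes).
A_hat = np.array([[0.5, 0.5, 0.0, 0.0],
                  [0.5, 0.5, 0.0, 0.0],
                  [0.0, 0.0, 0.5, 0.5],
                  [0.0, 0.0, 0.5, 0.5]])     # normalized adjacency matrix
X = rng.normal(size=(4, 8))                  # node features

def relu(x):
    return np.maximum(x, 0.0)

def coupled_gnn(X, A_hat, weights):
    """Traditional GNN: every layer both propagates and transforms,
    so each extra hop costs an extra weight matrix and nonlinearity."""
    H = X
    for W in weights:                        # H <- sigma(A_hat @ H @ W)
        H = relu(A_hat @ H @ W)
    return H

def decoupled_gnn(X, A_hat, W, num_hops):
    """Decoupled (DADGNN-style): transform features once up front, then
    propagation is repeated parameter-free multiplication by A_hat."""
    H = relu(X @ W)                          # feature transformation, done early
    for _ in range(num_hops):                # pure propagation, no new parameters
        H = A_hat @ H
    return H

W = rng.normal(size=(8, 3))
H_out = decoupled_gnn(X, A_hat, W, num_hops=5)
print(H_out.shape)  # (4, 3)
```

Because the propagation loop adds no parameters or nonlinearities, stacking many hops does not grow the model, which is what allows the decoupled layer to reach much larger receptive fields than a conventional stacked GNN.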

Add diffusion mechanism

A^n is the n-th power of the attention matrix; it captures the influence on target node i of all neighboring nodes j with path lengths up to n, based on the powers of the graph adjacency matrix.
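A minimal sketch of such a diffusion step is shown below. The geometric decay weights over path lengths are an assumption chosen for illustration; the actual per-hop weighting in the paper may differ.

```python
import numpy as np

def attention_diffusion(A, H, n_hops, alpha=0.5):
    """Aggregate powers of the attention matrix A so that each node attends
    to neighbors up to n_hops away in a single layer.

    eps_n = alpha * (1 - alpha)**n is an illustrative geometric decay
    over path length n (an assumption, not the paper's exact weighting).
    """
    out = np.zeros_like(H)
    A_power = np.eye(A.shape[0])            # A^0 = I
    for n in range(n_hops + 1):
        eps_n = alpha * (1 - alpha) ** n    # weight shrinks with path length
        out += eps_n * (A_power @ H)        # contribution of n-hop neighbors
        A_power = A_power @ A               # advance to A^(n+1)
    return out

A = np.full((4, 4), 0.25)                   # toy row-stochastic attention matrix
H = np.ones((4, 2))
H_diff = attention_diffusion(A, H, n_hops=3)
print(H_diff[0])  # each entry = 0.9375, i.e. 0.5*(1 + 0.5 + 0.25 + 0.125)
```

With a row-stochastic A and constant features, each hop contributes its decay weight unchanged, which makes the effect of the per-hop weights easy to verify by hand.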

Graph-Level Representation

 

The graph (document) representation is then obtained with an attention-based summation over node representations; this graph representation can then be used for classification.
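One common form of attention-based readout is sketched below: score each node against a query vector, softmax the scores, and take the weighted sum of node embeddings. The query vector `q` and all sizes are illustrative assumptions; in a trained model `q` would be a learned parameter.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())                 # shift for numerical stability
    return e / e.sum()

def attention_readout(H, q):
    """Collapse node embeddings H (num_nodes x dim) into a single
    graph-level vector via attention against query vector q."""
    scores = H @ q                          # one scalar score per node
    weights = softmax(scores)               # attention distribution over nodes
    return weights @ H                      # weighted sum -> graph vector

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 16))                # 5 word nodes, 16-dim embeddings
q = rng.normal(size=16)
g = attention_readout(H, q)
print(g.shape)  # (16,)
```

The resulting fixed-size vector `g` can be fed to an ordinary softmax classifier regardless of how many words (nodes) the document graph contains.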

3 Experiment

Overall Performance

 

The proposed model is compared with (1) sequence-based deep learning models such as CNN and BiLSTM; (2) word embedding-based models such as PV-DM; and (3) graph-based models such as SGC and TextGCN.

The results show that DADGNN consistently achieves the best results on all datasets. The authors attribute this to the diffusion mechanism and the decoupling of propagation and transformation.

GPU consumption

 
