[Deep GNN Research] Paper Collection

Table of Contents

1. Motivation

2. Resources

2021

2020

Before 2020

3. References


1. Motivation

Why study deep GNNs as a topic of their own? GNNs typically work best with only 1-2 layers: as depth grows, their performance degrades sharply. Traditional DNNs suffered from the same depth problem, and Kaiming He's ResNet, built on residual connections, is a famous solution to it.
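
To make the residual idea concrete, below is a minimal PyTorch sketch of a GCN layer with a ResNet-style skip connection, in the spirit of the DeepGCNs line of work listed below. Everything here is illustrative, not any paper's reference code: `ResGCNLayer` is a hypothetical name, and `A_hat` is assumed to be a dense, symmetrically normalized adjacency matrix with self-loops, D^{-1/2}(A + I)D^{-1/2}.

```python
import torch
import torch.nn as nn

class ResGCNLayer(nn.Module):
    """One GCN layer with a ResNet-style skip connection (illustrative sketch)."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, A_hat: torch.Tensor, H: torch.Tensor) -> torch.Tensor:
        # Standard GCN step: aggregate over neighbors, then transform.
        out = torch.relu(self.linear(A_hat @ H))
        # Skip connection: features (and gradients) can bypass the layer,
        # the same mechanism that lets ResNets stack hundreds of layers.
        return H + out
```

Stacking many such layers keeps the input features reachable at every depth, which is why residual-style designs (DeepGCNs, GCNII) recur throughout the list below.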

Although the past couple of years have seen a variety of studies and explanations of the GNN depth problem, over-smoothing being the most common, is the degradation of deep GNNs really just a matter of over-smoothing? The PPNP paper (ICLR 2019, listed below), for example, identifies overfitting as one of the causes of deep-GNN degradation.
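
PPNP's fix is also instructive here: it decouples prediction from propagation, first computing per-node predictions with an MLP and then spreading them with personalized PageRank, whose teleport term keeps each node's own prediction from being smoothed away. Below is a minimal sketch of the iterative (APPNP) approximation; `appnp_propagate` is a hypothetical name, `A_hat` is the same dense normalized adjacency as above, and alpha=0.1, K=10 are commonly used values.

```python
import torch

def appnp_propagate(A_hat: torch.Tensor, H: torch.Tensor,
                    alpha: float = 0.1, K: int = 10) -> torch.Tensor:
    """Personalized-PageRank propagation in the style of APPNP (sketch).

    H holds per-node predictions from a separate MLP ("predict"),
    and only this parameter-free loop adds depth ("then propagate").
    """
    Z = H
    for _ in range(K):
        # Diffuse predictions along edges, but teleport back to each
        # node's own prediction with probability alpha, which prevents
        # collapse to a single stationary distribution (over-smoothing).
        Z = (1 - alpha) * (A_hat @ Z) + alpha * H
    return Z
```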

2. Resources

2021

[arXiv 2021] Two Sides of the Same Coin: Heterophily and Oversmoothing in Graph Convolutional Neural Networks

  • https://arxiv.org/abs/2102.06462v2

[arXiv 2021] Graph Neural Networks Inspired by Classical Iterative Algorithms

  • https://arxiv.org/abs/2103.06064

[ICML 2021] Training Graph Neural Networks with 1000 Layers

  • https://arxiv.org/abs/2106.07476

  • https://github.com/lightaime/deep_gcns_torch

[ICML 2021] Directional Graph Networks

  • https://arxiv.org/abs/2010.02863

  • https://github.com/Saro00/DGN

[ICLR 2021] On the Bottleneck of Graph Neural Networks and its Practical Implications

  • https://openreview.net/forum?id=i80OPhOCVH2

  • https://github.com/tech-srl/bottleneck

[ICLR 2021] Adaptive Universal Generalized PageRank Graph Neural Network

  • https://openreview.net/forum?id=n6jl7fLxrP

  • https://github.com/jianhao2016/GPRGNN

[ICLR 2021] Simple Spectral Graph Convolution

  • https://openreview.net/forum?id=CYO5T-YjWZV

2020

[arXiv 2020] Deep Graph Neural Networks with Shallow Subgraph Samplers

  • https://arxiv.org/abs/2012.01380

[arXiv 2020] Revisiting Graph Convolutional Network on Semi-Supervised Node Classification from an Optimization Perspective

  • https://arxiv.org/abs/2009.11469

[arXiv 2020] Tackling Over-Smoothing for General Graph Convolutional Networks

  • https://arxiv.org/abs/2008.09864

[arXiv 2020] DeeperGCN: All You Need to Train Deeper GCNs

  • https://arxiv.org/abs/2006.07739

  • https://github.com/lightaime/deep_gcns_torch

[arXiv 2020] Effective Training Strategies for Deep Graph Neural Networks

  • https://arxiv.org/abs/2006.07107

  • https://github.com/miafei/NodeNorm

[arXiv 2020] Revisiting Over-smoothing in Deep GCNs

  • https://arxiv.org/abs/2003.13663

[NeurIPS 2020] Graph Random Neural Networks for Semi-Supervised Learning on Graphs

  • https://proceedings.neurips.cc/paper/2020/hash/fb4c835feb0a65cc39739320d7a51c02-Abstract.html

  • https://github.com/THUDM/GRAND

[NeurIPS 2020] Scattering GCN: Overcoming Oversmoothness in Graph Convolutional Networks

  • https://proceedings.neurips.cc/paper/2020/hash/a6b964c0bb675116a15ef1325b01ff45-Abstract.html

  • https://github.com/dms-net/scatteringGCN

[NeurIPS 2020] Optimization and Generalization Analysis of Transduction through Gradient Boosting and Application to Multi-scale Graph Neural Networks

  • https://proceedings.neurips.cc/paper/2020/hash/dab49080d80c724aad5ebf158d63df41-Abstract.html

  • https://github.com/delta2323/GB-GNN

[NeurIPS 2020] Towards Deeper Graph Neural Networks with Differentiable Group Normalization

  • https://arxiv.org/abs/2006.06972

[ICML 2020 Workshop GRL+] A Note on Over-Smoothing for Graph Neural Networks

  • https://arxiv.org/abs/2006.13318

[ICML 2020] Bayesian Graph Neural Networks with Adaptive Connection Sampling

  • https://arxiv.org/abs/2006.04064

[ICML 2020] Continuous Graph Neural Networks

  • https://arxiv.org/abs/1912.00967

[ICML 2020] Simple and Deep Graph Convolutional Networks

  • https://arxiv.org/abs/2007.02133

  • https://github.com/chennnM/GCNII

[KDD 2020] Towards Deeper Graph Neural Networks

  • https://arxiv.org/abs/2007.09296

  • https://github.com/mengliu1998/DeeperGNN

[ICLR 2020] Graph Neural Networks Exponentially Lose Expressive Power for Node Classification

  • https://arxiv.org/abs/1905.10947

  • https://github.com/delta2323/gnn-asymptotics

[ICLR 2020] DropEdge: Towards Deep Graph Convolutional Networks on Node Classification

  • https://openreview.net/forum?id=Hkx1qkrKPr

  • https://github.com/DropEdge/DropEdge

[ICLR 2020] PairNorm: Tackling Oversmoothing in GNNs

  • https://openreview.net/forum?id=rkecl1rtwB

  • https://github.com/LingxiaoShawn/PairNorm

[ICLR 2020] Measuring and Improving the Use of Graph Information in Graph Neural Networks

  • https://openreview.net/forum?id=rkeIIkHKvS

  • https://github.com/yifan-h/CS-GNN

[AAAI 2020] Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View

  • https://arxiv.org/abs/1909.03211

Before 2020

[arXiv 2019] Revisiting Graph Neural Networks: All We Have is Low-Pass Filters

  • https://arxiv.org/abs/1905.09550

[NeurIPS 2019] Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks

  • https://arxiv.org/abs/1906.02174

[ICLR 2019] Predict then Propagate: Graph Neural Networks meet Personalized PageRank

  • https://arxiv.org/abs/1810.05997

  • https://github.com/klicperajo/ppnp

[ICCV 2019] DeepGCNs: Can GCNs Go as Deep as CNNs?

  • https://arxiv.org/abs/1904.03751

  • https://github.com/lightaime/deep_gcns_torch

  • https://github.com/lightaime/deep_gcns

[ICML 2018] Representation Learning on Graphs with Jumping Knowledge Networks

  • https://arxiv.org/abs/1806.03536

[AAAI 2018] Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning

  • https://arxiv.org/abs/1801.07606

3. References

Deep Graph Neural Network Paper Collection
