Paper Notes: A Comprehensive Survey on Graph Neural Networks

A Comprehensive Survey on Graph Neural Networks

  • LINK: https://arxiv.org/abs/1901.00596

  • CLASSIFICATION: GNN, SURVEY

  • YEAR: 2019 (arXiv v1 submitted 3 Jan 2019; last revised 4 Dec 2019, v4)

  • FROM: ArXiv 2019

  • WHAT PROBLEM TO SOLVE: Existing surveys cover only a subset of GNNs and examine a limited number of works, thereby missing the most recent developments in GNNs.

  • SOLUTION:

    This paper makes notable contributions summarized as follows:

    • New taxonomy

      Recurrent graph neural networks, Convolutional graph neural networks, Graph autoencoders, and Spatial-temporal graph neural networks.

    • Comprehensive review

      Provides detailed descriptions of representative models, makes the necessary comparisons, and summarizes the corresponding algorithms.

    • Abundant resources

      Includes state-of-the-art models, benchmark datasets, open-source implementations, and practical applications.

    • Future directions

      Model depth, scalability trade-off, heterogeneity, and dynamicity.

  • CORE POINT:

    • Taxonomy of GNNs

      • RecGNNs: Aim to learn node representations with recurrent neural architectures. They assume a node in a graph constantly exchanges information/messages with its neighbors until a stable equilibrium is reached.

      • ConvGNNs: Generate a node v's representation by aggregating its own features x_v and its neighbors' features x_u, where u ∈ N(v). Unlike RecGNNs, ConvGNNs stack multiple graph convolutional layers to extract high-level node representations; a minimal layer sketch follows.

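        To make the aggregation concrete, below is a minimal PyTorch sketch of one graph convolutional layer following the GCN-style rule H' = ReLU(Â H W). It is an illustration under assumed toy sizes, not the survey's reference code.

        ```python
        import torch
        import torch.nn as nn

        class GraphConvLayer(nn.Module):
            """One ConvGNN layer: aggregate neighbor features through a
            normalized adjacency, then apply a shared linear transform."""

            def __init__(self, in_dim, out_dim):
                super().__init__()
                self.linear = nn.Linear(in_dim, out_dim)

            def forward(self, a_hat, h):
                # a_hat: (n, n) normalized adjacency; h: (n, in_dim) features
                return torch.relu(self.linear(a_hat @ h))

        n, d = 5, 8
        a = torch.bernoulli(torch.full((n, n), 0.5))   # toy random adjacency
        a.fill_diagonal_(1.0)                          # add self-loops
        a_hat = a / a.sum(1, keepdim=True)             # simple row normalization
        h = torch.rand(n, d)
        h = GraphConvLayer(d, 16)(a_hat, h)            # layer 1
        h = GraphConvLayer(16, 16)(a_hat, h)           # stacked layer 2
        ```

        Each stacked layer widens the receptive field by one hop, which is why depth yields higher-level node representations.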

      • GAEs: Unsupervised learning frameworks that encode nodes/graphs into a latent vector space and reconstruct graph data from the encoded information. GAEs are used to learn both network embeddings and graph generative distributions.

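        A hedged sketch of the GAE idea: an encoder (here a single graph-conv-style layer) maps nodes to latent vectors Z, and an inner-product decoder reconstructs the adjacency as σ(ZZᵀ). The plain BCE reconstruction loss and sizes are my assumptions.

        ```python
        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class GAE(nn.Module):
            """Graph autoencoder sketch: encoder + inner-product decoder.
            Real GAEs add normalization, negative sampling, etc."""

            def __init__(self, in_dim, latent_dim):
                super().__init__()
                self.encoder = nn.Linear(in_dim, latent_dim)

            def forward(self, a, x):
                z = torch.relu(self.encoder(a @ x))    # latent node embeddings
                a_rec = torch.sigmoid(z @ z.t())       # reconstructed adjacency
                return z, a_rec

        n, d = 5, 8
        a = torch.bernoulli(torch.full((n, n), 0.5))
        a.fill_diagonal_(1.0)
        x = torch.rand(n, d)
        z, a_rec = GAE(d, 4)(a, x)
        loss = F.binary_cross_entropy(a_rec, a)        # reconstruction objective
        ```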

      • STGNNs: Consider spatial dependency and temporal dependency at the same time. Many current approaches combine graph convolutions, which capture spatial dependency, with RNNs or CNNs, which model temporal dependency.

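        That common recipe can be sketched as a graph convolution applied at every time step, with a GRU carrying node state across steps; the cell below is my illustration, not a specific published model.

        ```python
        import torch
        import torch.nn as nn

        class STGNNCell(nn.Module):
            """Spatial-temporal sketch: a graph convolution captures spatial
            dependency per step, a GRU models temporal dependency."""

            def __init__(self, in_dim, hid_dim):
                super().__init__()
                self.spatial = nn.Linear(in_dim, hid_dim)  # stands in for a ConvGNN layer
                self.temporal = nn.GRUCell(hid_dim, hid_dim)

            def forward(self, a_hat, x_t, h_prev):
                s = torch.relu(self.spatial(a_hat @ x_t))  # spatial aggregation
                return self.temporal(s, h_prev)            # temporal update per node

        n, d, hid, T = 5, 8, 16, 10
        a_hat = torch.eye(n)                               # placeholder adjacency
        xs = torch.rand(T, n, d)                           # node features over T steps
        cell, h = STGNNCell(d, hid), torch.zeros(n, hid)
        for x_t in xs:                                     # unroll over time
            h = cell(a_hat, x_t, h)
        ```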

    • Tasks by Level

      • Node-level: Outputs relate to node regression and node classification tasks, with a multilayer perceptron (MLP) or a softmax layer as the output layer. RecGNNs and ConvGNNs extract high-level node representations via information propagation/graph convolution.
      • Edge-level: With two nodes' hidden representations from a GNN as input, a similarity function or a neural network can be used to predict the label/connection strength of an edge.
      • Graph-level: Outputs relate to the graph classification task. To obtain a compact representation at the graph level, GNNs are often combined with pooling and readout operations; see the sketch of all three heads below.
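      To make the three levels concrete, here is a sketch of typical output heads on top of GNN node embeddings h; the dot-product edge scorer and mean-pooling readout are common choices rather than the paper's prescription.

      ```python
      import torch
      import torch.nn as nn

      n, hid, n_classes = 5, 16, 3
      h = torch.rand(n, hid)                       # node embeddings from a GNN

      # Node level: an MLP / softmax head applied to every node.
      node_logits = nn.Linear(hid, n_classes)(h)   # (n, n_classes)

      # Edge level: score a pair (u, v) with a similarity function,
      # here a plain dot product of the two node embeddings.
      u, v = 0, 3
      edge_score = torch.sigmoid(h[u] @ h[v])      # predicted connection strength

      # Graph level: a readout (mean pooling) compacts all nodes into
      # one vector before graph classification.
      graph_repr = h.mean(dim=0)                   # (hid,)
      graph_logits = nn.Linear(hid, n_classes)(graph_repr)
      ```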
    • Training Framework

      • Semi-supervised learning for node-level classification: Given a single network with some nodes labeled and the rest unlabeled, ConvGNNs can learn a robust model that effectively identifies the class labels of the unlabeled nodes (a minimal loss sketch follows the list).
      • Supervised learning for graph-level classification: Graph-level classification aims to predict the class label(s) for an entire graph. The end-to-end learning for this task can be realized with a combination of graph convolutional layers, graph pooling layers, and/or readout layers.
      • Unsupervised learning for graph embedding: When no class labels are available in graphs, we can learn the graph embedding in a purely unsupervised way in an end-to-end framework.
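      In the semi-supervised setting, the usual trick is to compute the classification loss only on the labeled nodes while graph convolutions still propagate information through the unlabeled ones. A minimal sketch (mask and sizes are assumptions):

      ```python
      import torch
      import torch.nn.functional as F

      n, n_classes = 100, 7
      logits = torch.randn(n, n_classes, requires_grad=True)  # stand-in for ConvGNN output
      labels = torch.randint(n_classes, (n,))
      train_mask = torch.zeros(n, dtype=torch.bool)
      train_mask[:20] = True                  # only 20 of the 100 nodes are labeled

      # Cross-entropy on labeled nodes only; unlabeled nodes still shape the
      # logits through graph convolution, which makes this semi-supervised.
      loss = F.cross_entropy(logits[train_mask], labels[train_mask])
      loss.backward()
      ```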
    • Representative RecGNNs and ConvGNNs

      Time complexity: most methods are O(m) per iteration when the graph adjacency matrix is sparse (m edges) and O(n²) otherwise (n nodes); a few methods reach O(n³) because of other operations such as eigenvalue decomposition.

    • RECURRENT GRAPH NEURAL NETWORKS

      Apply the same set of parameters recurrently over nodes in a graph to extract high-level node representations.
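      A sketch of this recurrence in the spirit of the early GNN* model: one shared transition function is applied step after step until node states stop changing. The tolerance, iteration cap, and function form are assumptions (the original work constrains the transition to be a contraction so the fixed point exists).

      ```python
      import torch
      import torch.nn as nn

      n, d, hid = 5, 8, 16
      a = torch.eye(n)                        # placeholder adjacency
      x = torch.rand(n, d)
      f = nn.Linear(d + hid, hid)             # ONE parameter set, reused every step

      h = torch.zeros(n, hid)
      for step in range(50):                  # iterate toward a stable equilibrium
          h_new = torch.tanh(f(torch.cat([x, a @ h], dim=1)))
          converged = (h_new - h).abs().max() < 1e-4
          h = h_new
          if converged:                       # stop once states reach equilibrium
              break
      ```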
