[Paper] A Comprehensive Survey on Graph Neural Networks

Question: The data in these tasks are typically represented in Euclidean space. However, there is an increasing number of applications where data are generated from non-Euclidean domains and are represented as graphs with complex relationships and interdependencies between objects. The complexity of graph data has imposed significant challenges on existing machine learning algorithms.

Solution: Recently, many studies extending deep learning approaches to graph data have emerged.

Contribution:

  1. We provide a comprehensive overview of graph neural networks (GNNs) in data mining and machine learning fields.
  2. We propose a new taxonomy to divide the state-of-the-art graph neural networks into different categories.
  3. With a focus on graph convolutional networks, we review alternative architectures that have recently been developed; these learning paradigms include graph attention networks, graph autoencoders, graph generative networks, and graph spatial-temporal networks.
  4. We further discuss the applications of graph neural networks across various domains and summarize the open source codes and benchmarks of the existing algorithms on different learning tasks.
  5. Finally, we propose potential research directions in this fast-growing field.

Problem: Graph data capture complex real-world relationships and interdependencies between objects. Traditional machine learning methods are ill-suited to such data.

Solution: In recent years, there has been a growing body of work extending deep learning methods to graph data.

Contributions:

  1. A comprehensive survey of graph neural networks (GNNs)
  2. A taxonomy of GNNs
  3. A review of GNNs and alternative graph network architectures (GATs, graph autoencoders, graph generative networks, graph spatial-temporal networks)
  4. A discussion of GNN applications across different domains
  5. Promising future research directions

1 Introduction

Although deep learning has achieved great success on Euclidean data, more and more applications require effective analysis of data generated from non-Euclidean domains. Every graph has a variable-size set of unordered nodes, and each node has a different number of neighbors. As a result, important operations such as convolution, which are easy to compute in the image domain, no longer apply directly to the graph domain.

Graph neural network architectures are designed for graph data by drawing on traditional convolutional networks, recurrent networks, and autoencoders.
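How convolution generalizes to graphs can be sketched as neighborhood aggregation. A minimal numpy illustration of one layer in the symmetrically normalized form popularized by Kipf & Welling's GCN (the function name and toy graph are hypothetical, for illustration only):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: aggregate each node's neighborhood
    (including itself) with symmetric normalization, then project and
    apply a ReLU nonlinearity."""
    N = A.shape[0]
    A_hat = A + np.eye(N)                 # add self-loops
    d = A_hat.sum(axis=1)                 # degrees of the self-looped graph
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ H @ W, 0)  # ReLU

# Toy graph: 3 nodes on a path, 2 input features, 4 hidden units
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.random.randn(3, 2)   # node features
W = np.random.randn(2, 4)   # learnable weights
print(gcn_layer(A, H, W).shape)  # (3, 4)
```

Unlike an image convolution, the "receptive field" here is each node's (variable-size) neighbor set, which is exactly why the operation must be defined through the adjacency matrix rather than a fixed grid kernel.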

History of GNNs

| Year | Venue | First author | Paper | Note |
| --- | --- | --- | --- | --- |
| 2005 | IJCNN | Gori | A new model for learning in graph domains | First outline |
| 2009 | ITNN | Scarselli | The graph neural network model | Further elaborated |
| 2015 | ICLR | Li | Gated graph sequence neural networks | Computation efficiency |
| 2018 | ICML | Dai | Learning steady-states of iterative algorithms over graphs | Computation efficiency |
| 2014 | ICLR | Bruna | Spectral networks and locally connected networks on graphs | First prominent research on spectral-based GNNs |
| 2016 | NIPS | Defferrard | Convolutional neural networks on graphs with fast localized spectral filtering | Improvement on Bruna (ICLR 2014) |
| 2017 | ICLR | Kipf | Semi-supervised classification with graph convolutional networks | Improvement on Bruna (ICLR 2014) |
| 2015 | arXiv | Henaff | Deep convolutional networks on graph-structured data | Improvement on Bruna (ICLR 2014) |
| 2018 | AAAI | R. Li | Adaptive graph convolutional neural networks | Improvement on Bruna (ICLR 2014) |
| 2017 | arXiv | Levie | CayleyNets: Graph convolutional neural networks with complex rational spectral filters | Improvement on Bruna (ICLR 2014) |
| 2017 | NIPS | Hamilton | Inductive representation learning on large graphs | Spatial-based GNNs |
| 2017 | CVPR | Monti | Geometric deep learning on graphs and manifolds using mixture model CNNs | Spatial-based GNNs |
| 2016 | ICML | Niepert | Learning convolutional neural networks for graphs | Spatial-based GNNs |
| 2018 | SIGKDD | Gao | Large-scale learnable graph convolutional networks | Spatial-based GNNs |

Related surveys on graph neural networks

| Year | Venue | First author | Paper | Note |
| --- | --- | --- | --- | --- |
| 2017 | SPM | Bronstein | Geometric deep learning: going beyond euclidean data | First survey on GCNs, but misses some spatial-based GCNs and methods beyond GCNs |
| 2018 | arXiv | Battaglia | Relational inductive biases, deep learning, and graph networks | Describes GNNs uniformly in terms of building blocks; drawback: somewhat abstract |
| 2018 | arXiv | Lee | Attention models in graphs: A survey | A survey of graph attention networks |
| 2018 | arXiv | Zhang | Deep learning on graphs: A survey | Misses graph generative networks and spatial-temporal networks |

Graph Neural Network vs. Network Embedding

Network embedding:

  1. aims to represent network vertices into a low-dimensional vector space,
  2. by preserving both network topology structure and node content information,
  3. so that subsequent graph analytics tasks such as classification, clustering, and recommendation can be performed with simple off-the-shelf machine learning algorithms.
  4. Most network embedding algorithms are unsupervised, and they can be broadly classified into three groups [32]: matrix factorization [38], [39], random walks [40], and deep learning approaches.
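The random-walk family can be sketched briefly. In a DeepWalk-style pipeline, truncated random walks are sampled from each node and then treated as "sentences" for a skip-gram model (e.g. word2vec) to learn node embeddings. A minimal sketch of the walk-sampling step only, with hypothetical names and a toy adjacency list:

```python
import random

def random_walks(adj, walk_len=5, walks_per_node=2, seed=0):
    """Sample truncated random walks from an adjacency list.
    Each walk is a node sequence later fed to a skip-gram model,
    as in DeepWalk-style network embedding."""
    rng = random.Random(seed)
    walks = []
    for _ in range(walks_per_node):
        for start in adj:
            walk = [start]
            while len(walk) < walk_len:
                nbrs = adj[walk[-1]]
                if not nbrs:          # dead end: stop this walk early
                    break
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

# Toy path graph 0 - 1 - 2 - 3
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
for w in random_walks(adj)[:3]:
    print(w)
```

The key contrast with GNNs: here the embedding is the end product of an unsupervised pipeline, whereas a GNN learns representations end-to-end for a downstream task.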

Contributions

  1. New taxonomy (5 categories: GCNs, GATs, GAEs, GGNs, GSTNs)
  2. Comprehensive review(description, comparison, summarization)
  3. Abundant resources(algorithms, datasets, codes, applications)
  4. Future directions

Section 2 defines a list of graph-related concepts.
Section 3 clarifies the categorization of graph neural networks.
Section 4 and Section 5 provide an overview of graph neural network models.
Section 6 presents a gallery of applications across various domains.
Section 7 discusses the current challenges and suggests future directions.
Section 8 summarizes the paper.

2 Definition

Graph: $G = (A, X)$, with adjacency matrix $A \in \mathbb{R}^{N \times N}$ and node feature matrix $X \in \mathbb{R}^{N \times D}$
Directed Graph: $A_{ij} \neq A_{ji}$ (the adjacency matrix is asymmetric)
Spatial-Temporal Graph: $G = (A, X)$, $A \in \mathbb{R}^{N \times N}$, $X \in \mathbb{R}^{T \times N \times D}$
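These definitions map directly onto array shapes. A small numpy illustration (the variable names and toy edges are hypothetical; $N$ nodes, $D$ features per node, $T$ time steps):

```python
import numpy as np

N, D, T = 4, 3, 6            # nodes, feature dims, time steps

A = np.zeros((N, N))         # adjacency matrix: A[i, j] = 1 if edge i -> j
A[0, 1] = A[1, 0] = 1        # undirected edge contributes symmetrically
A[2, 3] = 1                  # directed edge: A[2, 3] != A[3, 2]

X = np.random.randn(N, D)        # static node feature matrix, N x D
X_st = np.random.randn(T, N, D)  # spatial-temporal: node features per time step

# A graph is directed exactly when A is not symmetric
is_directed = not np.allclose(A, A.T)
print(is_directed)  # True, because of the 2 -> 3 edge
```

For a spatial-temporal graph, the adjacency matrix is shared across time steps while the feature tensor carries the temporal dimension, which is why $X$ grows from $N \times D$ to $T \times N \times D$.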

3 Categorization
