Learning Convolutional Neural Networks for Graphs

The paper proposes a framework for learning graph representations that are especially beneficial in conjunction with CNNs. It combines two complementary procedures:
(a) selecting a sequence of nodes that covers large parts of the graph
(b) generating local normalized neighborhood representations for each of the nodes in the sequence

Graph kernels
Kernels on graphs were originally defined as similarity functions on the nodes of a single graph.

Two representative classes of kernels are the skew spectrum kernel and kernels based on graphlets.
The latter builds kernels based on fixed-size subgraphs. These subgraphs, often called motifs or graphlets, reflect functional network properties.
Due to the combinatorial complexity of subgraph enumeration, graphlet kernels are restricted to subgraphs with few nodes.
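
To make that combinatorial cost concrete, here is a minimal sketch (not from the paper) that enumerates all connected 3-node graphlets of a graph stored as an adjacency dict; the loop over all O(n^3) node triples illustrates why graphlet kernels stay restricted to small subgraphs. A graphlet kernel would compare two graphs via these count vectors.

```python
from itertools import combinations

def graphlet3_counts(adj):
    """Count connected 3-node graphlets: open paths vs. triangles.

    adj: dict mapping each node to the set of its neighbors.
    """
    paths, triangles = 0, 0
    for u, v, w in combinations(adj, 3):
        edges = sum((v in adj[u], w in adj[u], w in adj[v]))
        if edges == 3:
            triangles += 1
        elif edges == 2:
            paths += 1  # connected, but one edge missing
    return paths, triangles

# Example: a triangle plus a pendant node
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(graphlet3_counts(adj))  # (2, 1): paths 0-2-3 and 1-2-3, one triangle
```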

Weisfeiler-Lehman (WL) kernels: only support discrete features and use memory linear in the number of training examples at test time.
Deep graph kernels and graph invariant kernels: compare graphs based on the existence or count of small substructures such as shortest paths, graphlets, subtrees, and other graph invariants.
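
For reference, a minimal sketch of the 1-WL color-refinement step that underlies WL kernels; the function name and the dict-based graph representation are illustrative choices, not the paper's code. A WL kernel compares graphs via histograms of the resulting labels.

```python
def wl_refine(adj, labels, iterations=3):
    """1-dimensional Weisfeiler-Lehman (1-WL) color refinement.

    adj: dict node -> set of neighbors; labels: dict node -> discrete label.
    Each round replaces a node's label with a compressed encoding of its
    own label plus the sorted multiset of its neighbors' labels.
    """
    for _ in range(iterations):
        signatures = {
            v: (labels[v], tuple(sorted(labels[u] for u in adj[v])))
            for v in adj
        }
        # Compress each distinct signature to a small integer label.
        compress = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        labels = {v: compress[signatures[v]] for v in adj}
    return labels

# Example: two triangles joined by an edge, uniform initial labels.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
print(wl_refine(adj, {v: 0 for v in adj}))  # bridge nodes 2, 3 get their own color
```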

Graph neural networks (GNNs)
A recurrent neural network architecture defined on graphs.
GNNs support only discrete labels and perform as many backpropagation operations as there are edges and nodes in the graph per learning iteration.
Gated Graph Sequence Neural Networks modify GNNs to use gated recurrent units and to output sequences.
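
For intuition only, here is a single synchronous message-passing update in NumPy. This is a deliberate simplification of the recurrent GNN update (no gating as in GGS-NNs, no per-edge-type weights), and every name in it is hypothetical.

```python
import numpy as np

def message_passing_step(A, H, W_msg, W_self):
    """One synchronous message-passing update over the whole graph.

    A: (n, n) adjacency matrix; H: (n, d) node states;
    W_msg, W_self: (d, d) weight matrices (hypothetical names).
    """
    messages = A @ H @ W_msg            # aggregate transformed neighbor states
    return np.tanh(messages + H @ W_self)  # combine with each node's own state
```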

A brief introduction to graph theory
A graph G is a pair (V, E) with V = {v1, …, vn} the set of vertices and E ⊆ V × V the set of edges.
d(u, v): the distance between u and v, that is, the length of the shortest path between u and v.
N1(v) is the 1-neighborhood of a node v, that is, all nodes that are adjacent to v.
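
These two definitions translate directly into BFS-based code. A minimal sketch, assuming an adjacency-dict representation (the representation is an assumption, not from the paper):

```python
from collections import deque

def distance(adj, u, v):
    """d(u, v): length of the shortest path between u and v, via BFS."""
    if u == v:
        return 0
    dist, queue = {u: 0}, deque([u])
    while queue:
        x = queue.popleft()
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                if y == v:
                    return dist[y]
                queue.append(y)
    return float("inf")  # v is unreachable from u

def neighborhood1(adj, v):
    """N1(v): all nodes adjacent to v."""
    return set(adj[v])
```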

PATCHY-SAN

PATCHY-SAN learns substructures from graph data and is not limited to a predefined set of motifs. Moreover, while all graph kernels have a training complexity at least quadratic in the number of graphs, which is prohibitive for large-scale problems, PATCHY-SAN scales linearly with the number of graphs.

Given a collection of graphs, PATCHY-SAN (SELECT-ASSEMBLE-NORMALIZE) applies the following steps to each graph (a sketch tying the steps together follows the list):
(1) Select a fixed-length sequence of nodes from the graph;
(2) assemble a fixed-size neighborhood for each node in the selected sequence;
(3) normalize the extracted neighborhood graph;
(4) learn neighborhood representations with convolutional neural networks from the resulting sequence of patches.
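
A hedged, high-level sketch of how the four steps might compose. The helper names (select_node_sequence, assemble_neighborhood, normalize_graph) are hypothetical and are sketched in the step sections below; node and edge attribute channels are omitted for brevity.

```python
import numpy as np

def patchy_san(graph, w, k, labeling):
    """graph: adjacency dict; w: number of receptive fields;
    k: receptive-field size; labeling: function mapping a node to its
    rank (e.g. from a centrality measure).
    Returns a (w, k) array that a 1-D CNN can consume.
    """
    fields = []
    for v in select_node_sequence(graph, w, labeling):        # step (1)
        if v is None:                                         # padding slot
            fields.append(np.zeros(k))
            continue
        N = assemble_neighborhood(graph, v, k)                # step (2)
        fields.append(normalize_graph(graph, v, N, k, labeling))  # step (3)
    return np.stack(fields)                                   # step (4): CNN input
```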

1) Node Sequence Selection
Nodes are first sorted by a graph labeling procedure (e.g., a centrality measure), and the sequence is obtained by traversing the sorted list with a given stride. If the number of nodes is smaller than w (the desired sequence length), the algorithm creates all-zero receptive fields for padding purposes.
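A minimal sketch of this selection step under the assumptions just described; the function name and the None-padding convention are illustrative choices.

```python
def select_node_sequence(graph, w, labeling, stride=1):
    """Step 1 sketch: sort nodes by the labeling, walk the sorted list
    with the given stride, and return exactly w entries; missing slots
    are None, which the caller turns into all-zero receptive fields.
    """
    ranked = sorted(graph, key=labeling)   # best-ranked nodes first
    chosen = ranked[::stride][:w]
    return chosen + [None] * (w - len(chosen))
```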

2) Neighborhood Assembly
Starting from each node in the sequence, a breadth-first search collects candidate nodes of increasing distance until at least k nodes are found or no unvisited neighbors remain; the size of the resulting candidate set N is therefore possibly different from k.
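
A sketch of this assembly step, using the same adjacency-dict representation as above (the function name is hypothetical):

```python
from collections import deque

def assemble_neighborhood(graph, v, k):
    """Step 2 sketch: BFS from v, collecting nodes of increasing
    distance until at least k candidates are found or no unvisited
    neighbors remain. The candidate set N may end up larger or smaller
    than k; step 3 crops or pads it to exactly k.
    """
    N, seen, queue = [v], {v}, deque([v])
    while queue and len(N) < k:
        x = queue.popleft()
        for y in graph[x]:
            if y not in seen:
                seen.add(y)
                N.append(y)
                queue.append(y)
    return N
```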

3) Graph Normalization

The receptive field for a node is constructed by normalizing the neighborhood assembled in the previous step.

The basic idea is to leverage graph labeling procedures that assign nodes of two different graphs to a similar relative position in the respective adjacency matrices if and only if their structural roles within the graphs are similar.

[Figures omitted: illustration of the normalization procedure]
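
A simplified sketch of the normalization, assuming the ordering is by distance to the root with ties broken by the labeling, then cropping or padding to exactly k positions. The paper additionally refines this order with a canonical labeling (e.g. NAUTY), which this sketch omits; each node's labeling value stands in for a single attribute channel.

```python
import numpy as np
from collections import deque

def bfs_distances(graph, root):
    """Distances from root to every reachable node (plain BFS)."""
    dist, queue = {root: 0}, deque([root])
    while queue:
        x = queue.popleft()
        for y in graph[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                queue.append(y)
    return dist

def normalize_graph(graph, v, N, k, labeling):
    """Step 3 sketch: order the assembled neighborhood N by distance to
    the root v, breaking ties with the graph labeling, then crop or pad
    to exactly k positions. Padded positions are encoded as 0.
    """
    dist = bfs_distances(graph, v)
    ordered = sorted(N, key=lambda u: (dist[u], labeling(u)))[:k]
    row = [labeling(u) for u in ordered]
    return np.array(row + [0] * (k - len(row)))
```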


Other concepts mentioned in the paper
the restricted Boltzmann machine (RBM)
1-dimensional Weisfeiler-Lehman (1-WL) refinement
distance measures on graphs
distance measures on k × k matrices
