Paper information
- Title: 《Spectral Networks and Locally Connected Networks on Graphs》
- Authors: Joan Bruna, Wojciech Zaremba, Arthur Szlam, Yann LeCun
- Source: ICLR 2014
Abstract
Convolutional Neural Networks are extremely efficient architectures due to their local translational invariance. In order to generalize them, we propose two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
CNNs operate on Euclidean (grid-structured) data. Convolutional Neural Networks (CNNs) have been extremely successful in machine learning problems where the coordinates of the underlying data representation have a grid structure.
1 Introduction
CNN is able to exploit several structures that play nicely together to greatly reduce the number of parameters in the system:
- The translation structure (analogous to a $1 \times n$ filter), allowing the use of filters instead of generic linear maps and hence weight sharing.
- The metric on the grid, allowing compactly supported filters, whose support is typically much smaller than the size of the input signals.
- The multiscale dyadic clustering of the grid, allowing subsampling, implemented through strided convolutions and pooling.
Given $n$ grid data points of dimension $d$, a fully connected layer with $m$ output nodes needs $n \times m$ parameters, i.e. $O(n^2)$ complexity. Using filters instead of generic linear maps (property 1) reduces this to $O(n)$ parameters (note: a single one-dimensional filter is reused across positions). The metric structure on the grid (property 2) can likewise be used to build locally connected networks. Combining the two reduces the parameter count to $O(k \cdot S)$ (note: small filters, with the number of filters chosen as needed), where $k$ is the number of feature maps and $S$ is the filter support size; the complexity is then independent of $n$. Finally, the multiscale dyadic clustering of the grid (property 3) reduces the complexity further.
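As a rough illustration of these parameter counts (all sizes below are made-up for the example, not from the paper), the four regimes can be compared directly:

```python
n, m = 1024, 1024   # input size and output size (illustrative)
k, S = 16, 5        # number of feature maps and filter support (illustrative)

# Fully connected layer: every output depends on every input.
fc_params = n * m                      # O(n^2)

# One shared 1-D filter of support S: weight sharing removes the factor n.
conv_params = S                        # O(1) per filter

# Locally connected layer (metric structure alone): one unshared filter
# of support S per output location.
local_params = n * S                   # O(n)

# Convolutional layer with k feature maps (translation + metric structure).
cnn_params = k * S                     # O(k*S), independent of n

print(fc_params, conv_params, local_params, cnn_params)
```

Note how only the fully connected count grows quadratically with the grid size, while the last count does not depend on $n$ at all.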
Graphs offer a natural framework to generalize the low-dimensional grid structure, and by extension the notion of convolution.
We propose two different constructions. In the first one, we show that one can extend properties (2) and (3) to general graphs, and use them to define "locally" connected and pooling layers, which require $O(n)$ parameters instead of $O(n^2)$. We term this the spatial construction. The other construction, which we call the spectral construction, draws on the properties of convolutions in the Fourier domain.
This paper discusses how to apply deep convolutional networks to graph data, and offers two architectures. The first is the spatial construction, which applies properties 2 and 3 above to graph data in order to build locally connected networks with $O(n)$ parameters instead of $O(n^2)$. The other, called the spectral construction, applies convolution in the Fourier (spectral) domain. The spectral construction requires at most $O(n)$ parameters per feature map, and it can also yield architectures whose parameter count is independent of $n$.
Contributions:
- We show that from a weak geometric structure in the input domain it is possible to obtain efficient architectures using $O(n)$ parameters, that we validate on low-dimensional graph datasets.
- We introduce a construction using $O(1)$ parameters which we empirically verify, and we discuss its connections with a harmonic analysis problem on graphs.
2 Spatial Construction
The graph data is represented by a weighted graph $G=(\Omega, W)$, where $\Omega$ is a discrete set of nodes of size $m$, and $W$ is a symmetric, nonnegative $m \times m$ matrix, i.e. the weighted adjacency matrix. The most direct way to generalize CNNs to graph data is to apply multiscale, hierarchical local receptive fields.
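The paper leaves $W$ generic; one common way to obtain such a symmetric, nonnegative weight matrix from feature vectors (an assumption for illustration, not prescribed by the paper) is a Gaussian kernel on pairwise distances, $W_{ij} = \exp(-\|x_i - x_j\|^2 / \sigma^2)$:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))      # 5 nodes with 3-dimensional features (made-up)

sigma = 1.0
# Squared pairwise distances via broadcasting: d2[i, j] = ||x_i - x_j||^2.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / sigma**2)       # symmetric, entries in (0, 1]
np.fill_diagonal(W, 0.0)         # no self-loops

print(np.round(W, 2))
```

By construction $W$ is symmetric with nonnegative entries, so it qualifies as the weighted adjacency matrix of the construction above.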
2.1 Locality via W
The notion of locality can be generalized easily in the context of a graph.
Neighborhoods are defined from the weights of the graph:

$$N_{\delta}(j)=\left\{i \in \Omega: W_{i j}>\delta\right\}.$$

Restricting filters to these neighborhoods limits the number of parameters to $O(S \cdot n)$, where $S$ is the average neighborhood size.
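A minimal sketch of this neighborhood rule, using a small made-up weight matrix (the values are illustrative, not from the paper):

```python
import numpy as np

# Illustrative symmetric weight matrix W for a 4-node graph.
W = np.array([
    [0.0, 0.9, 0.1, 0.0],
    [0.9, 0.0, 0.8, 0.0],
    [0.1, 0.8, 0.0, 0.7],
    [0.0, 0.0, 0.7, 0.0],
])

def neighborhood(W, j, delta):
    """N_delta(j) = { i in Omega : W[i, j] > delta }."""
    return set(np.flatnonzero(W[:, j] > delta))

print(neighborhood(W, 1, 0.5))   # nodes strongly connected to node 1
```

A filter centered at node $j$ would then only have weights on the indices in `neighborhood(W, j, delta)`, which is how the $O(S \cdot n)$ parameter count arises.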
2.2 Multiresolution Analysis on Graphs
CNNs reduce the size of the grid via pooling and subsampling.