Computational Complexity of Neural Networks: Explaining the Computational Complexity of Graph Neural Networks

This post looks at the computational complexity of neural networks, in particular graph neural networks, and explains their inherent computational cost and efficiency trade-offs.
Computational Complexity of Neural Networks

Unlike conventional convolutional neural networks, graph convolutions have an "unstable" cost: their complexity depends on the chosen graph representation and on the number of edges. This post explains why.

Dense vs Sparse Graph Representation: expensive vs cheap?

Graph data for a GNN input can be represented in two ways:

A) sparse: As a list of nodes and a list of edge indices

B) dense: As a list of nodes and an adjacency matrix

For any graph G with N vertices, each carrying a feature vector of length F, and M edges, the sparse version operates on a node feature matrix of size N*F and a list of edge indices of size 2*M. The dense representation, in contrast, requires an adjacency matrix of size N*N.

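As a quick illustration, here is a minimal NumPy sketch (not from the original post; names such as `edge_index` follow common GNN conventions, e.g. PyTorch Geometric, and are only illustrative) that builds both representations of the same toy graph:

```python
import numpy as np

# Toy graph: N = 4 nodes with F = 3 features each, M = 3 directed edges.
N, F, M = 4, 3, 3
x = np.random.rand(N, F)          # node feature matrix, size N*F

# Sparse representation: node features + an edge index list of size 2*M.
edge_index = np.array([
    [0, 1, 2],                    # source node of each edge
    [1, 2, 3],                    # target node of each edge
])

# Dense representation: node features + an adjacency matrix of size N*N.
adj = np.zeros((N, N))
adj[edge_index[0], edge_index[1]] = 1.0

print(x.shape, edge_index.shape, adj.shape)   # (4, 3) (2, 3) (4, 4)
```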

While the sparse representation is generally much cheaper to use inside a graph neural network for the forward and backward pass, edge modification operations require searching for the right edge-node pairs and possibly resizing the whole edge list, which leads to fluctuations in the network's RAM usage. In other words, sparse representations minimize memory usage on graphs with a fixed set of edges.

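To make the cost difference concrete, the following sketch (an illustrative assumption, not code from the post) implements one neighbor-sum aggregation step in both formats, plus a sparse edge insertion that has to grow the index list:

```python
import numpy as np

def sparse_aggregate(x, edge_index):
    """Neighbor-sum over existing edges only: roughly O(M * F) work."""
    out = np.zeros_like(x)
    src, dst = edge_index
    np.add.at(out, dst, x[src])   # scatter-add source features onto targets
    return out

def dense_aggregate(x, adj):
    """The same neighbor-sum as a matrix product: roughly O(N^2 * F) work."""
    return adj @ x

def add_edge(edge_index, u, v):
    """Inserting an edge in the sparse format: the index list itself grows,
    so the array must be reallocated (the source of the varying RAM usage)."""
    return np.concatenate([edge_index, [[u], [v]]], axis=1)
```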

While more expensive to use, the dense representation has the following advantages: edge weights are naturally included in the adjacency matrix; edge modification can be done smoothly and integrated into the network; and finding edges or changing edge values does not change the size of the matrix. These properties are crucial for graph neural networks that rely on in-network edge modification.

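A small sketch of why the dense format suits in-network edge modification (again illustrative, under the same toy-graph assumptions as above):

```python
import numpy as np

N = 4
adj = np.zeros((N, N))

# Edge weights live directly in the adjacency matrix ...
adj[0, 1] = adj[1, 0] = 0.5

# ... and re-weighting or "removing" an edge is an in-place write that never
# changes the matrix shape, so such updates can sit inside a differentiable
# layer (e.g. an adjacency produced by a learned scoring function).
adj[0, 1] = adj[1, 0] = 0.9       # adjust an edge weight
adj[2, 3] = adj[3, 2] = 0.0       # drop an edge without resizing anything
```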
