Graph Neural Networks
waiting&fighting
I favor efficiency and once-and-for-all solutions.
GCN with mini-batch training

```python
import torch
import torch.nn as nn
import torch.nn.init as init
import scipy.sparse as sp
import numpy as np
import torch.nn.functional as F
import torch.optim as optim
from random import shuffle

class GraphConvolutionLayerWithAttention(nn.Module):
    ''...
```

(Original post, 2020-07-21 12:01:08)
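The preview truncates right at the class definition, so the attention mechanism itself is not visible. As an illustrative sketch only (not the post's actual layer), a dense NumPy version of attention-weighted neighbor aggregation, where edge scores come from the similarity of transformed endpoint features, could look like:

```python
import numpy as np

def masked_softmax(logits, mask):
    # Softmax over each row, restricted to entries where mask > 0.
    logits = np.where(mask > 0, logits, -1e9)
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def attention_conv(x, adj, w):
    # Attention-weighted graph convolution: score each edge by the
    # similarity of transformed endpoint features, normalize the scores
    # per node, then aggregate neighbors with the resulting weights.
    h = x @ w                          # (N, out_dim)
    alpha = masked_softmax(h @ h.T, adj)
    return alpha @ h
```

Function and argument names here are hypothetical; the real layer would hold `w` as a learnable `nn.Parameter`.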
Single-sample GCN with attention and neighborhood sampling

Neighborhood sampling:

```python
import numpy as np

def sampling(src_nodes, sample_num, neighbor_table):
    '''
    Sample the given number of neighbor nodes for each source node; note
    that sampling is done with replacement, so when a node has fewer
    neighbors than sample_num, the result will contain duplicate nodes.
    :param src_nodes {list, ndarray}: list of source nodes
    :param sample_num {int}: number of nodes to sample
    :param neighb...
```

(Original post, 2020-07-18 17:05:04)
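The docstring in the preview fully specifies the sampler: with-replacement draws from a per-node neighbor table. A minimal runnable sketch consistent with that description (the dict-based `neighbor_table` format is an assumption):

```python
import numpy as np

def sampling(src_nodes, sample_num, neighbor_table):
    # With-replacement sampling: a node with fewer neighbors than
    # sample_num will contribute duplicate entries to the result.
    results = []
    for sid in src_nodes:
        res = np.random.choice(neighbor_table[sid], size=(sample_num,))
        results.append(res)
    return np.asarray(results).flatten()
```

Multi-hop sampling is then just repeated application: feed the flattened output back in as the next layer's `src_nodes`.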
Graph encoders and decoders

Graph encoders and decoders: theory and Python implementation.

```python
import torch
import torch.nn as nn
import torch.nn.init as init
import torch.nn.functional as F

class StackGCNEncoder(nn.Module):
    '''Concatenation-based encoder'''
    def __init__(self, input_dim, output_dim, num_support, use_bias=False, activation=...
```

(Original post, 2020-05-18 15:24:49)
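The `StackGCNEncoder` preview cuts off at the constructor, but its docstring names the idea: one GCN propagation per support matrix (the `num_support` argument), with the per-support outputs concatenated. A hypothetical NumPy sketch of that stacking step, paired with a simple inner-product decoder:

```python
import numpy as np

def stack_gcn_encode(supports, x, weights):
    # One propagation per support (e.g. per edge type), with the
    # per-support outputs concatenated along the feature dimension.
    outs = [a @ x @ w for a, w in zip(supports, weights)]
    return np.concatenate(outs, axis=1)

def inner_product_decode(z):
    # Reconstruct pairwise edge scores from the node embeddings.
    return z @ z.T
```

The decoder here is the simplest choice; the post's actual decoder may differ.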
SAGPool: theory and Python implementation

```python
import os
import urllib
import torch
import torch.nn as nn
import torch.nn.init as init
import torch.nn.functional as F
import torch.utils.data as data
import numpy as np
import scipy.sparse as sp
from zipfile import ZipFile
from sklearn...
```

(Original post, 2020-05-18 09:35:50)
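SAGPool ranks nodes by a learned self-attention score and keeps only the top fraction of them. A minimal NumPy sketch of that pooling step (in the real model the scores come from a small GCN over the graph; here they are passed in directly):

```python
import numpy as np

def sag_pool(x, scores, keep_ratio=0.5):
    # Keep the top keep_ratio fraction of nodes by attention score and
    # gate the retained features with tanh(score), as in SAGPool.
    k = max(1, int(round(keep_ratio * x.shape[0])))
    idx = np.argsort(-scores)[:k]
    return x[idx] * np.tanh(scores[idx])[:, None], idx
```

The returned index array would also be used to slice the adjacency matrix down to the retained nodes.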
GraphSage: theory and Python implementation

```python
import numpy as np
import torch.nn as nn
import torch
import torch.nn.init as init
import torch.nn.functional as F

def sampling(src_nodes, sample_num, neighbor_table):
    '''
    Sample the given number of neighbor nodes for each source node; note
    that sampling is done with replacement, so when a node has fewer neigh...
```

(Original post, 2020-05-16 11:40:17)
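After sampling, GraphSage combines each node's own features with an aggregate (here, the mean) of its sampled neighbors' features. A NumPy sketch of the mean-aggregator update (the weight names are illustrative, not the post's):

```python
import numpy as np

def sage_mean_update(self_feats, neighbor_feats, w_self, w_neigh):
    # neighbor_feats: (num_nodes, num_sampled, in_dim) from the sampler.
    agg = neighbor_feats.mean(axis=1)           # mean over sampled neighbors
    out = self_feats @ w_self + agg @ w_neigh   # combine self and neighborhood
    return np.maximum(out, 0.0)                 # ReLU
```

Stacking this update K times, each layer consuming the previous layer's sampled features, gives a K-hop receptive field.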
GCN: theory and Python re-implementation

```python
import torch
import torch.nn as nn
import torch.nn.init as init
import torch.sparse
import torch.nn.functional as F
import torch.optim as optim
import scipy.sparse as sp
import numpy a...
```

(Original post, 2020-05-04 12:32:29)
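The core of a GCN layer is propagation with the renormalized adjacency D^{-1/2}(A+I)D^{-1/2}. A sketch of that normalization with scipy.sparse, matching the imports visible in the preview:

```python
import numpy as np
import scipy.sparse as sp

def normalize_adjacency(adj):
    # GCN renormalization trick: A_hat = D^{-1/2} (A + I) D^{-1/2},
    # where D is the degree matrix of A + I.
    adj = sp.coo_matrix(adj) + sp.eye(adj.shape[0])
    deg = np.asarray(adj.sum(axis=1)).flatten()
    d_inv_sqrt = sp.diags(np.power(deg, -0.5))
    return (d_inv_sqrt @ adj @ d_inv_sqrt).tocsr()
```

A layer forward pass is then just `activation(A_hat @ X @ W)` with a learnable weight matrix `W`.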