TGN: Temporal Graph Networks Explained

This post offers an in-depth reading of TGN (Temporal Graph Networks), published at ICLR 2020, with a focus on its application to link prediction between nodes in dynamic graphs. TGN combines graph neural networks with temporal information: an encoder produces node embeddings, and a memory update mechanism tracks dynamic changes. The post covers the model architecture, embedding generation, and the details of memory updating, showing how TGN captures both non-Euclidean structure and temporal signals to predict how likely two nodes are to connect.


WeChat official account: 异度侵入

Research on graph neural networks (GNNs) has become one of the hottest topics in machine learning this year. GNNs have recently seen a string of successes in biology, chemistry, the social sciences, physics, and other fields. Most GNN models to date, however, target static graphs, while many real-world relationships, such as social networks, financial transactions, and recommender systems, are dynamic. The way these graphs change carries a great deal of important information, and a purely static view makes that information hard to capture.

The TGN paper, published at ICLR 2020, tackles link prediction between nodes in dynamic graphs. Besides using a conventional graph neural network to capture the non-Euclidean structure and produce embeddings, the authors also exploit the temporal information contained in the dynamic graph. This post focuses on TGN's embedding generation and memory update mechanisms, and includes a small amount of source code.

[Figure: an example of a temporal graph]
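Before the walkthrough, a minimal sketch of the memory idea may help (my own illustration, not TGN source code; the GRU cell is one common choice for the memory updater): each node keeps a memory vector that a recurrent cell refreshes whenever the node takes part in an interaction.

```python
import torch
from torch import nn

class NodeMemory(nn.Module):
    """Sketch of per-node memory updated by a GRU cell on each interaction."""
    def __init__(self, num_nodes, message_dim, memory_dim):
        super().__init__()
        self.gru = nn.GRUCell(message_dim, memory_dim)
        # One memory vector per node, initialised to zeros
        self.register_buffer("memory", torch.zeros(num_nodes, memory_dim))

    @torch.no_grad()
    def update(self, node_ids, messages):
        # Refresh only the nodes involved in new interactions
        # (gradient flow through the memory is ignored in this sketch)
        self.memory[node_ids] = self.gru(messages, self.memory[node_ids])
```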

 

01

TGN Model Architecture

The figure below shows an example of TGN, consisting of an encoder and a decoder. The paper is mainly about how to use TGN as the encoder to produce embeddings; an MLP layer is chosen as the decoder. The figure can be read as follows: to predict how likely nodes 2 and 4 are to be connected, TGN first produces embeddings for node 2 and node 4 at time t8, and the decoder then turns those embeddings into the probability that the two nodes are connected at t8 (a minimal sketch of this decode step follows the figure).

[Figure: TGN encoder-decoder pipeline, predicting the probability of a link between nodes 2 and 4 at time t8]
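As a rough illustration of that decode step, here is a minimal sketch (my own illustration, not code from the paper; the two-layer MLP and the hidden size are assumptions): the decoder concatenates the two node embeddings produced by the TGN encoder and maps them to a link probability.

```python
import torch
from torch import nn

class LinkDecoder(nn.Module):
    """Minimal MLP decoder: scores how likely an edge between two nodes is."""
    def __init__(self, embed_dim, hidden_dim=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, z_src, z_dst):
        # e.g. z_src, z_dst = embeddings of nodes 2 and 4 at time t8
        return torch.sigmoid(self.mlp(torch.cat([z_src, z_dst], dim=-1)))
```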

 

### TV-G Model in Machine Learning and Computer Vision

The TV-G (Time-Varying Graph) model represents a specialized approach within the broader domain of machine learning that focuses on handling dynamic graph structures where nodes and edges can change over time. This type of model is particularly useful for scenarios involving temporal data analysis, such as social network evolution, traffic flow prediction, or any system with evolving relationships between entities.

In terms of application areas:

- **Predictive Maintenance**: By analyzing how different components interact over time, TV-G models help predict potential failures more accurately by considering both spatial and temporal dependencies.
- **Real-Time Anomaly Detection**: These models excel at identifying unusual patterns in streaming data thanks to their ability to adapt dynamically to changes in the underlying graphs representing interconnected systems.

For implementing TV-G models, several key considerations apply.

#### Data Preparation

Data must be structured into sequences of snapshots capturing the state at discrete points in time. Each snapshot includes information about node attributes and the edge connections present during that period.

```python
import numpy as np
from scipy.sparse import csr_matrix

def prepare_data(time_series_data):
    """Convert raw time-series edge lists into per-snapshot adjacency matrices."""
    adj_matrices = []
    for t in range(len(time_series_data)):
        # Each entry is a list of (source, target) pairs observed at time t
        edges_at_t = time_series_data[t]
        # Number of nodes: largest node id over all endpoints, plus one
        n_nodes = max(max(e) for e in edges_at_t) + 1
        row = [e[0] for e in edges_at_t]
        col = [e[1] for e in edges_at_t]
        data = np.ones(len(row))
        adj_matrices.append(csr_matrix((data, (row, col)), shape=(n_nodes, n_nodes)))
    return adj_matrices
```

#### Model Selection

Choosing an appropriate algorithm depends heavily on the specific use case, but it generally involves selecting methods that can process sequential inputs effectively while staying aware of the changing topology. Popular choices include variants of Recurrent Neural Networks (RNNs), Long Short-Term Memory networks (LSTMs), and Gated Recurrent Units (GRUs), along with recent advances such as Temporal Graph Networks (TGN).

#### Training Process

Training typically requires loss functions that optimize predictions across multiple timestamps simultaneously rather than focusing on individual instances.

```python
import torch
import torch.nn.functional as F
from torch.nn import Linear
from torch_geometric.nn import GCNConv

class TimeVaryingGCN(torch.nn.Module):
    def __init__(self, num_features, hidden_size, output_size):
        super().__init__()
        self.gcn_conv = GCNConv(num_features, hidden_size)
        self.linear = Linear(hidden_size, output_size)

    def forward(self, x, edge_index):
        # One graph convolution followed by a per-node linear readout
        h = F.relu(self.gcn_conv(x, edge_index))
        return self.linear(h)

# Example training-loop snippet (model, data, optimizer, criterion and
# epochs are assumed to be defined elsewhere)
for epoch in range(epochs):
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = criterion(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()
```
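For completeness, here is one hypothetical way to set up the objects the training loop above assumes; the sizes, learning rate, and loss function are illustrative choices rather than values from the text.

```python
# Hypothetical setup for the training-loop snippet above. `data` is assumed
# to be a torch_geometric Data object with x, edge_index, y and train_mask.
model = TimeVaryingGCN(num_features=16, hidden_size=32, output_size=4)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
criterion = torch.nn.CrossEntropyLoss()
epochs = 100
```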