WL Algorithm (from graph-bert)

The WL (Weisfeiler-Lehman) node-coloring code in graph-bert maintains a few pieces of state (each was illustrated with a source-code screenshot in the original post):

- `node_list`: the list of node ids in the graph
- `node_color_dict`: maps each node id to its current WL color
- `node_neighbor_dict`: maps each node id to the set of its neighbors
- `self.max_iter`: the maximum number of WL iterations

A minimal sketch of the full coloring loop is given below.
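The sketch follows the logic of graph-bert's WL node-coloring module (`MethodWLNodeColoring`): each iteration hashes a node's own color together with its sorted neighbor colors, re-indexes the distinct hashes as small consecutive integers, and stops once the coloring is stable or `max_iter` iterations have run. The standalone-function packaging (the name `wl_node_coloring` and the `link_list` edge-list argument) is my own framing for a self-contained example, not the repo's exact API.

```python
import hashlib

def wl_node_coloring(node_list, link_list, max_iter=2):
    """Weisfeiler-Lehman node coloring over an edge list."""
    # Every node starts with the same color; build an adjacency dict
    # so each node's neighbors can be looked up directly.
    node_color_dict = {node: 1 for node in node_list}
    node_neighbor_dict = {node: {} for node in node_list}
    for u, v in link_list:
        node_neighbor_dict[u][v] = 1
        node_neighbor_dict[v][u] = 1

    iteration_count = 1
    while True:
        new_color_dict = {}
        for node in node_list:
            # Hash "own color + sorted neighbor colors" into a new label.
            neighbor_colors = [node_color_dict[nb] for nb in node_neighbor_dict[node]]
            color_string = "_".join(
                [str(node_color_dict[node])] + sorted(str(c) for c in neighbor_colors))
            new_color_dict[node] = hashlib.md5(color_string.encode()).hexdigest()
        # Compress the hash strings back into small consecutive integers.
        color_index = {c: i + 1 for i, c in enumerate(sorted(set(new_color_dict.values())))}
        new_color_dict = {node: color_index[c] for node, c in new_color_dict.items()}
        # Stop when the coloring is stable or the iteration budget is spent.
        if new_color_dict == node_color_dict or iteration_count == max_iter:
            return new_color_dict
        node_color_dict = new_color_dict
        iteration_count += 1
```

For example, on the path graph `wl_node_coloring([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3)])` assigns the two endpoint nodes one color and the two interior nodes another.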

Molecular-graph-BERT is a graph-neural-network-based method for representing chemical molecules, usable for applications such as molecular property prediction and molecule design. Below is a simplified code implementation.

1. Install dependencies

```python
!pip install torch
!pip install dgl
!pip install rdkit
```

2. Data preprocessing

```python
import torch
from rdkit import Chem
from dgl.data.utils import save_graphs
# Note: in DGL >= 0.5 these chemistry utilities live in the separate
# dgllife package (from dgllife.utils import smiles_to_bigraph,
# CanonicalAtomFeaturizer).
from dgl.data.chem.utils import smiles_to_bigraph, CanonicalAtomFeaturizer

# Convert a SMILES string into a DGLGraph; CanonicalAtomFeaturizer stores
# the atom features in g.ndata['h'].
def graph_from_smiles(smiles):
    # Validate with RDKit first so unparsable SMILES can be skipped.
    if Chem.MolFromSmiles(smiles) is None:
        return None
    return smiles_to_bigraph(smiles, node_featurizer=CanonicalAtomFeaturizer())

# Read the data (one "SMILES<TAB>label" pair per line) and convert each
# SMILES string into a DGLGraph.
graphs, labels = [], []
with open('data.txt', 'r') as f:
    for line in f:
        smiles, label = line.strip().split('\t')
        g = graph_from_smiles(smiles)
        if g is None:
            continue
        graphs.append(g)
        labels.append(int(label))

# Serialize the graphs to a binary file. save_graphs takes a list of graphs
# plus an optional dict of label tensors, not a list of (graph, label) tuples.
save_graphs('data.bin', graphs, {'labels': torch.tensor(labels)})
```

3. Define the model

```python
import torch
import torch.nn as nn
import dgl
import dgl.function as fn

# A simple graph convolution layer: sum neighbor features, add the node's
# own features, then apply a linear transform and ReLU.
class GraphConvLayer(nn.Module):
    def __init__(self, in_feats, out_feats):
        super(GraphConvLayer, self).__init__()
        self.linear = nn.Linear(in_feats, out_feats)
        self.activation = nn.ReLU()

    def forward(self, g, features):
        with g.local_scope():
            g.ndata['h'] = features
            g.update_all(fn.copy_u('h', 'm'), fn.sum('m', 'neigh'))
            h_neigh = g.ndata['neigh']
            h = self.linear(features + h_neigh)
            h = self.activation(h)
            return h

# The full model: project atom features, apply several graph conv layers,
# max-pool over nodes, and classify.
class MolecularGraphBERT(nn.Module):
    def __init__(self, in_feats, hidden_size, num_layers):
        super(MolecularGraphBERT, self).__init__()
        # CanonicalAtomFeaturizer produces continuous feature vectors, so a
        # linear projection is used here instead of an embedding table.
        self.embed = nn.Linear(in_feats, hidden_size)
        self.layers = nn.ModuleList(
            [GraphConvLayer(hidden_size, hidden_size) for _ in range(num_layers)])
        self.pool = dgl.nn.pytorch.glob.MaxPooling()
        self.classify = nn.Linear(hidden_size, 1)

    def forward(self, g):
        h = self.embed(g.ndata['h'])
        for layer in self.layers:
            h = layer(g, h)
        hg = self.pool(g, h)       # graph-level readout
        return self.classify(hg)   # binary-classification logit
```

4. Train the model

```python
from dgl.data.utils import load_graphs

# Load the serialized graphs and labels.
graphs, label_dict = load_graphs('data.bin')
labels = label_dict['labels']

# Split into training and test sets.
train_graphs, test_graphs = graphs[:80], graphs[80:]
train_labels, test_labels = labels[:80], labels[80:]

# Training hyperparameters.
lr = 0.01
num_epochs = 50
hidden_size = 128
num_layers = 3
in_feats = train_graphs[0].ndata['h'].shape[1]  # atom-feature dimension

# Model and optimizer.
model = MolecularGraphBERT(in_feats, hidden_size, num_layers)
optimizer = torch.optim.Adam(model.parameters(), lr=lr)

def accuracy(graphs, labels):
    correct = 0
    for g, label in zip(graphs, labels):
        pred = model(g).squeeze()
        correct += int((pred > 0).long() == label)
    return correct / len(graphs)

# Train with binary cross-entropy on the logit output.
for epoch in range(num_epochs):
    model.train()
    for g, label in zip(train_graphs, train_labels):
        pred = model(g).squeeze()
        loss = nn.functional.binary_cross_entropy_with_logits(pred, label.float())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    model.eval()
    with torch.no_grad():
        train_acc = accuracy(train_graphs, train_labels)
        test_acc = accuracy(test_graphs, test_labels)
    print('Epoch {:d} | Train Acc {:.4f} | Test Acc {:.4f}'.format(epoch, train_acc, test_acc))
```

That is a code implementation along the lines of Molecular-graph-BERT. Note that because it is a graph-neural-network method, the DGL library is needed to build and manipulate the graph data, so DGL must be installed first.
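As a usage note, here is a minimal inference sketch. It assumes the `model` trained above and the `graph_from_smiles` helper from step 2; the SMILES string `'CCO'` (ethanol) is just an illustrative input.

```python
# Minimal inference sketch: score one new molecule with the trained model.
model.eval()
with torch.no_grad():
    g = graph_from_smiles('CCO')  # illustrative example molecule (ethanol)
    logit = model(g).squeeze()
    prob = torch.sigmoid(logit)   # positive-class probability
print('positive-class probability: {:.3f}'.format(prob.item()))
```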
