Self-Attention Mechanism
Self-attention module code (PyTorch version):
import torch
from torch import nn

class SelfAttention(nn.Module):
    """Self-attention module for B x C x H x W feature maps."""
    def __init__(self, in_dim):
        super(SelfAttention, self).__init__()
        self.channel_in = in_dim
        # The source listing is truncated from this point on; the layers
        # below are a reconstruction following the common SAGAN-style
        # implementation: 1x1 convolutions project the input into query,
        # key, and value, with an in_dim // 8 channel reduction for q/k.
        self.query = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.key = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.value = nn.Conv2d(in_dim, in_dim, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight
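The original listing breaks off before the forward pass. A self-contained, runnable sketch of the full module follows; since the source is truncated, the layer shapes and forward logic are assumptions based on the standard SAGAN-style self-attention, where attention is computed over the H*W spatial positions and added back to the input through a learnable scale:

```python
import torch
from torch import nn


class SelfAttention(nn.Module):
    """SAGAN-style self-attention over a B x C x H x W feature map."""

    def __init__(self, in_dim):
        super().__init__()
        # 1x1 convs project to query/key/value; q and k reduce channels by 8.
        self.query = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.key = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.value = nn.Conv2d(in_dim, in_dim, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # 0 at init: module starts as identity
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x):
        b, c, h, w = x.size()
        n = h * w  # number of spatial positions
        q = self.query(x).view(b, -1, n).permute(0, 2, 1)  # B x N x C//8
        k = self.key(x).view(b, -1, n)                     # B x C//8 x N
        attn = self.softmax(torch.bmm(q, k))               # B x N x N attention map
        v = self.value(x).view(b, -1, n)                   # B x C x N
        # Weighted sum of values at every position, then residual connection.
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x


x = torch.randn(2, 16, 8, 8)
y = SelfAttention(16)(x)
print(y.shape)  # torch.Size([2, 16, 8, 8])
```

Because `gamma` is initialized to zero, the module behaves as an identity mapping at the start of training and gradually learns how much attention-weighted context to mix in.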
Original post, published 2022-01-30.