Code implementations of various attention mechanisms

Copyright notice: this is an original post by the author and may not be reproduced without permission. https://blog.csdn.net/guotong1988/article/details/82902268

base attention
dot attention
mlp attention
multihead attention
no attention
pooling attention
https://github.com/pytorch/translate/tree/master/pytorch_translate/attention
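The dot attention in the list above scores each key by its inner product with the query, normalizes the scores with a softmax, and returns the weighted sum of the values. Below is a minimal NumPy sketch of that idea (not the pytorch_translate implementation itself; the function names here are my own):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dot_attention(query, keys, values):
    """Dot attention: score(q, k) = q . k.

    query:  (d,)      keys: (n, d)      values: (n, d_v)
    Returns the context vector (d_v,) and the attention weights (n,).
    """
    scores = keys @ query        # (n,) unnormalized scores
    weights = softmax(scores)    # (n,) sums to 1
    context = weights @ values   # (d_v,) weighted sum of values
    return context, weights
```

With orthonormal keys, a query aligned with one key puts nearly all the attention weight on that key's value.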

attention
bilinear attention
cosine attention
dot product attention
legacy attention
linear attention
https://github.com/allenai/allennlp/tree/master/allennlp/modules/attention
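The bilinear and cosine variants in this list differ from plain dot attention only in the scoring function: bilinear inserts a learned matrix between query and key, and cosine uses the normalized dot product. A hedged NumPy sketch of the two scoring functions (my own function names, not AllenNLP's API):

```python
import numpy as np

def bilinear_scores(query, keys, W):
    """Bilinear scoring: score(q, k) = q^T W k.

    query: (d_q,)   keys: (n, d_k)   W: (d_q, d_k)
    With W = I (and d_q == d_k) this reduces to dot-product scoring.
    """
    return keys @ (W.T @ query)          # (n,)

def cosine_scores(query, keys, eps=1e-8):
    """Cosine scoring: score(q, k) = (q . k) / (|q| |k|)."""
    q = query / (np.linalg.norm(query) + eps)
    k = keys / (np.linalg.norm(keys, axis=1, keepdims=True) + eps)
    return k @ q                         # (n,) values in [-1, 1]
```

Either score vector can then be fed through the same softmax-and-weighted-sum step as in dot attention.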

intra sentence attention
multi head self attention
stacked self attention
https://github.com/allenai/allennlp/tree/master/allennlp/modules/seq2seq_encoders
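Multi-head self-attention, as in the seq2seq encoders above, projects the same sequence into queries, keys, and values, splits each projection into heads, runs scaled dot-product attention per head, and concatenates the results. A minimal NumPy sketch under assumed shapes (not the AllenNLP module; masking, dropout, and biases are omitted):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, num_heads):
    """X: (n, d_model); Wq/Wk/Wv/Wo: (d_model, d_model); num_heads | d_model."""
    n, d_model = X.shape
    d_head = d_model // num_heads

    def project(M):
        # project, then split into heads: (num_heads, n, d_head)
        return (X @ M).reshape(n, num_heads, d_head).transpose(1, 0, 2)

    Q, K, V = project(Wq), project(Wk), project(Wv)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)  # (h, n, n)
    weights = softmax(scores)                            # rows sum to 1
    heads = weights @ V                                  # (h, n, d_head)
    concat = heads.transpose(1, 0, 2).reshape(n, d_model)
    return concat @ Wo                                   # (n, d_model)
```

Scaling by the square root of the head dimension keeps the softmax inputs in a reasonable range as d_head grows.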

bilinear matrix attention
cosine matrix attention
dot product matrix attention
legacy matrix attention
linear matrix attention
matrix attention
https://github.com/allenai/allennlp/tree/master/allennlp/modules/matrix_attention
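Matrix attention differs from the single-query modules above in that it compares two whole sequences at once, producing an (n, m) similarity matrix between every pair of rows. A hedged NumPy sketch of the dot-product and cosine variants (my own function names, not AllenNLP's API):

```python
import numpy as np

def dot_product_matrix_attention(X, Y):
    """X: (n, d), Y: (m, d) -> (n, m) matrix of pairwise dot products."""
    return X @ Y.T

def cosine_matrix_attention(X, Y, eps=1e-8):
    """Same shape, but each entry is the cosine similarity of a row pair."""
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + eps)
    Yn = Y / (np.linalg.norm(Y, axis=1, keepdims=True) + eps)
    return Xn @ Yn.T
```

Such a matrix is typically softmax-normalized along one axis to attend from each row of one sequence over all rows of the other, as in bidirectional attention for reading comprehension.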

