Self-Attention, Multi-Head Attention, and the Transformer
1. Principles & code (covering self-attention and multi-head attention; a minimal sketch follows the links below)
https://blog.csdn.net/jiaowoshouzi/article/details/89073944
(paper) https://github.com/Qunima1120/transformer
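The first link above walks through the principles; as a quick companion, here is a minimal PyTorch sketch of scaled dot-product attention and a multi-head attention layer. PyTorch and all names here are illustrative assumptions, not the linked repo's code.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    if mask is not None:
        # Block masked positions before the softmax
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return weights @ v

class MultiHeadAttention(torch.nn.Module):
    def __init__(self, d_model, num_heads):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        self.wq = torch.nn.Linear(d_model, d_model)
        self.wk = torch.nn.Linear(d_model, d_model)
        self.wv = torch.nn.Linear(d_model, d_model)
        self.wo = torch.nn.Linear(d_model, d_model)

    def forward(self, x, mask=None):
        batch, seq_len, d_model = x.shape
        # Project, then split into heads: (batch, heads, seq, d_head)
        def split(t):
            return t.view(batch, seq_len, self.num_heads, self.d_head).transpose(1, 2)
        q, k, v = split(self.wq(x)), split(self.wk(x)), split(self.wv(x))
        out = scaled_dot_product_attention(q, k, v, mask)
        # Merge the heads back together and apply the output projection
        out = out.transpose(1, 2).contiguous().view(batch, seq_len, d_model)
        return self.wo(out)

# Quick smoke test
x = torch.randn(2, 5, 64)
mha = MultiHeadAttention(d_model=64, num_heads=8)
print(mha(x).shape)  # torch.Size([2, 5, 64])
```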
2. ELMo, OpenAI GPT, BERT (a minimal usage sketch follows the links below)
https://zhuanlan.zhihu.com/p/49271699
(ELMo) https://zhuanlan.zhihu.com/p/51879600
(ELMo) https://github.com/allenai/bilm-tf
(OpenAI GPT) https://blog.csdn.net/fengzhou_/article/details/106556677
(papers)
Baidu Netdisk link: https://pan.baidu.com/s/1tVF6AY_MypNER8gPW9Ftbg
Extraction code: ka3x
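To get a hands-on feel for what these pretrained models produce, here is a minimal sketch of extracting contextual token embeddings from BERT via the Hugging Face `transformers` library. The library choice is an assumption for brevity; the resources above point at the original implementations instead.

```python
import torch
from transformers import BertTokenizer, BertModel

# Pretrained checkpoint name is an illustrative choice
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

inputs = tokenizer("Attention is all you need.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per subword token: shape (1, seq_len, 768)
print(outputs.last_hidden_state.shape)
```

Unlike static word vectors, each token's vector here depends on the whole sentence, which is the shared idea behind ELMo, GPT, and BERT.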