Intro to NLP Series 1: Attention and the Transformer
Reference for this article: https://github.com/datawhalechina/Learn-NLP-with-Transformers
(The figures in that tutorial are remarkably intuitive, so they are reproduced here directly.)
Contents:
1. Attention
   1.1 Seq2Seq Models
   1.2 Attention
2. Transformer
   2.1 Self-Attention
       2.1.1 What Self-Attention Does
       2.1.2 The Structure of Self-Attention
   2.2 Multi-Head Attention