Notes On “Attention Is All You Need”
The fundamental constraint is sequential computation: recurrent models cannot parallelize within a sequence, and memory constraints limit batching across examples.
NLP Sequence Encoding
The basic NLP approach is to first tokenize the sentence, then map each token to its word vector. A sentence built by concatenating its tokens then corresponds to a matrix $X=(x_1,x_2,\dots,x_n)$, where $x_i$ is the embedding vector of the $i$-th token.
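The tokenize-then-embed step can be sketched as follows; the vocabulary, embedding table, and dimension here are toy assumptions for illustration, not part of the paper:

```python
import numpy as np

# Toy sketch (hypothetical vocabulary, random embeddings): tokenize a
# sentence, look up each token's vector, and stack them into the matrix
# X = (x_1, x_2, ..., x_n), one row per token x_i.
rng = np.random.default_rng(0)
d_model = 8                                   # embedding dimension (assumed)
vocab = {"the": 0, "cat": 1, "sat": 2}        # toy word -> id map
embeddings = rng.normal(size=(len(vocab), d_model))  # one row per vocab word

sentence = "the cat sat"
token_ids = [vocab[w] for w in sentence.split()]  # tokenize + map to ids
X = embeddings[token_ids]                         # row i is x_i

print(X.shape)  # n tokens x d_model
```

The resulting `X` has shape `(n, d_model)`, which is exactly the matrix form that attention layers operate on.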
2021-04-12