BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding — Reading Notes
BERT: paper reading notes. 1. Abstract. BERT was released by Google AI Language; the name stands for Bidirectional Encoder Representations from Transformers. A pre-trained BERT model can be fine-tuned with just one additional output layer, without substantial task-specific architecture modifications, and it improves the GLUE score, MultiNLI accuracy, and SQu...
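The abstract's key point is that fine-tuning only adds a small task-specific output layer on top of the pre-trained encoder. As a minimal sketch, the encoder below is a hypothetical stub standing in for pre-trained BERT, and the dimensions are illustrative (real BERT-base uses a 768-dimensional hidden state):

```python
import random

HIDDEN = 8       # illustrative; BERT-base uses 768
NUM_LABELS = 2   # e.g. a binary classification task

def encoder_stub(tokens):
    # Stand-in for the pre-trained BERT encoder's pooled [CLS] output.
    # Deterministic per input so the sketch is reproducible.
    random.seed(hash(tuple(tokens)) % (2**32))
    return [random.uniform(-1, 1) for _ in range(HIDDEN)]

class OutputLayer:
    """The only new parameters introduced for fine-tuning:
    a weight matrix W (NUM_LABELS x HIDDEN) and a bias b."""
    def __init__(self):
        self.W = [[0.01 * (i + j) for j in range(HIDDEN)]
                  for i in range(NUM_LABELS)]
        self.b = [0.0] * NUM_LABELS

    def __call__(self, h):
        # Linear map from the pooled representation to label logits.
        return [sum(w * x for w, x in zip(row, h)) + bi
                for row, bi in zip(self.W, self.b)]

head = OutputLayer()
logits = head(encoder_stub(["hello", "world"]))
print(len(logits))  # one score per label
```

The point of the sketch: the added parameters number only `NUM_LABELS * HIDDEN + NUM_LABELS`, a tiny fraction of the encoder's weights, which is why no task-specific architecture changes are needed.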
2020-04-21 15:32:25