NLP model series:
Transformer: https://blog.csdn.net/lppfwl/article/details/121084602
GPT series: https://blog.csdn.net/lppfwl/article/details/121010275
BERT: https://blog.csdn.net/lppfwl/article/details/121124617
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
A 2019 paper from Google AI; original paper: https://paperswithcode.com/method/bert
PyTorch implementation: https://github.com/huggingface/transformers/tree/master/src/transformers/models/bert
Origin of the name BERT: Bidirectional Encoder Representations from Transformers.