Citation record: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Many databases do not index this paper, and clicking "cite" only returns the PDF, so I am recording the full citation here.
Full-text PDF: https://arxiv.org/pdf/1810.04805.pdf
Reference: Devlin Jacob, Chang Ming-Wei, Lee Kenton, et al. BERT: Pre-training of deep bidirectional transformers for language understanding [C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019: 4171-4186.
Original source: https://aclanthology.org/N19-1423/