Transfer learning for NLP: pretrained language representations. This is a four-part series; read the posts in order, and by the end you will have a very clear picture of this line of work!
(1) ELMo: Deep contextualized word representations
(2) ULMFiT: Universal Language Model Fine-tuning for Text Classification
(3) OpenAI GPT: Improving Language Understanding by Generative Pre-Training
(4) BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
1. Problem Description
To make transfer learning practical in NLP, researchers proposed training a language model on unlabeled data, then adding a fully-connected layer and a softmax on top to form a classifier.
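To make that setup concrete, here is a minimal sketch, assuming PyTorch, with a toy LSTM standing in for the pretrained language model; all layer names, sizes, and the `LMClassifier` class itself are illustrative, not from any of the four papers:

```python
import torch
import torch.nn as nn

class LMClassifier(nn.Module):
    """A language-model encoder with a fully-connected + softmax head on top."""
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256, num_classes=2):
        super().__init__()
        # Stand-in for the pretrained language model encoder; in practice
        # these weights would be loaded from unlabeled-data pretraining.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Task-specific head added after pretraining: one fully-connected layer.
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)         # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.encoder(x)         # final hidden state summarizes the sequence
        logits = self.classifier(h_n[-1])     # (batch, num_classes)
        return torch.softmax(logits, dim=-1)  # class probabilities

# Usage: classify a batch of two token-id sequences of length 5.
model = LMClassifier()
tokens = torch.randint(0, 10000, (2, 5))
probs = model(tokens)
print(probs.shape)  # torch.Size([2, 2])
```

In actual fine-tuning one would train with raw logits and a cross-entropy loss, applying softmax only at inference; the explicit softmax here just mirrors the classifier described above.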