Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
https://www.aclweb.org/anthology/P19-1134/
Published at ACL 2019.
I later discovered that the authors of this paper used pretty much the same idea to publish another paper on relation classification. The claims are almost identical. Hard to believe.
That paper is available here:
https://arxiv.org/abs/1906.03088
As the title suggests, the paper applies a pre-trained language model to distantly supervised relation extraction.
First, on how prior work handles the problem, the paper states: "Current relation extraction methods try to alleviate the noise by multi-instance learning and by providing supporting linguistic and contextual information to more efficiently …"
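The multi-instance learning mentioned above rests on a simple data-organization step: all sentences mentioning the same entity pair are grouped into one "bag", and a relation label is predicted per bag rather than per sentence, so a single noisy distant-supervision match cannot dictate the label. A minimal sketch of that bag construction (the function and sample sentences here are hypothetical, not from the paper):

```python
from collections import defaultdict

def build_bags(sentences):
    """Group (head, tail, text) triples into bags keyed by entity pair.

    In distantly supervised RE, each bag collects every sentence that
    mentions the same (head, tail) pair; multi-instance learning then
    assigns one relation label per bag.
    """
    bags = defaultdict(list)
    for head, tail, text in sentences:
        bags[(head, tail)].append(text)
    return dict(bags)

# Hypothetical toy corpus: the second sentence is a typical noisy match,
# since it mentions the pair but does not express the born_in relation.
corpus = [
    ("Obama", "Hawaii", "Obama was born in Hawaii."),
    ("Obama", "Hawaii", "Obama visited Hawaii last year."),
    ("Paris", "France", "Paris is the capital of France."),
]
bags = build_bags(corpus)
# bags[("Obama", "Hawaii")] now holds both mentions in one bag.
```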