2020 Question Answering (QA): Latest Papers, Books, Datasets, Competitions, and Course Resources

    Question answering (QA) is an important discipline related to natural language processing and information retrieval, with a great many application scenarios in industry; its core algorithms draw on machine learning and deep learning.

    This resource compiles detailed materials on question answering systems, covering recent trends in QA technology, typical QA system architectures, open datasets, related competitions, classic papers, books, video tutorials, courses, and important open-source projects and code, shared here for anyone who needs it.
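As a concrete illustration of the typical retrieve-then-read QA architecture mentioned above, here is a minimal sketch of the retrieval step using TF-IDF scoring. It is illustrative only: the passages, question, and scoring scheme are toy assumptions, not taken from any of the listed resources, and real systems use learned retrievers and neural readers such as the BERT-based models below.

```python
# Minimal sketch of the "retriever" half of a retrieve-then-read QA pipeline.
# Passages are scored against the question with simple TF-IDF cosine similarity.
from collections import Counter
import math

PASSAGES = [
    "BERT is a pre-trained bidirectional transformer for language understanding.",
    "SQuAD is a reading comprehension dataset built from Wikipedia articles.",
    "XLNet uses a generalized autoregressive pretraining objective.",
]

def tokenize(text):
    # Lowercase and strip trailing punctuation; a real system would use a proper tokenizer.
    return [t.strip(".,?").lower() for t in text.split()]

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors over the whole collection."""
    tokenized = [tokenize(d) for d in docs]
    df = Counter(tok for doc in tokenized for tok in set(doc))  # document frequency
    n = len(docs)
    vecs = []
    for doc in tokenized:
        tf = Counter(doc)
        vecs.append({t: tf[t] * math.log(n / df[t] + 1) for t in tf})
    return vecs

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, passages):
    """Return the passage most similar to the question (the retrieval step)."""
    vecs = tfidf_vectors(passages + [question])
    qvec = vecs[-1]
    scores = [cosine(qvec, v) for v in vecs[:-1]]
    return passages[scores.index(max(scores))]

best = retrieve("What dataset is used for reading comprehension?", PASSAGES)
print(best)
```

In a full pipeline, the retrieved passage would then be handed to a reader model (e.g. an extractive BERT-style model) that selects the answer span.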

 

    Resources compiled from the web. Original source: https://github.com/seriousran/awesome-qa

 

    Download link for the version with hyperlinks:

    Link: https://pan.baidu.com/s/1V6_sFpiyXRj13VEQkkQkTA

    Extraction code: 93yv

 

Table of Contents

    • Recent Research Trends

    • Introduction to QA Systems

    • Open-Source Systems

    • QA-Related Competitions

    • Papers

    • Code

    • Courses

    • Slides (PPT)

    • Dataset Collections

    • Datasets

    • Books

    • Important Links

 

Recent Research Trends

    XLNet

    • Original paper

    o XLNet: Generalized Autoregressive Pretraining for Language Understanding, Zhilin Yang, et al., arXiv preprint, 2019.

 

    BERT

    • Language Model

    o BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Jacob Devlin, et al., NAACL 2019, 2018.

    o RoBERTa: A Robustly Optimized BERT Pretraining Approach, Yinhan Liu, et al., arXiv preprint, 2019.

    o ALBERT: A Lite BERT for Self-supervised Learning of Language Representations, Zhenzhong Lan, et al., arXiv preprint, 2019.

    • QA

    o Investigating the Successes and Failures of BERT for Passage Re-Ranking, Harshith Padigela, et al., arXiv preprint, May 2019.

    o BERT with History Answer Embedding for Conversational Question Answering, Chen Qu, et al., arXiv preprint, May 2019.

    o Understanding the Behaviors of BERT in Ranking, Yifan Qiao, et al., arXiv preprint, Apr 2019.

    o BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis, Hu Xu, et al., arXiv preprint, Apr 2019.

    o End-to-End Open-Domain Question Answering with BERTserini, Wei Yang, et al., arXiv preprint, Feb 2019.

    o A BERT Baseline for the Natural Questions, Chris Alberti, et al., arXiv preprint, Jan 2019.

    o Passage Re-ranking with BERT, Rodrigo Nogueira, et al., arXiv preprint, Jan 2019.

    o SDNet: Contextualized Attention-based Deep Network for Conversational Question Answering, Chenguang Zhu, et al., arXiv, Dec 2018.

 

    AAAI 2020

    • TANDA: Transfer and Adapt Pre-Trained Transformer Models for Answer Sentence Selection, Siddhant Garg, et al., AAAI 2020, Nov 2019.

 

    ACL 2019

    • Overview of the MEDIQA 2019 Shared Task on Textual Inference, Question Entailment and Question Answering, Asma Ben Abacha, et al., ACL-W 2019, Aug 2019.

    • Towards Scalable and Reliable Capsule Networks for Challenging NLP Applications, Wei Zhao, et al., ACL 2019, Jun 2019.

    • Cognitive Graph for Multi-Hop Reading Comprehension at Scale, Ming Ding, et al., ACL 2019, Jun 2019.

    • Real-Time Open-Domain Question Answering with Dense-Sparse Phrase Index, Minjoon Seo, et al., ACL 2019, Jun 2019.

    • Unsupervised Question Answering by Cloze Translation, Patrick Lewis, et al., ACL 2019, Jun 2019.

    • SemEval-2019 Task 10: Math Question Answering, Mark Hopkins, et al., ACL-W 2019, Jun 2019.

    • Improving Question Answering over Incomplete KBs with Knowledge-Aware Reader, Wenhan Xiong, et al., ACL 2019, May 2019.

    • Matching Article Pairs with Graphical Decomposition and Convolutions, Bang Liu, et al., ACL 2019, May 2019.

    • Episodic Memory Reader: Learning what to Remember for Question Answering from Streaming Data, Moonsu Han, et al., ACL 2019, Mar 2019.

    • Natural Questions: a Benchmark for Question Answering Research, Tom Kwiatkowski, et al., TACL 2019, Jan 2019.

    • Textbook Question Answering with Multi-modal Context Graph Understanding and Self-supervised Open-set Comprehension, Daesik Kim, et al., ACL 2019, Nov 2018.

 

    EMNLP-IJCNLP 2019

    • Language Models as Knowledge Bases?, Fabio Petroni, et al., EMNLP-IJCNLP 2019, Sep 2019.

    • LXMERT: Learning Cross-Modality Encoder Representations from Transformers, Hao Tan, et al., EMNLP-IJCNLP 2019, De
