Question Answering: Danqi Chen, Stanford

Main contributions:
1. Model: SAR (Stanford Attentive Reader); see Chapter 2 of her thesis
2. System: facebookresearch/DrQA
   https://github.com/facebookresearch/DrQA

=================================================

1. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016
   https://arxiv.org/pdf/1606.05250.pdf

2. Bidirectional Attention Flow for Machine Comprehension, 2016
   https://arxiv.org/pdf/1611.01603.pdf

[Chinese] https://zhuanlan.zhihu.com/p/53626872
[Chinese] https://blog.csdn.net/mottled233/article/details/104409697

3. Reading Wikipedia to Answer Open-Domain Questions, 2017
   https://arxiv.org/pdf/1704.00051.pdf

[Chinese] https://blog.csdn.net/qq_28385535/article/details/105761817
[Chinese] https://blog.csdn.net/shark803/article/details/96582622
[Chinese] https://jozeelin.github.io/2019/08/13/drqa/
[Chinese] https://zhuanlan.zhihu.com/p/93078867

4. Latent Retrieval for Weakly Supervised Open Domain Question Answering, ACL 2019, Google
   https://arxiv.org/pdf/1906.00300.pdf

[Chinese] https://zhuanlan.zhihu.com/p/93580777

5. Dense Passage Retrieval for Open-Domain Question Answering, 2020
   https://arxiv.org/pdf/2004.04906.pdf

[Chinese] https://blog.csdn.net/c9Yv2cf9I06K2A9E/article/details/106435207
Before retrieval, a dense encoder first encodes every passage in the document collection. At retrieval time, a second dense encoder encodes the question; the similarity of the two representations is computed as an inner product, sim(q, p) = E_Q(q)^T E_P(p), and the top-k passages are returned as the result.
The encoders are bert-base-uncased, with the [CLS] vector taken as the representation. Since the document collection can be very large, the authors use Facebook's FAISS library to index the encoded vectors.
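As a concrete illustration, below is a minimal Python sketch of this dual-encoder retrieval step using the transformers and faiss packages. The passages and question are made up, and a single off-the-shelf bert-base-uncased stands in for DPR's two separately trained question/passage encoders, only to keep the sketch self-contained.

    import faiss
    import torch
    from transformers import BertModel, BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    # Stand-in for DPR's trained question/passage encoders (illustrative only).
    encoder = BertModel.from_pretrained("bert-base-uncased")

    def encode(texts):
        # Take the [CLS] vector (position 0) as the dense representation.
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        with torch.no_grad():
            out = encoder(**batch)
        return out.last_hidden_state[:, 0, :].contiguous().numpy()

    # Offline: encode every passage once, index with an inner-product FAISS index.
    passages = ["Paris is the capital of France.", "FAISS indexes dense vectors."]
    index = faiss.IndexFlatIP(768)  # 768 = bert-base hidden size; IP = inner product
    index.add(encode(passages))

    # Online: encode the question, take the top-k passages by dot-product similarity.
    scores, ids = index.search(encode(["What is the capital of France?"]), 1)
    print(passages[ids[0][0]], scores[0][0])

In the real system the flat index can be replaced by an approximate FAISS index (e.g. HNSW) so that search stays fast over millions of passages.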


5.a Leveraging Passage Retrieval with Generative Models for Open Domain Question Answering, 2020
    https://arxiv.org/abs/2007.01282

5.b How Much Knowledge Can You Pack Into the Parameters of a Language Model?, 2020
    https://arxiv.org/abs/2002.08910

6.a Real-Time Open-Domain Question Answering with Dense-Sparse Phrase Index, 2019
    https://arxiv.org/abs/1906.05807

6.b Learning Dense Representations of Phrases at Scale, 2021
    https://arxiv.org/pdf/2012.12624.pdf

It is possible to encode all the phrases (60 billion phrases in Wikipedia) using dense vectors
and only do nearest neighbor search without a BERT model at inference time!
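To make that claim concrete, here is a toy Python sketch of what query-time lookup reduces to once every phrase vector has been precomputed offline: a single maximum inner product search. The phrases, dimensions, and random vectors are made up; a real system holds billions of vectors in a FAISS-style index and obtains the question vector from a trained query encoder.

    import numpy as np

    # Precomputed offline: one dense vector per candidate phrase (made-up data).
    phrases = ["Barack Obama", "Honolulu, Hawaii", "August 4, 1961"]
    phrase_vecs = np.random.randn(3, 128).astype("float32")

    def answer(question_vec):
        scores = phrase_vecs @ question_vec     # one matrix multiply per query
        return phrases[int(np.argmax(scores))]  # highest-scoring phrase is the answer

    # In a real system this vector would come from a trained query encoder.
    question_vec = np.random.randn(128).astype("float32")
    print(answer(question_vec))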

=================================================
Neural Reading Comprehension and Beyond, 2018
https://www.cs.princeton.edu/~danqic/papers/thesis.pdf
https://github.com/danqi/thesis
=================================================
ACL 2020 Tutorial: Open-Domain Question Answering
https://github.com/danqi/acl2020-openqa-tutorial
=================================================

Chinese translations:
https://blog.csdn.net/Magical_Bubble/article/details/89488722
https://blog.csdn.net/mottled233/article/details/102995776
https://blog.csdn.net/cindy_1102/article/details/88714390
 
