Search for the WeChat official accounts ‘AI-ming3526’ or ‘计算机视觉这件小事’ for more AI content
csdn:https://blog.csdn.net/abcgkj
github:https://github.com/aimi-cn/AILearners
0. Getting Started with NLP
Cheat sheets covering the core concepts of AI, neural networks, machine learning, deep learning, and big data
Deep Learning & NLP: a curated collection of video tutorials on deep learning, machine learning, and artificial intelligence
A beginner’s guide to NLP research (GitHub)
Recommended books on natural language processing
An introduction to natural language processing
A closer look at the NLP research community
How to track research trends through the literature
How to write a solid academic paper
1. Trend Analysis
According to a word-cloud analysis of accepted paper titles, the keywords that stayed popular across ACL 2017 and 2018 were attention, network, knowledge, sequence, and language. At this year’s ACL, sentence, **embedding**, and **sentiment** drew more interest. Keywords such as cross, domain, and **unsupervised** also made the list this year, showing that more of the community is now working on cross-domain transfer and on unsupervised learning.
Evidently, seq2seq + attention has become one of the hottest topics in NLP.
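For readers new to the area, the core of the attention mechanism used in seq2seq models can be sketched in a few lines. The following is a minimal, illustrative implementation of scaled dot-product attention over toy vectors; the function names and the example vectors are my own, not taken from any of the papers listed below.

```python
import math

def attention_weights(query, keys):
    """Scaled dot-product attention: score each key against the query,
    then normalize the scores with a softmax."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(query, keys, values):
    """Return the attention-weighted sum of the value vectors
    (the 'context vector' fed to the decoder at each step)."""
    w = attention_weights(query, keys)
    dim = len(values[0])
    return [sum(wi * v[j] for wi, v in zip(w, values)) for j in range(dim)]

# Toy example: a decoder query attends over three encoder states.
keys = values = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
context = attend([1.0, 0.0], keys, values)
```

In a full seq2seq model the query would be the decoder’s hidden state and the keys/values the encoder’s hidden states, with learned projection matrices in between; the weighting-and-summing step, however, is exactly the one shown here.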
2. Key Papers on Automatic Summarization
Recent notable papers: traditional methods
- Improving Topic Quality by Promoting Named Entities in Topic Modeling
Abstract: News-related content has been widely studied in both topic modeling research and named entity recognition. However, the expressive power of named entities, and their potential to improve the quality of discovered topics, has not received much attention. In this paper, we use named entities as domain-specific terms for news-centric content and propose a new weighting model for Latent Dirichlet Allocation. Our experimental results indicate that involving more named entities in topic descriptors positively affects the overall quality of topics, improving their interpretability, specificity, and diversity.
Recent notable papers: seq2seq methods
- BERT-Two-Stage: “Pretraining-Based Natural Language Generation for Text Summarization”. arXiv (2019) [PDF]⭐️⭐️⭐️
- Re^3Sum: “Retrieve, Rerank and Rewrite: Soft Template Based Neural Summarization”. ACL (2018) [PDF]⭐️⭐️⭐️⭐️⭐️
- NeuSum: “Neural Document Summarization by Jointly Learning to Score and Select Sentences”. ACL (2018) [PDF]⭐️⭐️⭐️⭐️⭐️
- rnn-ext+abs+RL+rerank: “Fast Abstractive Summarization with Reinforce-Selected Sentence Rewriting”. ACL (2018) [PDF][code]⭐️⭐️⭐️⭐️⭐️
- Seq2Seq+CGU: “Global Encoding for Abstractive Summarization”. ACL (2018) [PDF][code]⭐️⭐️⭐️⭐️
- Autoencoder as Assistant Supervisor: Improving Text Representation for Chinese Social Media Text Summarization [PDF][code]
Summarization and generation algorithms session (video + transcript + slides) | Full coverage of the AIS pre-conference talks
- T-ConvS2S: “Don’t Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization”. EMNLP (2018) [PDF][code]⭐️⭐️⭐️⭐️⭐️
- RL-Topic-ConvS2S: “A reinforced topic-aware convolutional sequence-to-sequence model for abstractive text summarization.” IJCAI (2018)[PDF]⭐️⭐️⭐️⭐️⭐️
- GANsum: “Generative Adversarial Network for Abstractive Text Summarization”. AAAI (2018)[PDF]⭐️⭐️⭐️
- FTSum: “Faithful to the Original: Fact Aware Neural Abstractive Summarization”. AAAI(2018)[PDF]⭐️⭐️⭐️⭐️
- PGC: “Get To The Point: Summarization with Pointer-Generator Networks”. ACL (2017)[PDF][code]⭐️⭐️⭐️⭐️⭐️
- ABS\ABS+: “A Neural Attention Model for Abstractive Sentence Summarization”. EMNLP (2015)[PDF]⭐️⭐️⭐️⭐️
- RAS-Elman\RAS-LSTM: “Abstractive Sentence Summarization with Attentive Recurrent Neural Networks”. HLT-NAACL (2016) [PDF][code]⭐️⭐️⭐️⭐️
- words-lvt2k-1sent: “Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond”. CoNLL (2016)[PDF]⭐️⭐️⭐️⭐️
- Long Short-Term Memory as a Dynamically Computed Element-wise Weighted Sum [PDF][code]
- Construction of a Chinese Corpus for the Analysis of the Emotionality of Metaphorical Expressions [PDF]
- A Deep Reinforced Model for Abstractive Summarization (official page)