nlp-journey: A Painful Journey of Learning NLP

GitHub: https://github.com/msgi/nlp-journey

Basics

Classic books (Baidu Cloud, extraction code: b5qq)

Algorithm fundamentals
Deep learning
  • Deep Learning. A must-read for deep learning. Book link
  • Neural Networks and Deep Learning. A must-read introduction. Book link
  • Neural Networks and Deep Learning, by Prof. Qiu Xipeng, Fudan University. Book link
Natural language processing
  • Speech and Language Processing (3rd ed.), Stanford. A must-read for NLP. Book link
  • CS224d: Deep Learning for Natural Language Processing. Slides

Must-read papers

Models and optimization
  • LSTM (Long Short-Term Memory). Link
  • Dropout (Improving Neural Networks by Preventing Co-adaptation of Feature Detectors). Link
  • Residual Network (Deep Residual Learning for Image Recognition). Link
Language models
  • A Neural Probabilistic Language Model. Link
  • Language Models are Unsupervised Multitask Learners. Link
Text augmentation
  • EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks. Link
Text pre-training
  • Efficient Estimation of Word Representations in Vector Space. Link
  • Distributed Representations of Sentences and Documents. Link
  • Enriching Word Vectors with Subword Information. Link. Explainer
  • GloVe: Global Vectors for Word Representation. Website
  • ELMo (Deep Contextualized Word Representations). Link
  • BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Link
  • Pre-Training with Whole Word Masking for Chinese BERT. Link
  • XLNet: Generalized Autoregressive Pretraining for Language Understanding. Link
Text classification
  • A Sensitivity Analysis of (and Practitioners' Guide to) Convolutional Neural Networks for Sentence Classification. Link
  • Convolutional Neural Networks for Sentence Classification. Link
  • Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification. Link
Text generation
  • A Deep Ensemble Model with Slot Alignment for Sequence-to-Sequence Natural Language Generation. Link
  • SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient. Link
  • Generative Adversarial Text to Image Synthesis. Link
Text similarity
  • Learning to Rank Short Text Pairs with Convolutional Deep Neural Networks. Link
  • Learning Text Similarity with Siamese Recurrent Networks. Link
Short text matching
  • A Deep Architecture for Matching Short Texts. Link
Question answering
  • A Question-Focused Multi-Factor Attention Network for Question Answering. Link
  • The Design and Implementation of XiaoIce, an Empathetic Social Chatbot. Link
  • A Knowledge-Grounded Neural Conversation Model. Link
  • Neural Generative Question Answering. Link
  • Sequential Matching Network: A New Architecture for Multi-turn Response Selection in Retrieval-Based Chatbots. Link
  • Modeling Multi-turn Conversation with Deep Utterance Aggregation. Link
  • Multi-Turn Response Selection for Chatbots with Deep Attention Matching Network. Link
Machine translation
  • Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation. Link
  • Transformer (Attention Is All You Need). Link
  • Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context. Link
Text summarization
  • Get To The Point: Summarization with Pointer-Generator Networks. Link
Event extraction
  • Event Extraction via Dynamic Multi-Pooling Convolutional Neural Networks. Link

Must-read blog posts

  • The Illustrated Transformer. Post
  • Attention-based models. Link
  • KL divergence. Link
  • Building Autoencoders in Keras. Link
  • Modern Deep Learning Techniques Applied to Natural Language Processing. Link
  • Node2vec embeddings for graph data. Link
  • BERT explained. Link, Link
  • XLNet: how it works and how it differs from BERT. Link
  • LSTM and GRU explained with unprecedented clarity (animations + video). Link

Implemented algorithms

  • fasttext (skipgram + cbow; see the embedding sketch after this list)
  • gensim (word2vec)
  • eda
  • svm
  • fasttext
  • textcnn
  • bilstm+attention
  • rcnn
  • han
  • bilstm+crf
  • siamese
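
To make the first two entries concrete, here is a minimal sketch of training skip-gram/CBOW embeddings with gensim's Word2Vec and FastText. This is not the repo's actual code: the toy corpus and hyperparameters are illustrative assumptions, and the API shown is gensim 4.x (where the parameter is vector_size; gensim 3.x called it size).

```python
# A minimal sketch, assuming gensim 4.x (not the repo's actual code).
# Corpus and hyperparameters below are illustrative placeholders.
from gensim.models import FastText, Word2Vec

# Toy corpus: each sentence is a pre-tokenized list of words.
corpus = [
    ["natural", "language", "processing", "is", "fun"],
    ["deep", "learning", "drives", "modern", "nlp"],
]

# sg=1 trains skip-gram, sg=0 trains CBOW (the two modes listed above).
w2v = Word2Vec(sentences=corpus, vector_size=100, window=5,
               min_count=1, sg=1, epochs=10)

# FastText additionally learns character n-gram vectors, so it can embed
# out-of-vocabulary words by summing their subword vectors.
ft = FastText(sentences=corpus, vector_size=100, window=5,
              min_count=1, sg=0, epochs=10)

print(w2v.wv.most_similar("nlp", topn=2))  # nearest neighbours of "nlp"
print(ft.wv["nlps"].shape)                 # OOV token still gets a vector
```

The supervised models in the list (textcnn, bilstm+attention, rcnn, han, and so on) would typically consume vectors like these through an embedding layer.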

Related GitHub projects

Related blogs

Related conferences

  • Association for Computational Linguistics. ACL
  • Empirical Methods in Natural Language Processing. EMNLP
  • International Conference on Computational Linguistics. COLING
  • Neural Information Processing Systems. NIPS
  • AAAI Conference on Artificial Intelligence. AAAI
  • International Joint Conference on Artificial Intelligence. IJCAI
  • International Conference on Machine Learning. ICML