Reposted from: AINLP
Recommending an NLP-related project on GitHub: msgi/nlp-journey
Project address (the "Read the original" link goes straight there); contributions and stars are welcome:
https://github.com/msgi/nlp-journey
The project's author is 慢时光, a member of the AINLP discussion group. The project collects NLP-related code, covering word embeddings, named entity recognition (NER), text classification, text generation, and text similarity, built on Keras and TensorFlow. It also gathers related books, papers, blog posts, algorithms, and project resource links, all carefully categorized.
The following is taken from the project's introduction page; click "Read the original" to go straight to the related resource links.
Basic Algorithms
Basic Knowledge
Frequently Asked Questions
Practice Notes
Classic Books (Baidu Cloud, extraction code: b5qq)
Deep Learning. Essential reading on deep learning.
Speech and Language Processing, 3rd edition (Stanford). Essential NLP reading.
Neural Networks and Deep Learning. Essential introductory reading.
Neural Networks and Deep Learning, by Professor Qiu Xipeng, Fudan University.
CS224d: Deep Learning for Natural Language Processing.
Must-Read Papers
EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks.
A Neural Probabilistic Language Model.
Transformer.
Transformer-XL.
Convolutional Neural Networks for Sentence Classification.
Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification.
A Question-Focused Multi-Factor Attention Network for Question Answering.
AutoCross: Automatic Feature Crossing for Tabular Data in Real-World Applications.
GloVe: Global Vectors for Word Representation.
A Deep Ensemble Model with Slot Alignment for Sequence-to-Sequence Natural Language Generation.
The Design and Implementation of XiaoIce, an Empathetic Social Chatbot.
A Knowledge-Grounded Neural Conversation Model.
Neural Generative Question Answering.
A Sensitivity Analysis of (and Practitioners’ Guide to) Convolutional Neural Networks for Sentence Classification.
ImageNet Classification with Deep Convolutional Neural Networks.
Network In Network.
Long Short-term Memory.
Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation.
Get To The Point: Summarization with Pointer-Generator Networks.
Generative Adversarial Text to Image Synthesis.
Image-to-Image Translation with Conditional Adversarial Networks.
Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network.
Unsupervised Learning of Visual Structure using Predictive Generative Networks.
Learning to Rank Short Text Pairs with Convolutional Deep Neural Networks.
Event Extraction via Dynamic Multi-Pooling Convolutional Neural.
Low-Memory Neural Network Training: A Technical Report.
Language Models are Unsupervised Multitask Learners.
Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context.
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.
SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient.
Must-Read Blog Posts
The Illustrated Transformer.
Attention-based-model.
KL divergence.
Building Autoencoders in Keras.
Modern Deep Learning Techniques Applied to Natural Language Processing.
Node2vec embeddings for graph data.
BERT Explained.
Implemented Algorithms
Word Vector Construction (a minimal gensim sketch follows this list)
fasttext(skipgram+cbow)
gensim(word2vec)
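As a rough illustration of the word-vector part, here is a minimal gensim skip-gram sketch. It is not the project's code: the toy corpus and hyperparameters are assumptions, and it uses the gensim 4.x API (older versions take size= instead of vector_size=).

```python
# Minimal sketch: training skip-gram word vectors with gensim on a toy corpus.
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (assumed pre-tokenized).
sentences = [
    ["natural", "language", "processing", "is", "fun"],
    ["word", "vectors", "capture", "word", "meaning"],
    ["deep", "learning", "drives", "modern", "nlp"],
]

# sg=1 selects skip-gram (sg=0 would be CBOW); the vector_size/window/min_count
# values are illustrative, not the project's settings.
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)

# Look up a learned embedding and its nearest neighbours.
vector = model.wv["language"]
print(model.wv.most_similar("language", topn=3))
```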
Data Augmentation (a sketch of two EDA operations follows this list)
eda
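As a rough illustration of the EDA part, below is a minimal sketch of two of the four EDA operations (random swap and random deletion). The function names and probabilities are illustrative; the full technique also includes synonym replacement and random insertion, which require a synonym source such as WordNet.

```python
# Minimal sketch of random swap and random deletion from EDA.
import random

def random_swap(tokens, n_swaps=1):
    """Randomly swap the positions of two tokens, n_swaps times."""
    tokens = tokens[:]
    for _ in range(n_swaps):
        if len(tokens) < 2:
            break
        i, j = random.sample(range(len(tokens)), 2)
        tokens[i], tokens[j] = tokens[j], tokens[i]
    return tokens

def random_deletion(tokens, p=0.1):
    """Drop each token with probability p, keeping at least one token."""
    kept = [t for t in tokens if random.random() > p]
    return kept if kept else [random.choice(tokens)]

sentence = "text augmentation helps small classification datasets".split()
print(random_swap(sentence))
print(random_deletion(sentence))
```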
Classification Algorithms (a minimal TextCNN sketch follows this list)
svm
fasttext
textcnn
bilstm+attention
rcnn
han
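As a rough illustration of the Keras-based classifiers, here is a minimal TextCNN sketch in the spirit of Kim (2014): parallel convolutions over word embeddings, max-pooled, concatenated, and fed to a softmax classifier. The vocabulary size, sequence length, and filter settings are assumed values, not the project's configuration.

```python
# Minimal TextCNN sketch with the Keras functional API.
from tensorflow.keras import layers, Model

VOCAB_SIZE, MAX_LEN, EMB_DIM, NUM_CLASSES = 20000, 100, 128, 2  # assumed sizes

inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMB_DIM)(inputs)

# One convolution branch per filter width (3/4/5 words), each max-pooled.
pooled = []
for width in (3, 4, 5):
    conv = layers.Conv1D(filters=100, kernel_size=width, activation="relu")(x)
    pooled.append(layers.GlobalMaxPooling1D()(conv))

merged = layers.Concatenate()(pooled)
merged = layers.Dropout(0.5)(merged)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(merged)

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```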
NER (a minimal BiLSTM sketch follows this list)
bilstm+crf
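As a rough illustration of the NER part, here is a minimal BiLSTM sequence-labelling sketch. The project's bilstm+crf model adds a CRF output layer (e.g. from keras-contrib or tensorflow_addons); to keep this sketch dependency-free, a per-token softmax stands in for it. All sizes are assumed.

```python
# Minimal BiLSTM sequence-labelling sketch (softmax head in place of a CRF).
from tensorflow.keras import layers, Model

VOCAB_SIZE, MAX_LEN, EMB_DIM, NUM_TAGS = 10000, 80, 100, 7  # assumed sizes

inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMB_DIM, mask_zero=True)(inputs)
x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)
# In the actual bilstm+crf model, this Dense+softmax would be a CRF layer.
outputs = layers.TimeDistributed(layers.Dense(NUM_TAGS, activation="softmax"))(x)

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```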
Text Similarity (a minimal Siamese sketch follows this list)
siamese
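As a rough illustration of the text-similarity part, here is a minimal Siamese sketch: a shared BiLSTM encoder maps both sentences to vectors, and a sigmoid head scores similarity from their absolute difference. The architecture and sizes are assumptions, not the project's exact network.

```python
# Minimal Siamese similarity sketch with a shared encoder.
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE, MAX_LEN, EMB_DIM = 20000, 50, 128  # assumed sizes

def build_encoder():
    """Shared sentence encoder: embedding + BiLSTM pooled into one vector."""
    inp = layers.Input(shape=(MAX_LEN,), dtype="int32")
    x = layers.Embedding(VOCAB_SIZE, EMB_DIM)(inp)
    x = layers.Bidirectional(layers.LSTM(64))(x)
    return Model(inp, x)

encoder = build_encoder()
left = layers.Input(shape=(MAX_LEN,), dtype="int32")
right = layers.Input(shape=(MAX_LEN,), dtype="int32")

# The same encoder (shared weights) embeds both inputs.
diff = layers.Lambda(lambda t: tf.abs(t[0] - t[1]))([encoder(left), encoder(right)])
score = layers.Dense(1, activation="sigmoid")(diff)

model = Model([left, right], score)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```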
Related GitHub Projects
keras-gpt-2.
textClassifier.
attention-is-all-you-need-keras.
BERT_with_keras.
SeqGAN.
Related Blogs
莫坠青云志
彗双智能-Keras源码分析
机器之心
colah
ZHPMATRIX
wildml
徐阿衡
零基础入门深度学习
Related Conferences
Association for Computational Linguistics (ACL).
Empirical Methods in Natural Language Processing (EMNLP).
International Conference on Computational Linguistics (COLING).
Neural Information Processing Systems (NIPS).
AAAI Conference on Artificial Intelligence (AAAI).
International Joint Conference on Artificial Intelligence (IJCAI).
International Conference on Machine Learning (ICML).