A Collection of Deep Learning NLP Model Implementations (Condensed Versions, <100 Lines Each)

    This resource catalogs common deep-learning NLP models, drawing on existing TensorFlow and PyTorch implementations. Most models have been condensed so that each is implemented in under 100 lines of code (excluding comments and blank lines).

    Compiled from material available online; original source: https://github.com/graykode/nlp-tutorial

    Starting from NNLM, the first neural language model in NLP, the collection works through classic models such as RNN, LSTM, TextCNN, and Word2Vec, then on to seq2seq, attention models, bi-LSTM with attention, the Transformer (self-attention), and finally BERT, helping readers learn, implement, and train these models more easily.

    1. Embedding Language Models

    • 1-1. NNLM (Neural Network Language Model) - Predict Next Word

        ◦ Paper - A Neural Probabilistic Language Model (2003)

        ◦ Colab - NNLM_Tensor.ipynb, NNLM_Torch.ipynb
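    The NNLM task above, predicting the next word from the previous n words, boils down to building (context, target) training pairs. A minimal sketch of that preparation step; the function name and toy corpus are illustrative, not taken from the notebooks:

```python
def make_nnlm_pairs(sentences, n_context=2):
    """Build (context words, next word) training pairs for an
    NNLM-style next-word prediction task."""
    pairs = []
    for sent in sentences:
        words = sent.split()
        for i in range(n_context, len(words)):
            pairs.append((words[i - n_context:i], words[i]))
    return pairs

# Toy corpus: each 3-word sentence yields one (context, target) pair.
corpus = ["i like dog", "i love coffee", "i hate milk"]
print(make_nnlm_pairs(corpus)[0])  # (['i', 'like'], 'dog')
```

The actual model then embeds the context words, concatenates the embeddings, and feeds them through a hidden layer to score every vocabulary word as the target.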

    • 1-2. Word2Vec (Skip-gram) - Embedding Words and Show Graph

        ◦ Paper - Distributed Representations of Words and Phrases and their Compositionality (2013)

        ◦ Colab - Word2Vec_Tensor(NCE_loss).ipynb, Word2Vec_Tensor(Softmax).ipynb, Word2Vec_Torch(Softmax).ipynb
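    Skip-gram trains each word to predict its neighbors within a context window. A dependency-free sketch of the (center, context) pair generation that precedes training (function name illustrative):

```python
def skipgram_pairs(tokens, window=1):
    """Generate (center, context) pairs: every word within `window`
    positions of a center word becomes one of its prediction targets."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "quick", "brown"]))
# [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'), ('brown', 'quick')]
```

The two notebook variants differ only in the loss used to score these pairs: full softmax over the vocabulary versus NCE loss, which avoids normalizing over the whole vocabulary.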

    • 1-3. FastText (Application Level) - Sentence Classification

        ◦ Paper - Bag of Tricks for Efficient Text Classification (2016)

        ◦ Colab - FastText.ipynb
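    The Bag of Tricks classifier averages embeddings of the words plus word n-gram features. A sketch of that feature extraction (names illustrative; the paper additionally hashes n-grams into a fixed number of buckets, omitted here):

```python
def bag_of_ngrams(tokens, max_n=2):
    """Bag-of-words features plus word n-grams up to max_n, in the
    style of fastText sentence classification (hashing omitted)."""
    feats = list(tokens)
    for n in range(2, max_n + 1):
        feats += [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    return feats

print(bag_of_ngrams(["fast", "text", "rocks"]))
# ['fast', 'text', 'rocks', 'fast text', 'text rocks']
```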

    2. CNN (Convolutional Neural Network)

    • 2-1. TextCNN - Binary Sentiment Classification

        ◦ Paper - Convolutional Neural Networks for Sentence Classification (2014)

        ◦ Colab - TextCNN_Tensor.ipynb, TextCNN_Torch.ipynb

    • 2-2. DCNN (Dynamic Convolutional Neural Network)

    3. RNN (Recurrent Neural Network)

    • 3-1. TextRNN - Predict Next Step

        ◦ Paper - Finding Structure in Time (1990)

        ◦ Colab - TextRNN_Tensor.ipynb, TextRNN_Torch.ipynb

    • 3-2. TextLSTM - Autocomplete

        ◦ Paper - Long Short-Term Memory (1997)

        ◦ Colab - TextLSTM_Tensor.ipynb, TextLSTM_Torch.ipynb

    • 3-3. Bi-LSTM - Predict Next Word in Long Sentence

        ◦ Colab - Bi_LSTM_Tensor.ipynb, Bi_LSTM_Torch.ipynb

    4. Attention Mechanism

    • 4-1. Seq2Seq - Change Word

        ◦ Paper - Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation (2014)

        ◦ Colab - Seq2Seq_Tensor.ipynb, Seq2Seq_Torch.ipynb

    • 4-2. Seq2Seq with Attention - Translate

        ◦ Paper - Neural Machine Translation by Jointly Learning to Align and Translate (2014)

        ◦ Colab - Seq2Seq(Attention)_Tensor.ipynb, Seq2Seq(Attention)_Torch.ipynb
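    The core of attention is turning alignment scores between the decoder state and each encoder state into a probability distribution over source positions. The sketch below uses dot-product scoring for simplicity (the Bahdanau et al. paper above scores with a small additive network instead); names are illustrative:

```python
import math

def attention_weights(query, keys):
    """Score `query` against each key vector by dot product, then
    softmax the scores into attention weights that sum to 1."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    peak = max(scores)                        # subtract max for numerical stability
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

w = attention_weights([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
print(w)  # the first key aligns better with the query, so it gets the larger weight
```

The decoder then uses these weights to form a weighted sum of the encoder states (the context vector) before predicting the next token.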

    • 4-3. Bi-LSTM with Attention - Binary Sentiment Classification

        ◦ Colab - Bi_LSTM(Attention)_Tensor.ipynb, Bi_LSTM(Attention)_Torch.ipynb

    5. Transformer-based Models

    • 5-1. The Transformer - Translate

        ◦ Paper - Attention Is All You Need (2017)

        ◦ Colab - Transformer_Torch.ipynb, Transformer(Greedy_decoder)_Torch.ipynb
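    The greedy-decoder notebook above generates translations one token at a time: at each step the highest-scoring token is appended and the sequence is fed back in. A model-agnostic sketch of that loop, where `step_fn` is a hypothetical stand-in for a real Transformer forward pass:

```python
def greedy_decode(step_fn, start_token, eos_token, max_len=10):
    """Repeatedly feed the growing sequence back in, taking the
    arg-max token each step, until EOS or max_len is reached.
    `step_fn(seq)` returns a score list over the vocabulary."""
    seq = [start_token]
    for _ in range(max_len):
        scores = step_fn(seq)
        nxt = max(range(len(scores)), key=scores.__getitem__)
        seq.append(nxt)
        if nxt == eos_token:
            break
    return seq

# Dummy model: always favors token id == current length, EOS is id 4.
def step_fn(seq):
    scores = [0.0] * 5
    scores[min(len(seq), 4)] = 1.0
    return scores

print(greedy_decode(step_fn, start_token=0, eos_token=4))  # [0, 1, 2, 3, 4]
```

Greedy decoding is the simplest strategy; beam search keeps several candidate sequences instead and usually translates better at some extra cost.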

    • 5-2. BERT - Next Sentence Classification & Masked Token Prediction

        ◦ Paper - BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (2018)

        ◦ Colab - BERT_Torch.ipynb
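    BERT's masked-token pretraining selects roughly 15% of input positions as prediction targets; of those, 80% are replaced by [MASK], 10% by a random word, and 10% left unchanged. A sketch of that corruption step (rates per the paper; function name and implementation details illustrative):

```python
import random

def mask_tokens(tokens, vocab, mask_rate=0.15, seed=0):
    """Pick ~mask_rate of positions as prediction targets and corrupt
    them BERT-style: 80% -> [MASK], 10% -> random word, 10% unchanged.
    Returns (corrupted tokens, {position: original token})."""
    rng = random.Random(seed)
    out = list(tokens)
    n_pred = max(1, round(len(tokens) * mask_rate))
    targets = {}
    for pos in rng.sample(range(len(tokens)), n_pred):
        targets[pos] = tokens[pos]
        r = rng.random()
        if r < 0.8:
            out[pos] = "[MASK]"
        elif r < 0.9:
            out[pos] = rng.choice(vocab)
        # else: keep the original token (the model must still predict it)
    return out, targets

sent = "my dog is hairy and he likes to play outside daily".split()
corrupted, targets = mask_tokens(sent, vocab=sent)
print(corrupted, targets)
```

The model is trained to recover the original word at every target position, alongside the next-sentence classification objective on sentence pairs.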

Article reposted from 深度学习与NLP (Deep Learning & NLP).
