A Collection of Deep Learning NLP Model Implementations (Concise Versions, Under 100 Lines)


This resource collects the common NLP deep learning models in use today, drawing on existing TensorFlow and PyTorch implementations. The great majority of the models have been pared down, and most are implemented in under 100 lines of code (not counting comments and blank lines).

Starting from NNLM, the first neural language model in NLP, the collection works through classic models such as RNN, LSTM, TextCNN, and Word2Vec. It helps readers learn NLP models more easily and implement and train everything from seq2seq, attention models, and bi-LSTM with attention, through Transformer (self-attention), to BERT.

1. Embedding Language Model

• 1-1. NNLM (Neural Network Language Model) - Predict Next Word

  o Paper - A Neural Probabilistic Language Model (2003)

  o Colab - NNLM_Tensor.ipynb, NNLM_Torch.ipynb

• 1-2. Word2Vec (Skip-gram) - Embedding Words and Show Graph (a minimal PyTorch sketch follows this list)

  o Paper - Distributed Representations of Words and Phrases and their Compositionality (2013)

  o Colab - Word2Vec_Tensor(NCE_loss).ipynb, Word2Vec_Tensor(Softmax).ipynb, Word2Vec_Torch(Softmax).ipynb

• 1-3. FastText (Application Level) - Sentence Classification

  o Paper - Bag of Tricks for Efficient Text Classification (2016)

  o Colab - FastText.ipynb
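
To make 1-2 concrete, here is a minimal PyTorch sketch of skip-gram Word2Vec in the spirit of the repository's under-100-line implementations. The toy corpus, embedding size, window of 1, and full-softmax training loop are illustrative assumptions, not code taken from the notebooks.

```python
import torch
import torch.nn as nn

# Toy corpus and sizes are illustrative assumptions, not from the notebooks.
sentences = ["i like dog", "i like cat", "dog cat animal", "cat like fish"]
word_list = sorted(set(" ".join(sentences).split()))
word2idx = {w: i for i, w in enumerate(word_list)}
vocab_size, embed_dim = len(word_list), 2

# Build (center, context) training pairs with a context window of 1.
pairs = []
for s in sentences:
    tokens = [word2idx[w] for w in s.split()]
    for i, center in enumerate(tokens):
        for j in (i - 1, i + 1):
            if 0 <= j < len(tokens):
                pairs.append((center, tokens[j]))

class SkipGram(nn.Module):
    def __init__(self):
        super().__init__()
        self.in_embed = nn.Embedding(vocab_size, embed_dim)           # center-word vectors
        self.out_proj = nn.Linear(embed_dim, vocab_size, bias=False)  # softmax weights

    def forward(self, center):
        return self.out_proj(self.in_embed(center))  # logits over the vocabulary

model = SkipGram()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

centers = torch.tensor([c for c, _ in pairs])
contexts = torch.tensor([t for _, t in pairs])
for _ in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(centers), contexts)  # predict each context word from its center
    loss.backward()
    optimizer.step()
```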

2. CNN (Convolutional Neural Network)

• 2-1. TextCNN - Binary Sentiment Classification (see the sketch after this list)

  o Paper - Convolutional Neural Networks for Sentence Classification (2014)

  o Colab - TextCNN_Tensor.ipynb, TextCNN_Torch.ipynb

• 2-2. DCNN (Dynamic Convolutional Neural Network)
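
Below is a minimal sketch of the TextCNN idea from 2-1: convolution filters of several widths slide over the embedded sentence and are max-pooled over time before a linear classifier. The hyperparameters (embed_dim, num_filters, filter_sizes, num_classes) are illustrative assumptions, not values from the notebooks.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    """Convolutions of several widths over word embeddings, max-pooled over time."""
    def __init__(self, vocab_size, embed_dim=64, num_filters=32,
                 filter_sizes=(2, 3, 4), num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # One Conv2d per filter width; each filter spans the full embedding dimension.
        self.convs = nn.ModuleList(
            nn.Conv2d(1, num_filters, (fs, embed_dim)) for fs in filter_sizes)
        self.fc = nn.Linear(num_filters * len(filter_sizes), num_classes)

    def forward(self, x):                         # x: [batch, seq_len] word ids
        x = self.embed(x).unsqueeze(1)            # [batch, 1, seq_len, embed_dim]
        feats = [F.relu(conv(x)).squeeze(3) for conv in self.convs]
        pooled = [F.max_pool1d(f, f.size(2)).squeeze(2) for f in feats]
        return self.fc(torch.cat(pooled, dim=1))  # [batch, num_classes]

# Toy usage: batch of 8 sequences, each 10 word ids from a 100-word vocabulary.
logits = TextCNN(vocab_size=100)(torch.randint(0, 100, (8, 10)))
```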

3. RNN (Recurrent Neural Network)

• 3-1. TextRNN - Predict Next Step

  o Paper - Finding Structure in Time (1990)

  o Colab - TextRNN_Tensor.ipynb, TextRNN_Torch.ipynb

• 3-2. TextLSTM - Autocomplete

  o Paper - Long Short-Term Memory (1997)

  o Colab - TextLSTM_Tensor.ipynb, TextLSTM_Torch.ipynb

• 3-3. Bi-LSTM - Predict Next Word in Long Sentence (a minimal sketch follows this list)

  o Colab - Bi_LSTM_Tensor.ipynb, Bi_LSTM_Torch.ipynb
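
A minimal sketch of the Bi-LSTM next-word predictor from 3-3: the sequence is read in both directions, and the last time step's concatenated hidden states feed a softmax over the vocabulary. The layer sizes are assumed for illustration only.

```python
import torch
import torch.nn as nn

class BiLSTM(nn.Module):
    """Reads the sequence in both directions and predicts the next word
    from the final time step's concatenated hidden states."""
    def __init__(self, vocab_size, embed_dim=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(hidden * 2, vocab_size)

    def forward(self, x):                  # x: [batch, seq_len] word ids
        out, _ = self.lstm(self.embed(x))  # [batch, seq_len, 2 * hidden]
        return self.fc(out[:, -1])         # logits over the next word

# Toy usage: 4 sequences of 7 word ids from a 50-word vocabulary.
next_word_logits = BiLSTM(vocab_size=50)(torch.randint(0, 50, (4, 7)))
```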

4. Attention Mechanism

• 4-1. Seq2Seq - Change Word

  o Paper - Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation (2014)

  o Colab - Seq2Seq_Tensor.ipynb, Seq2Seq_Torch.ipynb

• 4-2. Seq2Seq with Attention - Translate (an attention sketch follows this list)

  o Paper - Neural Machine Translation by Jointly Learning to Align and Translate (2014)

  o Colab - Seq2Seq(Attention)_Tensor.ipynb, Seq2Seq(Attention)_Torch.ipynb

• 4-3. Bi-LSTM with Attention - Binary Sentiment Classification

  o Colab - Bi_LSTM(Attention)_Tensor.ipynb, Bi_LSTM(Attention)_Torch.ipynb
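
To make 4-2 concrete, here is a single decoder step with attention over the encoder outputs: score every encoder state against the current decoder state, softmax the scores into alignment weights, and mix the encoder states into a context vector. One hedge: the 2014 paper uses additive (Bahdanau) attention, whereas this sketch scores alignments with a plain dot product for brevity; the class name and all sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttnDecoderStep(nn.Module):
    """One decoder step: update the hidden state, score every encoder output
    against it, and mix them into a context vector by the softmax weights.
    (Illustrative sketch; uses dot-product scoring, not the paper's additive form.)"""
    def __init__(self, hidden):
        super().__init__()
        self.rnn = nn.GRUCell(hidden, hidden)
        self.out = nn.Linear(hidden * 2, hidden)

    def forward(self, dec_in, dec_h, enc_out):
        # dec_in, dec_h: [batch, hidden]; enc_out: [batch, src_len, hidden]
        dec_h = self.rnn(dec_in, dec_h)
        scores = torch.bmm(enc_out, dec_h.unsqueeze(2)).squeeze(2)   # [batch, src_len]
        attn = F.softmax(scores, dim=1)                              # alignment weights
        context = torch.bmm(attn.unsqueeze(1), enc_out).squeeze(1)   # [batch, hidden]
        return self.out(torch.cat([dec_h, context], dim=1)), dec_h, attn

# Toy usage: batch of 2, source length 5, hidden size 16.
step = AttnDecoderStep(16)
y, h, attn = step(torch.randn(2, 16), torch.zeros(2, 16), torch.randn(2, 5, 16))
```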

5. Models Based on Transformer

• 5-1. The Transformer - Translate (a self-attention sketch follows this list)

  o Paper - Attention Is All You Need (2017)

  o Colab - Transformer_Torch.ipynb, Transformer(Greedy_decoder)_Torch.ipynb

• 5-2. BERT - Next Sentence Classification & Masked Token Prediction

  o Paper - BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (2018)

  o Colab - BERT_Torch.ipynb
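
The core shared by 5-1 and 5-2 is scaled dot-product self-attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k))V, from Attention Is All You Need. The sketch below implements it as a single head with a fused QKV projection; real Transformer and BERT blocks add multiple heads, residual connections, and feed-forward layers, so treat this as a minimal core, not the full model.

```python
import math
import torch
import torch.nn as nn

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, from the 2017 paper."""
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(q.size(-1))
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)      # attention distribution over keys
    return torch.matmul(weights, v), weights

class SelfAttention(nn.Module):
    """Single-head self-attention: Q, K, V all come from the same input.
    (Illustrative simplification; the paper uses multiple heads.)"""
    def __init__(self, d_model):
        super().__init__()
        self.qkv = nn.Linear(d_model, 3 * d_model)  # fused Q, K, V projection

    def forward(self, x, mask=None):             # x: [batch, seq_len, d_model]
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        out, _ = scaled_dot_product_attention(q, k, v, mask)
        return out

# Toy usage: batch of 2, sequence length 6, model width 32.
out = SelfAttention(32)(torch.randn(2, 6, 32))
```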


Source: https://www.toutiao.com/a1664931447872520

