Algorithms
Deep Learning
- Implementing a CNN for Text Classification in TensorFlow (see the sketch after this list)
- Understanding Convolutional Neural Networks for NLP
- RNNs in Tensorflow, a Practical Guide and Undocumented Features
- Attention and Memory in Deep Learning and NLP
- Applications of Convolutional Neural Networks (CNNs) in Natural Language Processing
- Tutorials by leading researchers
- ATTENTION MECHANISM
- Text Classification, Part 3 - Hierarchical attention network
- A Brief Reading of "Attention is All You Need"
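The first item in this list walks through a CNN text classifier in TensorFlow. As a quick orientation, here is a minimal Keras sketch of the same idea (embedding → 1-D convolution → max-over-time pooling → softmax); the vocabulary size, filter settings, and toy input are placeholder assumptions, not the hyperparameters from the linked tutorial.

```python
import tensorflow as tf

# Minimal sketch of a CNN text classifier; every hyperparameter below is a
# placeholder assumption, not a value from the linked tutorial.
vocab_size, embed_dim, num_classes = 10000, 128, 2

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),          # word ids -> dense vectors
    tf.keras.layers.Conv1D(128, 5, activation="relu"),         # n-gram-like filters over the sequence
    tf.keras.layers.GlobalMaxPooling1D(),                      # max-over-time pooling
    tf.keras.layers.Dense(num_classes, activation="softmax"),  # class probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

sample = tf.constant([[3, 14, 15, 9, 26, 0, 0, 0]])  # one padded toy sentence of word ids
print(model(sample).shape)                           # (1, 2)
```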
RNN Structure and Implementation
- What Exactly Are the Inputs and Outputs of an LSTM Network? (see the shape sketch after this list)
- Bidirectional Recurrent Neural Networks and a TensorFlow Implementation
- Understanding LSTM Networks
- The Unreasonable Effectiveness of Recurrent Neural Networks
- Recurrent Neural Networks Tutorial, Part 1 – Introduction to RNNs
- RNN with Attention
- Recurrent Neural Networks Series, Part 4: The Attention Mechanism
- Understanding LSTM and its diagrams
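As a companion to the items above on LSTM inputs/outputs and the bidirectional RNN in TensorFlow, a small shape-checking sketch; the batch, time-step, feature, and unit sizes are arbitrary toy values, not taken from the linked articles.

```python
import tensorflow as tf

# Assumed toy shapes: batch of 32 sequences, 10 time steps, 8 features per step.
x = tf.random.normal([32, 10, 8])

lstm = tf.keras.layers.LSTM(16, return_sequences=True, return_state=True)
seq_out, h, c = lstm(x)
print(seq_out.shape)     # (32, 10, 16): one hidden vector per time step
print(h.shape, c.shape)  # (32, 16) each: final hidden state and final cell state

# The Bidirectional wrapper concatenates the forward and backward outputs by default.
bi = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(16, return_sequences=True))
print(bi(x).shape)       # (32, 10, 32)
```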
Deep Learning Theory
- A Comprehensive Summary and Comparison of Deep Learning Optimizers (SGD, Adagrad, Adadelta, Adam, Adamax, Nadam)
- The Softmax Function and Cross-Entropy (see the sketch after this list)
- Soft & hard attention
- Deep learning - Computation & optimization.
- Deep learning - Linear algebra.
- Attention? Attention!
- The Transformer – Attention is all you need.
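The softmax/cross-entropy item above boils down to a few lines. Here is a NumPy sketch with a numerically stable softmax and the well-known combined gradient p − one_hot(y) that makes the pairing so convenient; the logits and label are toy values, not from the linked post.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # shift by the max for numerical stability; result unchanged
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, label):
    return -np.log(probs[label])           # negative log-likelihood of the true class

logits = np.array([2.0, 1.0, 0.1])         # toy logits
label = 0
p = softmax(logits)
print(p, cross_entropy(p, label))

# Gradient of cross_entropy(softmax(logits)) w.r.t. the logits is simply p - one_hot(label).
grad = p.copy()
grad[label] -= 1.0
print(grad)
```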
Recent Papers
Courses
Tool Usage
Tutorials
Python
NumPy
TensorFlow
- tf.einsum (see the usage sketch after this list)
- tensor-to-tensor [theory part]
- https://zhuanlan.zhihu.com/p/32870503
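For the tf.einsum item above, a short usage sketch with arbitrary example shapes: a batched matrix multiply, and the query-key score computation that shows why einsum appears so often in attention code.

```python
import tensorflow as tf

# Batched matrix multiplication: for each batch b, multiply an [i, j] matrix by a [j, k] matrix.
a = tf.random.normal([4, 2, 3])
b = tf.random.normal([4, 3, 5])
c = tf.einsum("bij,bjk->bik", a, b)
print(c.shape)       # (4, 2, 5)

# Scaled dot-product attention scores (queries x keys) are another common einsum.
q = tf.random.normal([4, 10, 64])
k = tf.random.normal([4, 12, 64])
scores = tf.einsum("bqd,bkd->bqk", q, k) / tf.sqrt(64.0)
print(scores.shape)  # (4, 10, 12)
```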