PyTorch LSTM + Attention Text Classification
https://blog.csdn.net/qq_34838643/article/details/110200332
What is attention mechanism?
https://towardsdatascience.com/what-is-attention-mechanism-can-i-have-your-attention-please-3333637f2eac
Adding A Custom Attention Layer To Recurrent Neural Network In Keras
https://machinelearningmastery.com/adding-a-custom-attention-layer-to-recurrent-neural-network-in-keras/
Related question
Why the performance of LSTM decreases after the addition of attention using pytorch?
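The links above all describe the same basic pattern: feed token embeddings through an LSTM, score each timestep with a small attention layer, and classify the attention-weighted summary vector. A minimal PyTorch sketch of that pattern (all names, dimensions, and the single-linear-layer scoring function here are illustrative assumptions, not taken from any of the linked posts):

```python
import torch
import torch.nn as nn

class LSTMAttnClassifier(nn.Module):
    """Hypothetical minimal model: embedding -> BiLSTM -> additive attention -> linear."""

    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)   # one score per timestep
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                           # x: (batch, seq_len) token ids
        h, _ = self.lstm(self.embed(x))             # h: (batch, seq_len, 2*hidden_dim)
        w = torch.softmax(self.attn(h).squeeze(-1), dim=1)  # attention weights over time
        ctx = (w.unsqueeze(-1) * h).sum(dim=1)      # weighted sum of LSTM states
        return self.fc(ctx)                         # logits: (batch, num_classes)

model = LSTMAttnClassifier()
logits = model(torch.randint(0, 1000, (4, 12)))
print(tuple(logits.shape))  # (4, 2)
```

Compared with using only the LSTM's final hidden state, the attention layer lets every timestep contribute to the sentence representation, which is also the setup being questioned in the related StackOverflow-style thread above.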