LSTM+Attention+Prediction

A quick note on several time-series prediction projects collected from GitHub:

@tensorflow
Time-series-prediction: Codebase for "Time-series prediction" with RNN, GRU, LSTM and Attention
https://github.com/jsyoon0823/Time-series-prediction
Note: Attention class + GRU
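For reference, a minimal Keras-style sketch of the "attention over GRU outputs" idea (not the repo's actual code; layer sizes, input shapes and variable names here are assumptions):

# Minimal sketch: GRU returns the full hidden-state sequence, a Dense layer scores
# each time step, softmax over time gives attention weights, and the weighted sum
# of hidden states is the context vector used for the forecast.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_gru_attention_model(timesteps=24, features=8):
    inputs = layers.Input(shape=(timesteps, features))
    h = layers.GRU(64, return_sequences=True)(inputs)          # (batch, timesteps, 64)
    scores = layers.Dense(1)(h)                                # (batch, timesteps, 1)
    weights = layers.Softmax(axis=1)(scores)                   # attention weights over time
    context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, weights])
    outputs = layers.Dense(1)(context)                         # one-step-ahead forecast
    return Model(inputs, outputs)

model = build_gru_attention_model()
model.compile(optimizer="adam", loss="mse")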

@Keras
Keras Attention Mechanism: Attention mechanism implementation for Keras.
LSTM time-series prediction with an attention mechanism, implemented in Keras.
https://github.com/philipperemy/keras-attention-mechanism
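As a rough illustration of attention-based LSTM forecasting in Keras, the sketch below uses the built-in tf.keras.layers.Attention (Luong-style dot product) instead of the repo's own attention layer; all dimensions are assumptions:

# Sketch: the final LSTM hidden state is used as the query and attends over the
# whole output sequence; the resulting context vector feeds the prediction head.
from tensorflow.keras import layers, Model

def build_lstm_attention_model(timesteps=20, features=1):
    inputs = layers.Input(shape=(timesteps, features))
    seq = layers.LSTM(64, return_sequences=True)(inputs)       # (batch, T, 64)
    last = layers.Lambda(lambda t: t[:, -1:, :])(seq)          # query = final state, (batch, 1, 64)
    context = layers.Attention()([last, seq])                  # attend over all time steps
    outputs = layers.Dense(1)(layers.Flatten()(context))       # next-step prediction
    return Model(inputs, outputs)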

@tensorflow
convlstm_Internet_traffic_prediction
https://github.com/EchoQer/convlstm_Internet_traffic_prediction
Note: Encoder + ConvLSTMCell + Decoder
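A rough Keras sketch of the encoder/decoder ConvLSTM idea (the repo itself wires up TensorFlow's low-level ConvLSTMCell; the grid size, channel count and layer widths below are assumptions):

# Sketch: a ConvLSTM stack summarizes the sequence of traffic grids ("encoder"),
# a second ConvLSTM produces the final spatial state ("decoder"), and a Conv2D
# maps that state to the predicted next grid.
from tensorflow.keras import layers, Model

def build_convlstm_model(timesteps=12, height=16, width=16, channels=1):
    inputs = layers.Input(shape=(timesteps, height, width, channels))
    x = layers.ConvLSTM2D(32, kernel_size=3, padding="same", return_sequences=True)(inputs)
    x = layers.ConvLSTM2D(32, kernel_size=3, padding="same", return_sequences=False)(x)
    outputs = layers.Conv2D(channels, kernel_size=3, padding="same")(x)
    return Model(inputs, outputs)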

@tensorflow
AR-LSTMs: Predicting Transportation Demand based on AR-LSTMs Model with Multi-Head Attention
https://github.com/ncu-dart/AR-LSTMs
Note: encoder_rnn_lstm + decoder_rnn_lstm + multihead_attention
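The encoder LSTM + decoder LSTM + multi-head attention layout could look roughly like the following Keras sketch, built on tf.keras.layers.MultiHeadAttention; the dimensions and inputs are assumptions, not the repo's hyper-parameters:

# Sketch: the decoder LSTM is initialized from the encoder state, and each decoder
# step additionally attends over the encoder output sequence with multi-head attention.
from tensorflow.keras import layers, Model

def build_ar_lstm_model(in_steps=48, out_steps=12, features=4):
    enc_in = layers.Input(shape=(in_steps, features))
    dec_in = layers.Input(shape=(out_steps, features))         # e.g. autoregressive/lagged inputs
    enc_seq, h, c = layers.LSTM(64, return_sequences=True, return_state=True)(enc_in)
    dec_seq = layers.LSTM(64, return_sequences=True)(dec_in, initial_state=[h, c])
    attended = layers.MultiHeadAttention(num_heads=4, key_dim=16)(query=dec_seq, value=enc_seq)
    outputs = layers.Dense(1)(layers.Concatenate()([dec_seq, attended]))
    return Model([enc_in, dec_in], outputs)                    # (batch, out_steps, 1)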

@
LSTM_Attention
https://github.com/ningshixian/LSTM_Attention
Note: implementations of various attention models

@tensorflow
Stock Prediction Model using Attention Multilayer Recurrent Neural Networks with LSTM Cells
https://github.com/shoumo95/Stock_Prediction_Model_using_Attention_Multilayer_RNN_LSTM
Note: the attention mechanism here is implemented with TensorFlow's AttentionCellWrapper.

attn_cell = tf.contrib.rnn.AttentionCellWrapper(cell=stacked_cell, attn_length=attn_win_size, state_is_tuple=False)
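For context, a TF 1.x-style sketch of how such a wrapped cell might be assembled (variable names follow the line above; layer sizes and input shapes are assumptions):

import tensorflow as tf  # TF 1.x; tf.contrib was removed in TF 2.x

seq_len, input_dim = 30, 5                       # assumed input shape
num_units, num_layers, attn_win_size = 128, 2, 16
inputs = tf.placeholder(tf.float32, [None, seq_len, input_dim])

cells = [tf.contrib.rnn.BasicLSTMCell(num_units, state_is_tuple=False)
         for _ in range(num_layers)]
stacked_cell = tf.contrib.rnn.MultiRNNCell(cells, state_is_tuple=False)
# The wrapper keeps a window of the last `attn_length` cell states and lets the
# cell attend over that window at every step.
attn_cell = tf.contrib.rnn.AttentionCellWrapper(cell=stacked_cell,
                                                attn_length=attn_win_size,
                                                state_is_tuple=False)
outputs, final_state = tf.nn.dynamic_rnn(attn_cell, inputs, dtype=tf.float32)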

@PyTorch
Emotions-Predictions: Emotions prediction using RNN with LSTM cells and Attention Mechanisms.
https://github.com/ismail-mebsout/Emotions-Predictions
Note: EncoderRNN + attention decoder
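A compact PyTorch sketch of the EncoderRNN + attention-decoder pattern, in the spirit of the classic seq2seq tutorial; the module names, sizes and dot-product scoring are assumptions, not the repo's exact code:

# Sketch: the encoder LSTM produces a sequence of hidden states; at each decoding
# step the decoder output attends over them (dot-product scores + softmax) and the
# context vector is concatenated with the decoder output before the output layer.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EncoderRNN(nn.Module):
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)

    def forward(self, x):                        # x: (batch, T, input_dim)
        outputs, state = self.lstm(x)            # outputs: (batch, T, hidden_dim)
        return outputs, state

class AttnDecoder(nn.Module):
    def __init__(self, hidden_dim, output_dim):
        super().__init__()
        self.lstm = nn.LSTM(output_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim * 2, output_dim)

    def forward(self, y_prev, state, enc_outputs):
        # y_prev: (batch, 1, output_dim), decoded one step at a time
        dec_out, state = self.lstm(y_prev, state)                 # (batch, 1, hidden)
        scores = torch.bmm(dec_out, enc_outputs.transpose(1, 2))  # (batch, 1, T)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_outputs)                 # (batch, 1, hidden)
        pred = self.out(torch.cat([dec_out, context], dim=-1))    # (batch, 1, output_dim)
        return pred, state, weights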
