[Original] Keras attention code
```python
# LSTM + attention
from keras.layers import Input, LSTM, Permute, Reshape, Dense, Multiply

TIME_STEPS = 20   # sequence length (example value)
INPUT_DIM = 2     # features per time step (example value)
lstm_units = 32

inputs = Input(shape=(TIME_STEPS, INPUT_DIM))
lstm_out = LSTM(lstm_units, return_sequences=True)(inputs)

# Attention block: score each time step, softmax over the time axis,
# then reweight lstm_out. The remainder follows the standard Keras
# attention-block pattern the snippet was truncated from.
a = Permute((2, 1))(lstm_out)                    # -> (lstm_units, TIME_STEPS)
a = Reshape((lstm_units, TIME_STEPS))(a)         # lstm_units, not INPUT_DIM: we attend over the LSTM output
a = Dense(TIME_STEPS, activation='softmax')(a)   # attention weights per unit
a_probs = Permute((2, 1))(a)                     # back to (TIME_STEPS, lstm_units)
attention_out = Multiply()([lstm_out, a_probs])  # element-wise reweighting
```
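To see what the Permute → Dense(softmax) → Multiply chain computes numerically, here is a minimal NumPy sketch of the same data flow (small hypothetical shapes; `W` stands in for the Dense layer's learned kernel, bias omitted): per LSTM output channel, the softmax produces weights over the time axis, which then rescale that channel's sequence.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
TIME_STEPS, lstm_units = 4, 3                       # hypothetical small shapes
lstm_out = rng.standard_normal((TIME_STEPS, lstm_units))
W = rng.standard_normal((TIME_STEPS, TIME_STEPS))   # stand-in for Dense weights

a = lstm_out.T                    # Permute((2, 1)): (units, steps)
a = softmax(a @ W, axis=-1)       # Dense(TIME_STEPS, softmax), applied per unit
a_probs = a.T                     # Permute back: (steps, units)
attention_out = lstm_out * a_probs  # Multiply(): reweight each time step

# each channel's attention weights sum to 1 over the time axis
print(a_probs.sum(axis=0))
```

Note the weights normalize over TIME_STEPS, so `attention_out` keeps the shape of `lstm_out` while emphasizing some time steps and suppressing others.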
Posted 2018-04-17 09:33:09 · 1313 views · 1 like