The official documentation gives the following class signature; note activation='relu', which means the default activation is ReLU:
class mxnet.gluon.rnn.RNN(hidden_size, num_layers=1, activation='relu', layout='TNC', dropout=0, bidirectional=False, i2h_weight_initializer=None, h2h_weight_initializer=None, i2h_bias_initializer='zeros', h2h_bias_initializer='zeros', input_size=0, **kwargs)
The activation parameter is documented as follows:
activation ({'relu' or 'tanh'}, default 'relu') – The activation function to use.
This differs from the classic RNN formulation, which uses tanh (and from some other frameworks, where tanh is the default). So when using this recurrent layer, pay special attention to which activation you are getting, and pass activation='tanh' explicitly if that is what you want.
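A minimal sketch of setting the activation explicitly at construction time; the layer sizes and input shape below are arbitrary example values, not from the documentation:

import mxnet as mx
from mxnet.gluon import rnn

# Default construction: activation falls back to 'relu'.
layer_relu = rnn.RNN(hidden_size=100, num_layers=2)

# Pass activation='tanh' explicitly to get the classic tanh RNN.
layer_tanh = rnn.RNN(hidden_size=100, num_layers=2, activation='tanh')
layer_tanh.initialize()

# The default layout 'TNC' means (sequence length, batch size, input size).
inputs = mx.nd.random.uniform(shape=(5, 3, 10))
outputs = layer_tanh(inputs)
print(outputs.shape)  # (5, 3, 100): hidden_size of the last layer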