It looks like the problem is that you are not using parentheses `()` to instantiate the layers when adding them. In Keras, `model.add()` expects a layer *instance*, not the layer class itself.
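To illustrate the difference, here is a minimal contrast using `Dense` as an arbitrary example layer (the wrong line is commented out so the snippet runs as-is):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
# model.add(Dense)          # wrong: passes the class itself; Keras raises a TypeError
model.add(Dense(units=64))  # correct: Dense(units=64) creates a layer instance
```

Here is the modified code: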
```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, Activation, MaxPooling1D, LSTM, Dropout, Dense
from tensorflow.keras.regularizers import l2
# Attention is assumed to be your custom layer, defined or imported elsewhere.

def get_CNN_LSTM_Attention(input_shape, num_classes):
    model = Sequential()
    # Input / first convolutional layer
    model.add(Conv1D(filters=16, kernel_size=64, strides=16, padding='same',
                     kernel_regularizer=l2(1e-4), name='Conv_layer_1',
                     input_shape=input_shape))
    # model.add(BatchNormalization())
    model.add(Activation('relu'))
    model.add(MaxPooling1D(pool_size=2, strides=2))
    # Second convolutional layer
    model.add(Conv1D(filters=32, kernel_size=3, strides=1, padding='same',
                     kernel_regularizer=l2(1e-4), name='Conv_layer_2'))
    # model.add(BatchNormalization())
    model.add(Activation('relu'))
    model.add(MaxPooling1D(pool_size=4, strides=4, padding='valid'))
    # LSTM layers; return_sequences=True keeps the time dimension for the next layer
    model.add(LSTM(16, activation='tanh', recurrent_activation='hard_sigmoid',
                   kernel_initializer='glorot_uniform', recurrent_initializer='orthogonal',
                   bias_initializer='zeros', return_sequences=True, name='LSTM_layer_1'))
    # model.add(Dropout(0.5))
    model.add(LSTM(16, activation='tanh', recurrent_activation='hard_sigmoid',
                   kernel_initializer='glorot_uniform', recurrent_initializer='orthogonal',
                   bias_initializer='zeros', return_sequences=True, name='LSTM_layer_2'))
    model.add(Dropout(0.5))
    # Attention layer (custom; must subclass Layer, see the note below)
    model.add(Attention(name='attention_layer'))
    # model.get_weights()
    # model.add(BatchNormalization())
    model.add(Activation('relu'))
    # Output layer
    model.add(Dense(units=num_classes, activation='softmax', kernel_regularizer=l2(1e-4)))
    return model
```
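As a quick sanity check, you can build and compile the model. The input shape and class count below are placeholder values for illustration; substitute your own:

```python
# Hypothetical example values: 2048 time steps, 1 channel, 10 classes.
model = get_CNN_LSTM_Attention(input_shape=(2048, 1), num_classes=10)
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()
```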
Also make sure that your `Attention` layer itself is defined correctly: if it is a custom layer, it must subclass `Layer`. If the problem persists, please provide more information so we can investigate further.
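In case it helps, here is a minimal sketch of what such a custom layer could look like. This is an illustrative additive-attention implementation, not necessarily the one you are using; note that it collapses the time dimension, which is why the `Dense` output layer can follow it directly:

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer

class Attention(Layer):
    """Minimal additive attention: reduces (batch, time, features) to (batch, features)."""

    def build(self, input_shape):
        # One learnable score weight per feature, plus a per-time-step bias.
        self.W = self.add_weight(name='att_weight', shape=(input_shape[-1], 1),
                                 initializer='glorot_uniform', trainable=True)
        self.b = self.add_weight(name='att_bias', shape=(input_shape[1], 1),
                                 initializer='zeros', trainable=True)
        super().build(input_shape)

    def call(self, inputs):
        # Score each time step, normalize with softmax, and take the weighted sum.
        e = tf.tanh(tf.matmul(inputs, self.W) + self.b)  # (batch, time, 1)
        a = tf.nn.softmax(e, axis=1)                     # attention weights over time
        return tf.reduce_sum(inputs * a, axis=1)         # (batch, features)
```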