While building a CNN-LSTM with Keras, I hit the following error:
ValueError: Negative dimension size caused by subtracting 2 from 1 for '{{node conv1d_3/conv1d}} = Conv2D[T=DT_FLOAT, data_format="NHWC", dilations=[1, 1, 1, 1], explicit_paddings=[], padding="VALID", strides=[1, 1, 1, 1], use_cudnn_on_gpu=true](conv1d_3/conv1d/ExpandDims, conv1d_3/conv1d/ExpandDims_1)' with input shapes: [?,1,1,32], [1,2,32,64].
The model:
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential()
# x_train.shape = (2496, 3, 17): 3 time steps, 17 features
model.add(layers.Conv1D(filters=32, kernel_size=3, strides=1, activation="relu",
                        input_shape=x_train.shape[1:]))
model.add(layers.MaxPooling1D(pool_size=2, strides=1, padding="same"))
model.add(layers.Conv1D(filters=64, kernel_size=2, strides=1, activation="relu"))
model.add(layers.MaxPooling1D(pool_size=3, strides=1, padding="same"))
model.add(layers.LSTM(10, return_sequences=True, activation='relu'))
model.add(layers.LSTM(10, return_sequences=False, activation='relu'))
model.add(layers.Dense(5, activation='relu'))
model.add(layers.Dense(1))
After a long time digging around, I finally found the problem: the kernel_size of the first Conv1D was as large as the number of time steps (3), so its output had length 1 and the second Conv1D (kernel_size=2) had nothing left to slide over. Changing the first layer to kernel_size=2 fixed it completely...
Maybe I'm the only one who would make such a silly mistake...
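The length arithmetic behind the error can be traced with the standard formulas for padding="valid" and padding="same" (the helper functions below are my own sketch, not a Keras API):

```python
def conv1d_valid_len(length, kernel_size, stride=1):
    # Conv1D with padding="valid": floor((L - k) / s) + 1
    return (length - kernel_size) // stride + 1

def pool1d_same_len(length, stride=1):
    # MaxPooling1D with padding="same": ceil(L / s); stride=1 keeps the length
    return -(-length // stride)

# Original model: the time dimension starts at 3 (x_train.shape = (2496, 3, 17))
t = conv1d_valid_len(3, kernel_size=3)   # -> 1
t = pool1d_same_len(t)                   # -> 1
t = conv1d_valid_len(t, kernel_size=2)   # -> 0: "subtracting 2 from 1" fails
print(t)  # 0

# Fixed model: first Conv1D uses kernel_size=2
t = conv1d_valid_len(3, kernel_size=2)   # -> 2
t = pool1d_same_len(t)                   # -> 2
t = conv1d_valid_len(t, kernel_size=2)   # -> 1
t = pool1d_same_len(t)                   # -> 1: a valid sequence reaches the LSTM
print(t)  # 1
```

This matches the shapes in the traceback: `[?,1,1,32]` is the length-1 output of the first block, and `[1,2,32,64]` is the size-2 kernel of the second Conv1D that can no longer fit.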