Three methods
- L2 regularization
from tensorflow.keras import layers,regularizers
x = layers.Conv2D(
    128, 3, padding='same',
    kernel_regularizer=regularizers.l2(0.01),
)(x)
- Dropout
x = layers.Dropout(0.5)(x)  # drops 50% of units; typically placed between the fully connected layers and the output layer
- Batch normalization
x = layers.BatchNormalization()(x)
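The three techniques above are usually combined in one model. A minimal sketch of a small CNN using all three (the input shape, pooling layer, class count, and hyperparameters here are illustrative assumptions, not from the original notes):

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

def build_model(num_classes=10):
    # Assumed 32x32 RGB input (e.g. CIFAR-10-like data)
    inputs = tf.keras.Input(shape=(32, 32, 3))
    # L2 weight decay on the conv kernel
    x = layers.Conv2D(
        128, 3, padding='same',
        kernel_regularizer=regularizers.l2(0.01),
    )(inputs)
    # BatchNormalization before the nonlinearity
    x = layers.BatchNormalization()(x)
    x = layers.Activation('relu')(x)
    x = layers.GlobalAveragePooling2D()(x)
    # Dropout between the dense features and the output layer
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(num_classes)(x)
    return tf.keras.Model(inputs, outputs)

model = build_model()
model.summary()
```

Note that Dropout and BatchNormalization behave differently at training time versus inference time; Keras handles this automatically via the `training` flag in `fit`/`predict`.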
Epoch 10/10
782/782 - 84s - loss: 1.9352 - accuracy: 0.2463 (train)
157/157 - 4s - loss: 1.8201 - accuracy: 0.3547 (validation)
Just a quick, untuned run.