Adding a Dropout layer with the tf.keras high-level API differs from the earlier TensorFlow + Keras approach. A typical example is shown below: simply insert tf.keras.layers.Dropout(rate) at the appropriate position inside tf.keras.Sequential, which is very convenient:
dropout_model = tf.keras.Sequential([
    tf.keras.layers.Dense(512, activation='elu', input_shape=(FEATURES,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(512, activation='elu'),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(512, activation='elu'),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(512, activation='elu'),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1)
])
The example comes from this tutorial: https://tensorflow.google.cn/tutorials/keras/overfit_and_underfit
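To make the meaning of the 0.5 argument concrete, here is a minimal NumPy sketch of "inverted dropout", the scheme tf.keras.layers.Dropout implements: in training mode each unit is zeroed with probability rate and the survivors are rescaled by 1/(1 - rate); in inference mode the input passes through unchanged. The function name and shapes are illustrative only, not part of the TensorFlow API.

```python
import numpy as np

def dropout(x, rate, training, rng):
    """Sketch of inverted dropout (illustrative, not the TF implementation).

    Training mode: zero each unit with probability `rate`, then scale the
    survivors by 1 / (1 - rate) so the expected activation is unchanged.
    Inference mode: return the input untouched.
    """
    if not training:
        return x
    keep_prob = 1.0 - rate
    mask = rng.random(x.shape) < keep_prob   # True where a unit survives
    return np.where(mask, x / keep_prob, 0.0)

rng = np.random.default_rng(0)
x = np.ones((4, 512))

train_out = dropout(x, rate=0.5, training=True, rng=rng)   # ~half zeroed, rest scaled to 2.0
infer_out = dropout(x, rate=0.5, training=False, rng=rng)  # identical to x
```

Because of the rescaling at training time, nothing has to be corrected at inference time, which is why the layer can simply act as the identity when training is False.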
The official documentation for tf.keras.layers.Dropout is summarized below:
tf.keras.layers.Dropout(
    rate, noise_shape=None, seed=None, **kwargs
)
Arguments:
rate: Float between 0 and 1. Fraction of the input units to drop.
noise_shape: 1D integer tensor representing the shape of the binary dropout mask that will be multiplied with the input. For instance, if your inputs have shape (batch_size, timesteps, features) and you want the dropout mask to be the same for all timesteps, you can use noise_shape=(batch_size, 1, features).
seed: A Python integer to use as random seed.
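The noise_shape behavior can be sketched in NumPy: the binary mask is sampled with shape noise_shape and then broadcast against the input, so any dimension set to 1 shares a single mask value. The helper below is an assumption for illustration, not a TensorFlow function.

```python
import numpy as np

def dropout_with_noise_shape(x, rate, noise_shape, rng):
    """Illustrative sketch: sample the mask at `noise_shape`, broadcast to x."""
    keep_prob = 1.0 - rate
    mask = rng.random(noise_shape) < keep_prob  # broadcasts against x below
    return np.where(mask, x / keep_prob, 0.0)

rng = np.random.default_rng(0)
batch, timesteps, features = 2, 5, 8
x = np.ones((batch, timesteps, features))

# noise_shape=(batch, 1, features): one mask entry per (sample, feature),
# reused at every timestep, so a feature is dropped consistently in time.
out = dropout_with_noise_shape(x, 0.5, (batch, 1, features), rng)
```

With the default noise_shape=None the mask has the full input shape, so every timestep would get an independent mask instead.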
Call arguments:
inputs: Input tensor (of any rank).
training: Python boolean indicating whether the layer should behave in training mode (adding dropout) or in inference mode (doing nothing).
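In Keras, fit() calls the layer with training=True and predict()/evaluate() with training=False, so the flag is usually handled for you. The point of the inverted scaling is that no correction is needed when the flag flips: the expected activation in training mode already matches the inference output. A small NumPy check of that claim (shapes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
rate = 0.5
keep_prob = 1.0 - rate
x = rng.random(10000)

# Training mode: drop units, rescale survivors by 1/keep_prob.
mask = rng.random(x.shape) < keep_prob
train_out = np.where(mask, x / keep_prob, 0.0)

# Inference mode is the identity, so its mean is just x.mean();
# the rescaled training output has (approximately) the same mean.
```

This is why switching a trained model to inference mode "does nothing" safely: the statistics the downstream layers saw during training are preserved.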
Reference: https://tensorflow.google.cn/api_docs/python/tf/keras/layers/Dropout?hl=en#used-in-the-notebooks