Difference between tf.layers.dropout and tf.nn.dropout
The only differences between the two functions are:
- tf.nn.dropout has the parameter keep_prob: "Probability that each element is kept".
- tf.layers.dropout has the parameter rate: "The dropout rate".
Thus, keep_prob = 1 - rate.
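To make the keep_prob = 1 - rate relationship concrete, here is a minimal pure-Python sketch of inverted dropout (the scheme both TensorFlow functions use). The function names `dropout_keep_prob` and `dropout_rate` are made up for illustration; only the parameter semantics mirror the two TensorFlow APIs.

```python
import random

def dropout_keep_prob(x, keep_prob, seed=None):
    # tf.nn.dropout-style parameterization: each element is kept with
    # probability keep_prob and scaled by 1/keep_prob, so the expected
    # value of the output equals the input.
    rng = random.Random(seed)
    return [v / keep_prob if rng.random() < keep_prob else 0.0 for v in x]

def dropout_rate(x, rate, seed=None):
    # tf.layers.dropout-style parameterization: rate is the fraction of
    # elements dropped, i.e. keep_prob = 1 - rate.
    return dropout_keep_prob(x, 1.0 - rate, seed=seed)

x = [1.0, 2.0, 3.0, 4.0]
# Same seed, keep_prob = 1 - rate → identical outputs.
print(dropout_keep_prob(x, 0.8, seed=0) == dropout_rate(x, 0.2, seed=0))  # True
```

With a fixed seed the two calls are bit-identical, which is exactly the equivalence stated above.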
In addition, tf.layers.dropout has a training parameter: "Whether to return the output in training mode (apply dropout) or in inference mode (return the input untouched)."
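The effect of that switch can be sketched in plain Python. `dropout_layer` below is a hypothetical stand-in, not the TensorFlow implementation; it only illustrates the documented behavior of the `training` flag.

```python
import random

def dropout_layer(x, rate=0.5, training=False, seed=None):
    # Mimics the tf.layers.dropout `training` switch: in inference mode
    # (training=False) the input is returned untouched; dropout is only
    # applied in training mode.
    if not training:
        return list(x)
    keep_prob = 1.0 - rate
    rng = random.Random(seed)
    return [v / keep_prob if rng.random() < keep_prob else 0.0 for v in x]

x = [1.0, 2.0, 3.0]
print(dropout_layer(x, rate=0.5, training=False))  # [1.0, 2.0, 3.0] — untouched
```

tf.nn.dropout has no such flag: the caller must skip the op (or pass keep_prob=1.0) at inference time.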
tf.layers is a high-level API, while tf.nn is a low-level API.
tf.layers is a higher-level wrapper, and tf.nn.dropout comes from TensorFlow's low-level library. tf.nn.dropout has been around since the first public release of TensorFlow (version 0.6?), while tf.layers.dropout appeared around version 1.0. As far as I know, the community develops new features in tf.contrib, whose interfaces are likely to change. Once the interfaces (params, param names, etc.) are stable, these are transferred to tf.layers.