tf.contrib.layers.fully_connected(
inputs,
num_outputs,
activation_fn=tf.nn.relu,
normalizer_fn=None,
normalizer_params=None,
weights_initializer=initializers.xavier_initializer(),
weights_regularizer=None,
biases_initializer=tf.zeros_initializer(),
biases_regularizer=None,
reuse=None,
variables_collections=None,
outputs_collections=None,
trainable=True,
scope=None
)
Adds a fully connected layer.
The weights w and biases b are created and initialized automatically.
The activation function defaults to relu.
The number of outputs is set by num_outputs.
Reference:
https://tensorflow.google.cn/api_docs/python/tf/contrib/layers/fully_connected
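The computation the layer performs can be sketched in plain NumPy. This is not the TensorFlow implementation, just an illustration of the defaults listed above: Xavier-uniform weights, zero biases, and relu as the activation, with the output width set by num_outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def fully_connected(inputs, num_outputs):
    """Sketch of the layer's math: relu(inputs @ w + b).

    Mirrors the defaults above: w ~ Xavier/Glorot uniform
    (weights_initializer), b = 0 (biases_initializer),
    activation_fn = relu.
    """
    num_inputs = inputs.shape[-1]
    limit = np.sqrt(6.0 / (num_inputs + num_outputs))  # Xavier uniform bound
    w = rng.uniform(-limit, limit, size=(num_inputs, num_outputs))
    b = np.zeros(num_outputs)
    return np.maximum(inputs @ w + b, 0.0)  # default activation_fn: relu

x = rng.normal(size=(4, 8))  # batch of 4 samples, 8 input features
y = fully_connected(x, 3)    # num_outputs=3
print(y.shape)               # (4, 3): output width set by num_outputs
print((y >= 0).all())        # True: relu zeroes out negatives
```

Passing activation_fn=None in the real API skips the relu and returns the raw linear output.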