The ReLU function: tf.nn.relu()
- Use ReLU as the activation function for the hidden layer:
- hidden_layer = tf.add(tf.matmul(features, hidden_weights), hidden_biases)
- hidden_layer = tf.nn.relu(hidden_layer)
- output = tf.add(tf.matmul(hidden_layer, output_weights), output_biases)
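The three steps above can be sketched end-to-end in plain NumPy, which makes explicit what tf.nn.relu computes: an element-wise max(0, x) that zeroes out negative activations. All weight, bias, and feature values below are illustrative assumptions, not taken from the exercise.

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x); negative entries become 0.
    return np.maximum(0.0, x)

# Illustrative values only: 1 sample with 2 features,
# a 3-unit hidden layer, and a single output unit.
features = np.array([[2.0, -1.0]])
hidden_weights = np.array([[0.1, 0.2, 0.4],
                           [0.4, 0.6, 0.6]])   # 2 -> 3
hidden_biases = np.zeros(3)
output_weights = np.array([[0.1], [0.2], [0.3]])  # 3 -> 1
output_biases = np.zeros(1)

# Same structure as the tf.add/tf.matmul/tf.nn.relu steps above.
hidden_layer = features @ hidden_weights + hidden_biases
hidden_layer = relu(hidden_layer)   # mirrors tf.nn.relu(hidden_layer)
output = hidden_layer @ output_weights + output_biases
print(output)
```

With these values the pre-activation hidden layer is [-0.2, -0.2, 0.2]; ReLU clips the first two units to 0 and passes the third through unchanged, so only that unit contributes to the output.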
import tensorflow as tf
output = None
hidden_layer_weights = [
[0.1, 0.2, 0.4],
[0.4, 0.6, 0.6],
[0.5, 0.9, 0.1