In a TensorFlow convolutional network, the output of the last convolution/pooling layer is flattened before it is fed into the fully connected layer.
For example: re1 = tf.reshape(pool4, [-1, 6 * 6 * 128])
But halfway through building a network I sometimes lose track of what 6 * 6 * 128 actually comes out to. It turns out there is a simpler way to write it:
For example: re1 = tf.reshape(pool4, [1, -1])
Let's verify that the two writings produce the same result:
Code:
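As a quick sketch of what the flatten step does, here is the same reshape in NumPy, which follows the same -1 inference rule as tf.reshape. The (2, 6, 6, 128) shape is a made-up example of a pooled feature map, not taken from any particular network:

```python
import numpy as np

# Hypothetical pool4 output: batch of 2, spatial 6x6, 128 channels.
pool4 = np.zeros((2, 6, 6, 128))

# -1 tells reshape to infer that dimension from the total element count.
flat = np.reshape(pool4, [-1, 6 * 6 * 128])
print(flat.shape)  # (2, 4608): one row of 4608 features per example
```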
import tensorflow as tf
import numpy as np
t = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(t) # [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
t1 = tf.reshape(t, [-1,9])
print(t1) #Tensor("Reshape_17:0", shape=(1, 9), dtype=int32)
t3 = tf.reshape(t, [1,-1])
print(t3) #Tensor("Reshape_18:0", shape=(1, 9), dtype=int32)
a = [[[1, 1], [2, 2]],[[3, 3], [4, 4]]]
print(a) # [[[1, 1], [2, 2]], [[3, 3], [4, 4]]]
a1 = np.reshape(a, [-1,8])
print(a1) # [[1 1 2 2 3 3 4 4]]
a2 = np.reshape(a, [1,-1])
print(a2) # [[1 1 2 2 3 3 4 4]]
Conclusion: in this example the two writings do give the same result. One caveat, though: they only coincide because the result here has a single row. With a batch of N > 1 examples, tf.reshape(pool4, [-1, 6 * 6 * 128]) produces shape (N, 4608), while tf.reshape(pool4, [1, -1]) produces (1, N * 4608). So the [1, -1] shortcut is only safe when the batch dimension is 1; a shortcut that keeps the batch dimension is tf.reshape(pool4, [N, -1]).
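The equivalence above depends on the leading dimension resolving to 1. A quick NumPy check (same -1 inference rule as tf.reshape) with a pretend batch of two examples shows where the two writings diverge:

```python
import numpy as np

# Pretend batch of 2 examples, each flattening to 6 elements.
batch = np.arange(12).reshape(2, 2, 3)

a = np.reshape(batch, [-1, 6])   # infer the batch dimension
b = np.reshape(batch, [1, -1])   # force everything into one row

print(a.shape)  # (2, 6): one row per example
print(b.shape)  # (1, 12): both examples squashed together
```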