Calling the same layer object multiple times is what "sharing" that layer means — every call reuses the same weights.
from keras.layers import Input, Flatten, Dense
from keras.models import Model

input1 = Input(shape=[28, 28])
input2 = Input(shape=[28, 28])
x1 = Flatten()(input1)
x1 = Dense(60, activation="relu")(x1)
x2 = Flatten()(input2)
x2 = Dense(60, activation="relu")(x2)
d = Dense(10, activation='softmax')  # shared layer: called on both x1 and x2
output1 = d(x1)
output2 = d(x2)
model1 = Model(input=[input1], output=[output1])
model2 = Model(input=[input2], output=[output2])
print(model1.summary())
print(model2.summary())
Output:
____________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
====================================================================================================
input_1 (InputLayer) (None, 28, 28) 0
____________________________________________________________________________________________________
flatten_1 (Flatten) (None, 784) 0 input_1[0][0]
____________________________________________________________________________________________________
dense_1 (Dense) (None, 60) 47100 flatten_1[0][0]
____________________________________________________________________________________________________
dense_3 (Dense) (None, 10) 610 dense_1[0][0]
====================================================================================================
Total params: 47710
____________________________________________________________________________________________________
None
____________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
====================================================================================================
input_2 (InputLayer) (None, 28, 28) 0
____________________________________________________________________________________________________
flatten_2 (Flatten) (None, 784) 0 input_2[0][0]
____________________________________________________________________________________________________
dense_2 (Dense) (None, 60) 47100 flatten_2[0][0]
____________________________________________________________________________________________________
dense_3 (Dense) (None, 10) 610 dense_2[0][0]
====================================================================================================
Total params: 47710
____________________________________________________________________________________________________
None
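As a quick sanity check (my own arithmetic, not from the original post), the parameter counts in the summaries can be reproduced by hand — each Dense layer has `input_dim * units` weights plus `units` biases, and the shared Dense(10) head is counted once in each model:

```python
# Dense(60) applied to the flattened 28*28 = 784 input:
dense_60_params = 28 * 28 * 60 + 60   # kernel + bias
# Shared Dense(10) head, taking the 60-dim activations:
dense_10_params = 60 * 10 + 10        # kernel + bias
total = dense_60_params + dense_10_params
print(dense_60_params, dense_10_params, total)
```

This matches the 47100, 610, and 47710 shown in both summaries.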
But if we change x2 to:
x2 = Dense(70, activation="relu")(x2)
the call d(x2) raises an error:
Exception: Input 0 is incompatible with layer dense_3: expected shape=(None, 60), found shape=(None, 70)
That's because the first call, d(x1), already fixed the shared layer's weight matrix at shape (60, 10), so dense_3 only accepts 60-dimensional inputs. Sharing a layer means sharing its parameters — which is exactly the point. Clear now, I hope!
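To make the mechanism concrete without Keras, here is a toy stand-in (my own sketch; `SharedDense` is a hypothetical class, not a Keras API) that mimics how a shared layer builds its weights lazily on the first call and then rejects inputs of a different dimension:

```python
import numpy as np

class SharedDense:
    """Toy stand-in for a shared Dense layer: the weight matrix is
    created lazily on the first call and reused on every later call."""
    def __init__(self, units):
        self.units = units
        self.W = None  # not built until the first call

    def __call__(self, x):
        if self.W is None:
            # First call fixes the weight shape: (input_dim, units)
            self.W = np.random.randn(x.shape[-1], self.units)
        elif x.shape[-1] != self.W.shape[0]:
            raise ValueError(
                "expected input dim %d, got %d"
                % (self.W.shape[0], x.shape[-1]))
        return x @ self.W

d = SharedDense(10)
out1 = d(np.zeros((1, 60)))   # first call builds W with shape (60, 10)
out2 = d(np.zeros((1, 60)))   # second call reuses the same W
try:
    d(np.zeros((1, 70)))      # mismatched input dim -> error, like dense_3
except ValueError as e:
    print("error:", e)
```

The second call succeeds because its input dimension matches the already-built weights; the 70-dimensional input fails for the same reason the Keras example does.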