Sharing a Layer in Keras

Calling the same layer instance multiple times is what it means to share that layer: every call reuses the same weights. A minimal example (written against the Keras 1 functional API, which matches the summary output below):

from keras.layers import Input, Flatten, Dense
from keras.models import Model

input1 = Input(shape=(28, 28))
input2 = Input(shape=(28, 28))
x1 = Flatten()(input1)
x1 = Dense(60, activation="relu")(x1)
x2 = Flatten()(input2)
x2 = Dense(60, activation="relu")(x2)

# d is a single Dense instance; calling it on both branches shares its weights
d = Dense(10, activation='softmax')

output1 = d(x1)
output2 = d(x2)
model1 = Model(input=[input1], output=[output1])
model2 = Model(input=[input2], output=[output2])
print(model1.summary())
print(model2.summary())

Output:

____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_1 (InputLayer)             (None, 28, 28)        0                                            
____________________________________________________________________________________________________
flatten_1 (Flatten)              (None, 784)           0           input_1[0][0]                    
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 60)            47100       flatten_1[0][0]                  
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 10)            610         dense_1[0][0]                    
====================================================================================================
Total params: 47710
____________________________________________________________________________________________________
None
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_2 (InputLayer)             (None, 28, 28)        0                                            
____________________________________________________________________________________________________
flatten_2 (Flatten)              (None, 784)           0           input_2[0][0]                    
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 60)            47100       flatten_2[0][0]                  
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 10)            610         dense_2[0][0]                    
====================================================================================================
Total params: 47710
____________________________________________________________________________________________________
None
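
Both summaries list the same dense_3 layer with the same 610 parameters, because it is one layer object that appears in two models. A quick check of this (a minimal sketch; the layer indexing assumes the models defined above):

shared = model1.layers[-1]            # dense_3 is topologically last in both models
assert shared is model2.layers[-1]    # same Python object, hence shared weights
print(shared.get_weights()[0].shape)  # (60, 10): the kernel built on the first call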

If we change x2 to:

x2 = Dense(70, activation="relu")(x2)

then the call output2 = d(x2) raises:

Exception: Input 0 is incompatible with layer dense_3: expected shape=(None, 60), found shape=(None, 70)

This happens because a layer's weights are created on its first call: when d was applied to x1, its weight matrix w was built with shape (60, 10), so every later call must receive 60-dimensional input. And that is exactly the point of a shared layer: all of its calls share the same parameters.
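
To see that it really is the first call that fixes the weight shape, here is a minimal standalone sketch (hypothetical snippet, same Keras API as above):

d = Dense(10, activation='softmax')
print(d.get_weights())           # []: no weights exist before the first call
h = d(Input(shape=(60,)))        # first call builds the (60, 10) kernel
print(d.get_weights()[0].shape)  # (60, 10)
h = d(Input(shape=(70,)))        # raises: input incompatible with the built kernel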
