Loading your own trained model in Keras and removing the fully connected layers

This post shows how to load an already-trained model in Keras and how to strip off its fully connected layers, illustrating how the network structure changes in the process.

It is actually quite simple:

from keras.models import load_model

base_model = load_model('model_resenet.h5')  # load the saved model file
base_model.summary()  # print the network architecture (summary() prints directly and returns None, so wrapping it in print() would also print "None")

This is the output for my network model; it is essentially a text diagram of its structure:

__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            (None, 227, 227, 1)  0                                            
__________________________________________________________________________________________________
conv2d_1 (Conv2D)               (None, 225, 225, 32) 320         input_1[0][0]                    
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 225, 225, 32) 128         conv2d_1[0][0]                   
__________________________________________________________________________________________________
activation_1 (Activation)       (None, 225, 225, 32) 0           batch_normalization_1[0][0]      
__________________________________________________________________________________________________
conv2d_2 (Conv2D)               (None, 225, 225, 32) 9248        activation_1[0][0]               
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 225, 225, 32) 128         conv2d_2[0][0]                   
__________________________________________________________________________________________________
activation_2 (Activation)       (None, 225, 225, 32) 0           batch_normalization_2[0][0]      
__________________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, 225, 225, 32) 9248        activation_2[0][0]               
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 225, 225, 32) 128         conv2d_3[0][0]                   
__________________________________________________________________________________________________
merge_1 (Merge)                 (None, 225, 225, 32) 0           batch_normalization_3[0][0]      
                                                                 activation_1[0][0]               
__________________________________________________________________________________________________
activation_3 (Activation)       (None, 225, 225, 32) 0           merge_1[0][0]                    
__________________________________________________________________________________________________
conv2d_4 (Conv2D)               (None, 225, 225, 32) 9248        activation_3[0][0]               
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 225, 225, 32) 128         conv2d_4[0][0]                   
__________________________________________________________________________________________________
activation_4 (Activation)       (None, 225, 225, 32) 0           batch_normalization_4[0][0]      
__________________________________________________________________________________________________
conv2d_5 (Conv2D)               (None, 225, 225, 32) 9248        activation_4[0][0]               
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 225, 225, 32) 128         conv2d_5[0][0]                   
__________________________________________________________________________________________________
merge_2 (Merge)                 (None, 225, 225, 32) 0           batch_normalization_5[0][0]      
                                                                 activation_3[0][0]               
__________________________________________________________________________________________________
activation_5 (Activation)       (None, 225, 225, 32) 0           merge_2[0][0]                    
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D)  (None, 112, 112, 32) 0           activation_5[0][0]               
__________________________________________________________________________________________________
conv2d_6 (Conv2D)               (None, 110, 110, 64) 18496       max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 110, 110, 64) 256         conv2d_6[0][0]                   
__________________________________________________________________________________________________
activation_6 (Activation)       (None, 110, 110, 64) 0           batch_normalization_6[0][0]      
__________________________________________________________________________________________________
conv2d_7 (Conv2D)               (None, 110, 110, 64) 36928       activation_6[0][0]               
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 110, 110, 64) 256         conv2d_7[0][0]                   
__________________________________________________________________________________________________
activation_7 (Activation)       (None, 110, 110, 64) 0           batch_normalization_7[0][0]      
__________________________________________________________________________________________________
conv2d_8 (Conv2D)               (None, 110, 110, 64) 36928       activation_7[0][0]               
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 110, 110, 64) 256         conv2d_8[0][0]                   
__________________________________________________________________________________________________
merge_3 (Merge)                 (None, 110, 110, 64) 0           batch_normalization_8[0][0]      
                                                                 activation_6[0][0]               
__________________________________________________________________________________________________
activation_8 (Activation)       (None, 110, 110, 64) 0           merge_3[0][0]                    
__________________________________________________________________________________________________
conv2d_9 (Conv2D)               (None, 110, 110, 64) 36928       activation_8[0][0]               
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 110, 110, 64) 256         conv2d_9[0][0]                   
__________________________________________________________________________________________________
activation_9 (Activation)       (None, 110, 110, 64) 0           batch_normalization_9[0][0]      
__________________________________________________________________________________________________
conv2d_10 (Conv2D)              (None, 110, 110, 64) 36928       activation_9[0][0]               
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 110, 110, 64) 256         conv2d_10[0][0]                  
__________________________________________________________________________________________________
merge_4 (Merge)                 (None, 110, 110, 64) 0           batch_normalization_10[0][0]     
                                                                 activation_8[0][0]               
__________________________________________________________________________________________________
activation_10 (Activation)      (None, 110, 110, 64) 0           merge_4[0][0]                    
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D)  (None, 55, 55, 64)   0           activation_10[0][0]
(the rest of the summary, including the fully connected layers, is truncated in the original post)
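Once you can see the layer names in the summary, removing the fully connected layers comes down to building a new `Model` whose output is an earlier layer of the loaded network. The sketch below is a minimal, self-contained illustration of the idea: since the `.h5` file is not available here, it builds a tiny stand-in model instead of calling `load_model`, and the layer name `last_conv` is an assumption — in practice you would use the name of the last layer you want to keep, as shown by `summary()` (for the network above, for example, a pooling or activation layer before the truncated head).

```python
# Sketch: keep the convolutional base of a trained model and drop the
# fully connected head. The stand-in model below replaces
# load_model('model_resenet.h5') so the example runs on its own;
# 'last_conv' is a hypothetical layer name for illustration.
from keras.models import Model
from keras.layers import Input, Conv2D, Flatten, Dense

# Tiny stand-in for the loaded model: conv base + FC classification head.
inp = Input(shape=(32, 32, 1))
x = Conv2D(8, (3, 3), activation='relu', name='last_conv')(inp)
x = Flatten(name='flatten')(x)
out = Dense(10, activation='softmax', name='fc')(x)
base_model = Model(inp, out)

# Option 1: cut at a named layer (look the name up in base_model.summary()).
feature_extractor = Model(inputs=base_model.input,
                          outputs=base_model.get_layer('last_conv').output)

# Option 2: cut by index, e.g. drop the last two layers (Flatten + Dense).
feature_extractor_by_index = Model(inputs=base_model.input,
                                   outputs=base_model.layers[-3].output)

print(feature_extractor.output_shape)  # (None, 30, 30, 8): no Dense layer left
```

The truncated model shares its weights with the original, so it can be used directly as a feature extractor with `feature_extractor.predict(...)`, or extended with new layers and fine-tuned.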