"Deep Learning with Caffe in Practice": Visualizing a caffemodel

With what we have covered so far, we can already train on a variety of datasets. Once solver.prototxt is configured, Caffe saves the trained model for us, e.g. lenet_iter_10000.caffemodel. How often a snapshot is written is controlled by the snapshot setting, and the output path and file-name prefix are set by snapshot_prefix. The .caffemodel file stores only the parameters of each layer (net.params); it contains no activation data (net.blobs). Alongside it, Caffe also writes a matching .solverstate file, which is similar to the caffemodel but carries some extra information such as the model name and the current iteration number. The two serve different purposes: the caffemodel saved after training is what you load at test time for classification, while the solverstate is a checkpoint used to resume training after an unexpected interruption (much like resuming a paused download).
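To make the difference concrete, here is a minimal pycaffe sketch (the lenet.prototxt and solver.prototxt file names are hypothetical placeholders): the caffemodel is combined with a network definition for testing, while the solverstate is handed back to a solver to resume training.

import caffe

# Testing / deployment: network definition + trained weights from the caffemodel.
net = caffe.Net('lenet.prototxt', 'lenet_iter_10000.caffemodel', caffe.TEST)

# Resuming interrupted training: restore the solver state (iteration count,
# learning-rate schedule, momentum history, ...) and continue where we left off.
solver = caffe.get_solver('solver.prototxt')
solver.restore('lenet_iter_10000.solverstate')
solver.solve()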

Since the caffemodel stores the parameters of every layer of the model, we can extract those parameters and visualize them to see what the learned filters actually look like.

First train the cifar10 example (mnist works too) and keep one of the snapshots; the code below loads cifar10_quick_iter_4000.caffemodel, a snapshot taken at 4,000 iterations. We then use a Jupyter notebook for the visualization.

import numpy as np
import matplotlib.pyplot as plt
import os, sys

caffe_root = '/caffe/'
os.chdir(caffe_root)
sys.path.insert(0, caffe_root + 'python')  # make the pycaffe module importable
import caffe
# When running in a Jupyter notebook, also execute: %matplotlib inline


# Helper that tiles a layer's parameters into a single image
def show_feature(data, padsize=1, padval=0):
    # normalize the values to [0, 1] for display
    data -= data.min()
    data /= data.max()

    # force the number of filters to be square
    n = int(np.ceil(np.sqrt(data.shape[0])))
    padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
    data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))

    # tile the filters into an image
    data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
    data = data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])
    plt.imshow(data)
    plt.axis('off')

if __name__ == '__main__':
    plt.rcParams['figure.figsize'] = (8, 8)
    plt.rcParams['image.interpolation'] = 'nearest'
    plt.rcParams['image.cmap'] = 'gray'
    # Load the network definition and the trained weights, then list each layer's
    # name and parameter shape (note: net.params here, not net.blobs)
    net = caffe.Net(caffe_root + 'examples/cifar10/cifar10_quick.prototxt',
                    caffe_root + 'examples/cifar10/cifar10_quick_iter_4000.caffemodel',
                    caffe.TEST)
    print([(k, v[0].data.shape) for k, v in net.params.items()])
    # First convolution layer: weights of shape (32, 3, 5, 5), i.e. 32 filters of size 5x5 over 3 channels
    weight = net.params["conv1"][0].data
    print(weight.shape)
    # move the channel axis last so each 3-channel 5x5 filter is shown as an RGB tile
    show_feature(weight.transpose(0, 2, 3, 1))
    plt.show()
A layer has two kinds of parameters: the weights and the bias term, accessed as net.params["conv1"][0] and net.params["conv1"][1] respectively. Here we only visualize the weights, so we use net.params["conv1"][0].
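As a side note, the bias can be read in exactly the same way; a short sketch, reusing the net object created above:

bias = net.params["conv1"][1].data   # bias of conv1: one value per output filter, shape (32,)
print(bias.shape)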

Running the visualization script above produces the following output:

WARNING: Logging before InitGoogleLogging() is written to STDERR
W0608 20:29:20.503628 23539 _caffe.cpp:139] DEPRECATION WARNING - deprecated use of Python interface
W0608 20:29:20.503654 23539 _caffe.cpp:140] Use this instead (with the named "weights" parameter):
W0608 20:29:20.503659 23539 _caffe.cpp:142] Net('/caffe/examples/cifar10/cifar10_quick.prototxt', 1, weights='/caffe/examples/cifar10/cifar10_quick_iter_4000.caffemodel')
I0608 20:29:20.505218 23539 net.cpp:51] Initializing net from parameters: 
name: "CIFAR10_quick_test"
state {
  phase: TEST
  level: 0
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 1
      dim: 3
      dim: 32
      dim: 32
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 64
    pad: 2
    kernel_size: 5
    stride: 1
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "pool3"
  type: "Pooling"
  bottom: "conv3"
  top: "pool3"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool3"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 64
  }
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 10
  }
}
layer {
  name: "prob"
  type: "Softmax"
  bottom: "ip2"
  top: "prob"
}
I0608 20:29:20.505277 23539 layer_factory.hpp:77] Creating layer data
I0608 20:29:20.505292 23539 net.cpp:84] Creating Layer data
I0608 20:29:20.505302 23539 net.cpp:380] data -> data
I0608 20:29:20.505328 23539 net.cpp:122] Setting up data
I0608 20:29:20.505342 23539 net.cpp:129] Top shape: 1 3 32 32 (3072)
I0608 20:29:20.505350 23539 net.cpp:137] Memory required for data: 12288
I0608 20:29:20.505355 23539 layer_factory.hpp:77] Creating layer conv1
I0608 20:29:20.505364 23539 net.cpp:84] Creating Layer conv1
I0608 20:29:20.505369 23539 net.cpp:406] conv1 <- data
I0608 20:29:20.505375 23539 net.cpp:380] conv1 -> conv1
I0608 20:29:20.505400 23539 net.cpp:122] Setting up conv1
I0608 20:29:20.505409 23539 net.cpp:129] Top shape: 1 32 32 32 (32768)
I0608 20:29:20.505412 23539 net.cpp:137] Memory required for data: 143360
I0608 20:29:20.505424 23539 layer_factory.hpp:77] Creating layer pool1
I0608 20:29:20.505430 23539 net.cpp:84] Creating Layer pool1
I0608 20:29:20.505434 23539 net.cpp:406] pool1 <- conv1
I0608 20:29:20.505440 23539 net.cpp:380] pool1 -> pool1
I0608 20:29:20.505450 23539 net.cpp:122] Setting up pool1
I0608 20:29:20.505455 23539 net.cpp:129] Top shape: 1 32 16 16 (8192)
I0608 20:29:20.505460 23539 net.cpp:137] Memory required for data: 176128
I0608 20:29:20.505463 23539 layer_factory.hpp:77] Creating layer relu1
I0608 20:29:20.505468 23539 net.cpp:84] Creating Layer relu1
I0608 20:29:20.505472 23539 net.cpp:406] relu1 <- pool1
I0608 20:29:20.505477 23539 net.cpp:367] relu1 -> pool1 (in-place)
I0608 20:29:20.505483 23539 net.cpp:122] Setting up relu1
I0608 20:29:20.505489 23539 net.cpp:129] Top shape: 1 32 16 16 (8192)
I0608 20:29:20.505492 23539 net.cpp:137] Memory required for data: 208896
I0608 20:29:20.505496 23539 layer_factory.hpp:77] Creating layer conv2
I0608 20:29:20.505503 23539 net.cpp:84] Creating Layer conv2
I0608 20:29:20.505506 23539 net.cpp:406] conv2 <- pool1
I0608 20:29:20.505512 23539 net.cpp:380] conv2 -> conv2
I0608 20:29:20.505544 23539 net.cpp:122] Setting up conv2
I0608 20:29:20.505553 23539 net.cpp:129] Top shape: 1 32 16 16 (8192)
I0608 20:29:20.505556 23539 net.cpp:137] Memory required for data: 241664
I0608 20:29:20.505565 23539 layer_factory.hpp:77] Creating layer relu2
I0608 20:29:20.505570 23539 net.cpp:84] Creating Layer relu2
I0608 20:29:20.505574 23539 net.cpp:406] relu2 <- conv2
I0608 20:29:20.505579 23539 net.cpp:367] relu2 -> conv2 (in-place)
I0608 20:29:20.505585 23539 net.cpp:122] Setting up relu2
I0608 20:29:20.505590 23539 net.cpp:129] Top shape: 1 32 16 16 (8192)
I0608 20:29:20.505594 23539 net.cpp:137] Memory required for data: 274432
I0608 20:29:20.505597 23539 layer_factory.hpp:77] Creating layer pool2
I0608 20:29:20.505602 23539 net.cpp:84] Creating Layer pool2
I0608 20:29:20.505606 23539 net.cpp:406] pool2 <- conv2
I0608 20:29:20.505611 23539 net.cpp:380] pool2 -> pool2
I0608 20:29:20.505619 23539 net.cpp:122] Setting up pool2
I0608 20:29:20.505625 23539 net.cpp:129] Top shape: 1 32 8 8 (2048)
I0608 20:29:20.505627 23539 net.cpp:137] Memory required for data: 282624
I0608 20:29:20.505631 23539 layer_factory.hpp:77] Creating layer conv3
I0608 20:29:20.505637 23539 net.cpp:84] Creating Layer conv3
I0608 20:29:20.505641 23539 net.cpp:406] conv3 <- pool2
I0608 20:29:20.505646 23539 net.cpp:380] conv3 -> conv3
I0608 20:29:20.505695 23539 net.cpp:122] Setting up conv3
I0608 20:29:20.505703 23539 net.cpp:129] Top shape: 1 64 8 8 (4096)
I0608 20:29:20.505707 23539 net.cpp:137] Memory required for data: 299008
I0608 20:29:20.505715 23539 layer_factory.hpp:77] Creating layer relu3
I0608 20:29:20.505722 23539 net.cpp:84] Creating Layer relu3
I0608 20:29:20.505726 23539 net.cpp:406] relu3 <- conv3
I0608 20:29:20.505731 23539 net.cpp:367] relu3 -> conv3 (in-place)
I0608 20:29:20.505736 23539 net.cpp:122] Setting up relu3
I0608 20:29:20.505741 23539 net.cpp:129] Top shape: 1 64 8 8 (4096)
I0608 20:29:20.505745 23539 net.cpp:137] Memory required for data: 315392
I0608 20:29:20.505748 23539 layer_factory.hpp:77] Creating layer pool3
I0608 20:29:20.505753 23539 net.cpp:84] Creating Layer pool3
I0608 20:29:20.505758 23539 net.cpp:406] pool3 <- conv3
I0608 20:29:20.505762 23539 net.cpp:380] pool3 -> pool3
I0608 20:29:20.505769 23539 net.cpp:122] Setting up pool3
I0608 20:29:20.505774 23539 net.cpp:129] Top shape: 1 64 4 4 (1024)
I0608 20:29:20.505779 23539 net.cpp:137] Memory required for data: 319488
I0608 20:29:20.505781 23539 layer_factory.hpp:77] Creating layer ip1
I0608 20:29:20.505787 23539 net.cpp:84] Creating Layer ip1
I0608 20:29:20.505791 23539 net.cpp:406] ip1 <- pool3
I0608 20:29:20.505796 23539 net.cpp:380] ip1 -> ip1
I0608 20:29:20.505894 23539 net.cpp:122] Setting up ip1
I0608 20:29:20.505903 23539 net.cpp:129] Top shape: 1 64 (64)
I0608 20:29:20.505906 23539 net.cpp:137] Memory required for data: 319744
I0608 20:29:20.505913 23539 layer_factory.hpp:77] Creating layer ip2
I0608 20:29:20.505919 23539 net.cpp:84] Creating Layer ip2
I0608 20:29:20.505923 23539 net.cpp:406] ip2 <- ip1
I0608 20:29:20.505929 23539 net.cpp:380] ip2 -> ip2
I0608 20:29:20.505942 23539 net.cpp:122] Setting up ip2
I0608 20:29:20.505949 23539 net.cpp:129] Top shape: 1 10 (10)
I0608 20:29:20.505952 23539 net.cpp:137] Memory required for data: 319784
I0608 20:29:20.505960 23539 layer_factory.hpp:77] Creating layer prob
I0608 20:29:20.505967 23539 net.cpp:84] Creating Layer prob
I0608 20:29:20.505971 23539 net.cpp:406] prob <- ip2
I0608 20:29:20.505976 23539 net.cpp:380] prob -> prob
I0608 20:29:20.505985 23539 net.cpp:122] Setting up prob
I0608 20:29:20.505990 23539 net.cpp:129] Top shape: 1 10 (10)
I0608 20:29:20.505993 23539 net.cpp:137] Memory required for data: 319824
I0608 20:29:20.505997 23539 net.cpp:200] prob does not need backward computation.
I0608 20:29:20.506001 23539 net.cpp:200] ip2 does not need backward computation.
I0608 20:29:20.506005 23539 net.cpp:200] ip1 does not need backward computation.
I0608 20:29:20.506008 23539 net.cpp:200] pool3 does not need backward computation.
I0608 20:29:20.506012 23539 net.cpp:200] relu3 does not need backward computation.
I0608 20:29:20.506016 23539 net.cpp:200] conv3 does not need backward computation.
I0608 20:29:20.506021 23539 net.cpp:200] pool2 does not need backward computation.
I0608 20:29:20.506024 23539 net.cpp:200] relu2 does not need backward computation.
I0608 20:29:20.506028 23539 net.cpp:200] conv2 does not need backward computation.
I0608 20:29:20.506031 23539 net.cpp:200] relu1 does not need backward computation.
I0608 20:29:20.506036 23539 net.cpp:200] pool1 does not need backward computation.
I0608 20:29:20.506039 23539 net.cpp:200] conv1 does not need backward computation.
I0608 20:29:20.506043 23539 net.cpp:200] data does not need backward computation.
I0608 20:29:20.506047 23539 net.cpp:242] This network produces output prob
I0608 20:29:20.506058 23539 net.cpp:255] Network initialization done.
I0608 20:29:20.507083 23539 net.cpp:744] Ignoring source layer cifar
I0608 20:29:20.507220 23539 net.cpp:744] Ignoring source layer loss
(32, 3, 5, 5)

import numpy as np
import matplotlib.pyplot as plt
import os, sys

caffe_root = '/caffe/'
os.chdir(caffe_root)
sys.path.insert(0, caffe_root + 'python')  # make the pycaffe module importable
import caffe
# When running in a Jupyter notebook, also execute: %matplotlib inline


# Helper that tiles a layer's parameters into a single image
def show_feature(data, padsize=1, padval=0):
    # normalize the values to [0, 1] for display
    data -= data.min()
    data /= data.max()

    # force the number of filters to be square
    n = int(np.ceil(np.sqrt(data.shape[0])))
    padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
    data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))

    # tile the filters into an image
    data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
    data = data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])
    plt.imshow(data)
    plt.axis('off')

if __name__ == '__main__':
    plt.rcParams['figure.figsize'] = (8, 8)
    plt.rcParams['image.interpolation'] = 'nearest'
    plt.rcParams['image.cmap'] = 'gray'
    # Load the network definition and the trained weights, then list each layer's
    # name and parameter shape (note: net.params here, not net.blobs)
    net = caffe.Net(caffe_root + 'examples/cifar10/cifar10_quick.prototxt',
                    caffe_root + 'examples/cifar10/cifar10_quick_iter_4000.caffemodel',
                    caffe.TEST)
    print([(k, v[0].data.shape) for k, v in net.params.items()])
    # First convolution layer: weights of shape (32, 3, 5, 5), i.e. 32 filters of size 5x5 over 3 channels
    #weight = net.params["conv1"][0].data
    #print(weight.shape)
    #show_feature(weight.transpose(0, 2, 3, 1))

    # Second convolution layer: 32*32 filters in total, each of size 5x5 (see the note after this listing)
    weight = net.params["conv2"][0].data
    print(weight.shape)
    show_feature(weight.reshape(32 ** 2, 5, 5))

    plt.show()
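The reshape works because the conv2 weights have shape (32, 32, 5, 5): 32 output filters, each spanning 32 input channels, so flattening the first two axes yields 32*32 = 1024 single-channel 5x5 kernels that show_feature can tile. A quick sanity check with synthetic data of the same shape:

import numpy as np

w = np.random.randn(32, 32, 5, 5)      # (out_channels, in_channels, kH, kW), dummy values
tiles = w.reshape(32 * 32, 5, 5)       # 1024 single-channel 5x5 kernels
assert tiles.shape == (1024, 5, 5)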

Running the second script produces:

The Caffe initialization log is identical to the first run and is omitted here. The printed weight shape is:

(32, 32, 5, 5)

import numpy as np
import matplotlib.pyplot as plt
import os, sys

caffe_root = '/caffe/'
os.chdir(caffe_root)
sys.path.insert(0, caffe_root + 'python')  # make the pycaffe module importable
import caffe
# When running in a Jupyter notebook, also execute: %matplotlib inline


# Helper that tiles a layer's parameters into a single image
def show_feature(data, padsize=1, padval=0):
    # normalize the values to [0, 1] for display
    data -= data.min()
    data /= data.max()

    # force the number of filters to be square
    n = int(np.ceil(np.sqrt(data.shape[0])))
    padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
    data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))

    # tile the filters into an image
    data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
    data = data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])
    plt.imshow(data)
    plt.axis('off')

if __name__ == '__main__':
    plt.rcParams['figure.figsize'] = (8, 8)
    plt.rcParams['image.interpolation'] = 'nearest'
    plt.rcParams['image.cmap'] = 'gray'
    # Load the network definition and the trained weights, then list each layer's
    # name and parameter shape (note: net.params here, not net.blobs)
    net = caffe.Net(caffe_root + 'examples/cifar10/cifar10_quick.prototxt',
                    caffe_root + 'examples/cifar10/cifar10_quick_iter_4000.caffemodel',
                    caffe.TEST)
    print([(k, v[0].data.shape) for k, v in net.params.items()])
    # First convolution layer: weights of shape (32, 3, 5, 5), i.e. 32 filters of size 5x5 over 3 channels
    #weight = net.params["conv1"][0].data
    #print(weight.shape)
    #show_feature(weight.transpose(0, 2, 3, 1))

    # Second convolution layer: 32*32 filters in total, each of size 5x5
    #weight = net.params["conv2"][0].data
    #print(weight.shape)
    #show_feature(weight.reshape(32 ** 2, 5, 5))

    # Third convolution layer: 64*32 filters in total, each 5x5; visualize the first 1024 of them
    weight = net.params["conv3"][0].data
    print(weight.shape)
    show_feature(weight.reshape(64 * 32, 5, 5)[:1024])

    plt.show()

Running the third script produces:

The Caffe initialization log is again identical to the first run and is omitted. The printed weight shape is:
(64, 32, 5, 5)

