Fully convolutional networks for image semantic segmentation: FCN dataset creation, network model definition, and network training (dataset and model files are provided for reference)
Updated December 29, 2016.
Download: self-made dataset, for reference
Download: my modified network
Paper: "Fully Convolutional Networks for Semantic Segmentation"
Code: the Caffe implementation of FCN
Dataset: Pascal VOC
Part 1: Dataset Creation
After downloading the Pascal VOC data, build the image dataset and the label dataset used for segmentation, in LMDB or LEVELDB format.
It is best to resize the images first (by padding, so the aspect ratio is preserved).
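That pad-then-resize step can be sketched as follows (a minimal numpy sketch, assuming a 224×224 target to match the data layers later in this post; nearest-neighbor sampling is used so that resized label images never pick up interpolated, non-class values — the helper name `pad_resize` is mine):

```python
import numpy as np

def pad_resize(img, size=224):
    """Zero-pad an HxW(xC) image to a square, then resize it to
    size x size with nearest-neighbor sampling."""
    h, w = img.shape[:2]
    side = max(h, w)
    pad = [(0, side - h), (0, side - w)] + [(0, 0)] * (img.ndim - 2)
    square = np.pad(img, pad, mode="constant")
    # nearest-neighbor index grid into the padded square
    idx = (np.arange(size) * side // size).astype(np.intp)
    return square[idx][:, idx]

label = np.random.randint(0, 21, (375, 500), dtype=np.uint8)
resized = pad_resize(label)
print(resized.shape)  # (224, 224)
```

For real images a proper interpolation library is preferable, but for labels the nearest-neighbor behavior above is exactly what you want.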
1. Folder layout
The data consists of the original images and the label images, as shown below.
Then build the corresponding LMDB files. All images can be split 4:1 into train:val. Each txt file only needs to list the image paths; no label column is required, because the label of an image is itself another image, which is specified separately in Caffe.
Img_train.txt
SegmentationImage/002120.png
SegmentationImage/002132.png
SegmentationImage/002142.png
SegmentationImage/002212.png
SegmentationImage/002234.png
SegmentationImage/002260.png
SegmentationImage/002266.png
SegmentationImage/002268.png
SegmentationImage/002273.png
SegmentationImage/002281.png
SegmentationImage/002284.png
SegmentationImage/002293.png
SegmentationImage/002361.png
Label_train.txt
SegmentationClass/002120.png
SegmentationClass/002132.png
SegmentationClass/002142.png
SegmentationClass/002212.png
SegmentationClass/002234.png
SegmentationClass/002260.png
SegmentationClass/002266.png
SegmentationClass/002268.png
SegmentationClass/002273.png
SegmentationClass/002281.png
SegmentationClass/002284.png
SegmentationClass/002293.png
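The 4:1 split and the four list files can be generated with a short script; a sketch, assuming the SegmentationImage/SegmentationClass directory names from the listings above (the function name `write_split` is mine):

```python
import os
import random

def write_split(image_dir, label_dir, out_dir, val_ratio=0.2, seed=0):
    """Shuffle the .png files once, split them 4:1 into train:val, and
    write Img_{train,val}.txt and Label_{train,val}.txt lists of relative
    paths (image and label share the same file name)."""
    names = sorted(f for f in os.listdir(image_dir) if f.endswith(".png"))
    random.Random(seed).shuffle(names)
    n_val = int(len(names) * val_ratio)
    for split, files in (("val", names[:n_val]), ("train", names[n_val:])):
        with open(os.path.join(out_dir, f"Img_{split}.txt"), "w") as fi, \
             open(os.path.join(out_dir, f"Label_{split}.txt"), "w") as fl:
            for name in files:
                fi.write(f"{os.path.basename(image_dir)}/{name}\n")
                fl.write(f"{os.path.basename(label_dir)}/{name}\n")
```

Fixing the shuffle seed keeps the split reproducible, so the image list and label list stay aligned line by line across reruns.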
Note: the labels have to be generated yourself, from the ground-truth images under SegmentationClass.
The pixel values for each class are:
Class        R    G    B
background   0    0    0
aeroplane    128  0    0
bicycle      0    128  0
bird         128  128  0
boat         0    0    128
bottle       128  0    128
bus          0    128  128
car          128  128  128
cat          64   0    0
chair        192  0    0
cow          64   128  0
diningtable  192  128  0
dog          64   0    128
horse        192  0    128
motorbike    64   128  128
person       192  128  128
pottedplant  0    64   0
sheep        128  64   0
sofa         0    192  0
train        128  192  0
tvmonitor    0    64   128
Process the ground-truth images in the dataset to generate the label images used for training.
Note that the label files must be single-channel grayscale; otherwise you get an error about the output of the score layer not matching the size of the label, caused by the channel mismatch.
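The RGB-to-gray conversion can be sketched with the color table above (a minimal numpy sketch; as an aside, the VOC SegmentationClass pngs are stored palette-indexed, so a palette-mode read already yields class indices directly — the explicit mapping below just makes the correspondence visible):

```python
import numpy as np

# (R, G, B) -> class index, taken from the table above
VOC_COLORS = [
    (0, 0, 0),      (128, 0, 0),    (0, 128, 0),    (128, 128, 0),
    (0, 0, 128),    (128, 0, 128),  (0, 128, 128),  (128, 128, 128),
    (64, 0, 0),     (192, 0, 0),    (64, 128, 0),   (192, 128, 0),
    (64, 0, 128),   (192, 0, 128),  (64, 128, 128), (192, 128, 128),
    (0, 64, 0),     (128, 64, 0),   (0, 192, 0),    (128, 192, 0),
    (0, 64, 128),
]

def rgb_to_label(rgb):
    """Map an HxWx3 uint8 ground-truth image to an HxW uint8 class-index
    image. Colors not in the table (e.g. the void border) become 255,
    which a loss layer can then skip via ignore_label: 255."""
    label = np.full(rgb.shape[:2], 255, dtype=np.uint8)
    for index, color in enumerate(VOC_COLORS):
        label[np.all(rgb == color, axis=-1)] = index
    return label
```

Saving the result of `rgb_to_label` as a single-channel png gives exactly the gray labels the data layer expects.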
Then generate the LMDB files, and the dataset is ready.
Part 2: Network Model Definition
The main issue here is data input: specify the data and the label as follows.
layer {
  name: "data"
  type: "Data"
  top: "data"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
    mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train_mean.binaryproto"
  }
  data_param {
    source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train"
    batch_size: 1
    backend: LMDB
  }
}
layer {
  name: "label"
  type: "Data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
    mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train_mean.binaryproto"
  }
  data_param {
    source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train"
    batch_size: 1
    backend: LMDB
  }
}
layer {
  name: "data"
  type: "Data"
  top: "data"
  include {
    phase: TEST
  }
  transform_param {
    scale: 0.00390625
    mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val_mean.binaryproto"
  }
  data_param {
    source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val"
    batch_size: 1
    backend: LMDB
  }
}
layer {
  name: "label"
  type: "Data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    scale: 0.00390625
    mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val_mean.binaryproto"
  }
  data_param {
    source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val"
    batch_size: 1
    backend: LMDB
  }
}
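The unusual pad: 100 on conv1_1 and the crop offsets (5, 9, and 31) in this FCN-8s style net are tied to the layer geometry. A sketch that reproduces the blob sizes appearing in the training log below for a 224×224 input (in Caffe, convolution output size rounds down while pooling rounds up; a crop offset is valid as long as offset + cropped size ≤ input size):

```python
import math

def conv(n, k, pad=0, stride=1):   # Caffe convolution: floor
    return (n + 2 * pad - k) // stride + 1

def pool(n, k=2, stride=2):        # Caffe pooling: ceil
    return math.ceil((n - k) / stride) + 1

def deconv(n, k, stride):          # transposed convolution
    return stride * (n - 1) + k

n = conv(224, 3, pad=100)   # conv1_1 -> 422 (the pad-1 3x3 convs keep the size)
n = pool(n)                 # pool1 -> 211
n = pool(n)                 # pool2 -> 106
pool3 = n = pool(n)         # pool3 -> 53
pool4 = n = pool(n)         # pool4 -> 27
n = pool(n)                 # pool5 -> 14
n = conv(n, 7)              # fc6 -> 8 (fc7 and score_fr keep it)
up2 = deconv(n, 4, 2)       # upscore2 -> 18, crop pool4 (27) at offset 5
up4 = deconv(up2, 4, 2)     # upscore_pool4 -> 38, crop pool3 (53) at offset 9
up8 = deconv(up4, 16, 8)    # upscore8 -> 312, crop to data (224) at offset 31
print(pool4, up2, up4, up8)  # 27 18 38 312
```

Without the pad of 100, this same arithmetic shows a 224×224 input shrinking to 7×7 at pool5, so the 7×7 fc6 kernel would leave a 1×1 map with no room to align the skip connections; the large pad plus the final crop back to the data size is how the reference FCN nets stay valid for arbitrary input sizes.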
Part 3: Network Training
It is best to finetune from pretrained weights; training from scratch makes the loss decrease far too slowly.
Log file created at: 2016/12/13 12:14:07
Running on machine: DESKTOP
Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
I1213 12:14:07.177220 1380 caffe.cpp:218] Using GPUs 0
I1213 12:14:07.436894 1380 caffe.cpp:223] GPU 0: GeForce GTX 960
I1213 12:14:07.758122 1380 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:07.758623 1380 solver.cpp:48] Initializing solver from parameters:
test_iter: 84
test_interval: 338
base_lr: 1e-014
display: 20
max_iter: 100000
lr_policy: "fixed"
momentum: 0.95
weight_decay: 0.0005
snapshot: 4000
snapshot_prefix: "FCN"
solver_mode: GPU
device_id: 0
net: "train_val.prototxt"
train_state {
level: 0
stage: ""
}
iter_size: 1
I1213 12:14:07.759624 1380 solver.cpp:91] Creating training net from net file: train_val.prototxt
I1213 12:14:07.760124 1380 net.cpp:332] The NetState phase (0) differed from the phase (1) specified by a rule in layer data
I1213 12:14:07.760124 1380 net.cpp:332] The NetState phase (0) differed from the phase (1) specified by a rule in layer label
I1213 12:14:07.760124 1380 net.cpp:332] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
I1213 12:14:07.761126 1380 net.cpp:58] Initializing net from parameters:
state {
phase: TRAIN
level: 0
stage: ""
}
layer {
name: "data"
type: "Data"
top: "data"
include {
phase: TRAIN
}
transform_param {
scale: 0.00390625
mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train_mean.binaryproto"
}
data_param {
source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train"
batch_size: 1
backend: LMDB
}
}
layer {
name: "label"
type: "Data"
top: "label"
include {
phase: TRAIN
}
transform_param {
scale: 0.00390625
mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train_mean.binaryproto"
}
data_param {
source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train"
batch_size: 1
backend: LMDB
}
}
layer {
name: "conv1_1"
type: "Convolution"
bottom: "data"
top: "conv1_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 100
kernel_size: 3
stride: 1
}
}
layer {
name: "relu1_1"
type: "ReLU"
bottom: "conv1_1"
top: "conv1_1"
}
layer {
name: "conv1_2"
type: "Convolution"
bottom: "conv1_1"
top: "conv1_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu1_2"
type: "ReLU"
bottom: "conv1_2"
top: "conv1_2"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1_2"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv2_1"
type: "Convolution"
bottom: "pool1"
top: "conv2_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu2_1"
type: "ReLU"
bottom: "conv2_1"
top: "conv2_1"
}
layer {
name: "conv2_2"
type: "Convolution"
bottom: "conv2_1"
top: "conv2_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu2_2"
type: "ReLU"
bottom: "conv2_2"
top: "conv2_2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2_2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv3_1"
type: "Convolution"
bottom: "pool2"
top: "conv3_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu3_1"
type: "ReLU"
bottom: "conv3_1"
top: "conv3_1"
}
layer {
name: "conv3_2"
type: "Convolution"
bottom: "conv3_1"
top: "conv3_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu3_2"
type: "ReLU"
bottom: "conv3_2"
top: "conv3_2"
}
layer {
name: "conv3_3"
type: "Convolution"
bottom: "conv3_2"
top: "conv3_3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu3_3"
type: "ReLU"
bottom: "conv3_3"
top: "conv3_3"
}
layer {
name: "pool3"
type: "Pooling"
bottom: "conv3_3"
top: "pool3"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv4_1"
type: "Convolution"
bottom: "pool3"
top: "conv4_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu4_1"
type: "ReLU"
bottom: "conv4_1"
top: "conv4_1"
}
layer {
name: "conv4_2"
type: "Convolution"
bottom: "conv4_1"
top: "conv4_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu4_2"
type: "ReLU"
bottom: "conv4_2"
top: "conv4_2"
}
layer {
name: "conv4_3"
type: "Convolution"
bottom: "conv4_2"
top: "conv4_3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu4_3"
type: "ReLU"
bottom: "conv4_3"
top: "conv4_3"
}
layer {
name: "pool4"
type: "Pooling"
bottom: "conv4_3"
top: "pool4"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv5_1"
type: "Convolution"
bottom: "pool4"
top: "conv5_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu5_1"
type: "ReLU"
bottom: "conv5_1"
top: "conv5_1"
}
layer {
name: "conv5_2"
type: "Convolution"
bottom: "conv5_1"
top: "conv5_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu5_2"
type: "ReLU"
bottom: "conv5_2"
top: "conv5_2"
}
layer {
name: "conv5_3"
type: "Convolution"
bottom: "conv5_2"
top: "conv5_3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu5_3"
type: "ReLU"
bottom: "conv5_3"
top: "conv5_3"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5_3"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "fc6"
type: "Convolution"
bottom: "pool5"
top: "fc6"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 4096
pad: 0
kernel_size: 7
stride: 1
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "Convolution"
bottom: "fc6"
top: "fc7"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 4096
pad: 0
kernel_size: 1
stride: 1
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "score_fr"
type: "Convolution"
bottom: "fc7"
top: "score_fr"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "upscore2"
type: "Deconvolution"
bottom: "score_fr"
top: "upscore2"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 4
stride: 2
}
}
layer {
name: "score_pool4"
type: "Convolution"
bottom: "pool4"
top: "score_pool4"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "score_pool4c"
type: "Crop"
bottom: "score_pool4"
bottom: "upscore2"
top: "score_pool4c"
crop_param {
axis: 2
offset: 5
}
}
layer {
name: "fuse_pool4"
type: "Eltwise"
bottom: "upscore2"
bottom: "score_pool4c"
top: "fuse_pool4"
eltwise_param {
operation: SUM
}
}
layer {
name: "upscore_pool4"
type: "Deconvolution"
bottom: "fuse_pool4"
top: "upscore_pool4"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 4
stride: 2
}
}
layer {
name: "score_pool3"
type: "Convolution"
bottom: "pool3"
top: "score_pool3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "score_pool3c"
type: "Crop"
bottom: "score_pool3"
bottom: "upscore_pool4"
top: "score_pool3c"
crop_param {
axis: 2
offset: 9
}
}
layer {
name: "fuse_pool3"
type: "Eltwise"
bottom: "upscore_pool4"
bottom: "score_pool3c"
top: "fuse_pool3"
eltwise_param {
operation: SUM
}
}
layer {
name: "upscore8"
type: "Deconvolution"
bottom: "fuse_pool3"
top: "upscore8"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 16
stride: 8
}
}
layer {
name: "score"
type: "Crop"
bottom: "upscore8"
bottom: "data"
top: "score"
crop_param {
axis: 2
offset: 31
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "score"
bottom: "label"
top: "loss"
loss_param {
ignore_label: 255
normalize: false
}
}
I1213 12:14:07.787643 1380 layer_factory.hpp:77] Creating layer data
I1213 12:14:07.788645 1380 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:07.789145 1380 net.cpp:100] Creating Layer data
I1213 12:14:07.789645 1380 net.cpp:418] data -> data
I1213 12:14:07.790145 12764 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:07.790145 1380 data_transformer.cpp:25] Loading mean file from: G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train_mean.binaryproto
I1213 12:14:07.791647 12764 db_lmdb.cpp:40] Opened lmdb G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_train
I1213 12:14:07.841182 1380 data_layer.cpp:41] output data size: 1,3,224,224
I1213 12:14:07.846186 1380 net.cpp:150] Setting up data
I1213 12:14:07.846688 1380 net.cpp:157] Top shape: 1 3 224 224 (150528)
I1213 12:14:07.849189 11676 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:07.849689 1380 net.cpp:165] Memory required for data: 602112
I1213 12:14:07.852190 1380 layer_factory.hpp:77] Creating layer data_data_0_split
I1213 12:14:07.853691 1380 net.cpp:100] Creating Layer data_data_0_split
I1213 12:14:07.855195 1380 net.cpp:444] data_data_0_split <- data
I1213 12:14:07.856194 1380 net.cpp:418] data_data_0_split -> data_data_0_split_0
I1213 12:14:07.857697 1380 net.cpp:418] data_data_0_split -> data_data_0_split_1
I1213 12:14:07.858695 1380 net.cpp:150] Setting up data_data_0_split
I1213 12:14:07.859695 1380 net.cpp:157] Top shape: 1 3 224 224 (150528)
I1213 12:14:07.862702 1380 net.cpp:157] Top shape: 1 3 224 224 (150528)
I1213 12:14:07.864199 1380 net.cpp:165] Memory required for data: 1806336
I1213 12:14:07.865211 1380 layer_factory.hpp:77] Creating layer label
I1213 12:14:07.866701 1380 net.cpp:100] Creating Layer label
I1213 12:14:07.867712 1380 net.cpp:418] label -> label
I1213 12:14:07.869706 2072 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:07.870203 1380 data_transformer.cpp:25] Loading mean file from: G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train_mean.binaryproto
I1213 12:14:07.873206 2072 db_lmdb.cpp:40] Opened lmdb G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_train
I1213 12:14:07.875710 1380 data_layer.cpp:41] output data size: 1,1,224,224
I1213 12:14:07.877709 1380 net.cpp:150] Setting up label
I1213 12:14:07.879212 1380 net.cpp:157] Top shape: 1 1 224 224 (50176)
I1213 12:14:07.881211 7064 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:07.882211 1380 net.cpp:165] Memory required for data: 2007040
I1213 12:14:07.883713 1380 layer_factory.hpp:77] Creating layer conv1_1
I1213 12:14:07.884716 1380 net.cpp:100] Creating Layer conv1_1
I1213 12:14:07.885215 1380 net.cpp:444] conv1_1 <- data_data_0_split_0
I1213 12:14:07.886214 1380 net.cpp:418] conv1_1 -> conv1_1
I1213 12:14:08.172420 1380 net.cpp:150] Setting up conv1_1
I1213 12:14:08.172919 1380 net.cpp:157] Top shape: 1 64 422 422 (11397376)
I1213 12:14:08.173419 1380 net.cpp:165] Memory required for data: 47596544
I1213 12:14:08.173919 1380 layer_factory.hpp:77] Creating layer relu1_1
I1213 12:14:08.173919 1380 net.cpp:100] Creating Layer relu1_1
I1213 12:14:08.173919 1380 net.cpp:444] relu1_1 <- conv1_1
I1213 12:14:08.174420 1380 net.cpp:405] relu1_1 -> conv1_1 (in-place)
I1213 12:14:08.174921 1380 net.cpp:150] Setting up relu1_1
I1213 12:14:08.175420 1380 net.cpp:157] Top shape: 1 64 422 422 (11397376)
I1213 12:14:08.175921 1380 net.cpp:165] Memory required for data: 93186048
I1213 12:14:08.175921 1380 layer_factory.hpp:77] Creating layer conv1_2
I1213 12:14:08.176421 1380 net.cpp:100] Creating Layer conv1_2
I1213 12:14:08.176421 1380 net.cpp:444] conv1_2 <- conv1_1
I1213 12:14:08.176421 1380 net.cpp:418] conv1_2 -> conv1_2
I1213 12:14:08.178923 1380 net.cpp:150] Setting up conv1_2
I1213 12:14:08.179424 1380 net.cpp:157] Top shape: 1 64 422 422 (11397376)
I1213 12:14:08.179424 1380 net.cpp:165] Memory required for data: 138775552
I1213 12:14:08.179424 1380 layer_factory.hpp:77] Creating layer relu1_2
I1213 12:14:08.179424 1380 net.cpp:100] Creating Layer relu1_2
I1213 12:14:08.180424 1380 net.cpp:444] relu1_2 <- conv1_2
I1213 12:14:08.180424 1380 net.cpp:405] relu1_2 -> conv1_2 (in-place)
I1213 12:14:08.180924 1380 net.cpp:150] Setting up relu1_2
I1213 12:14:08.181426 1380 net.cpp:157] Top shape: 1 64 422 422 (11397376)
I1213 12:14:08.181426 1380 net.cpp:165] Memory required for data: 184365056
I1213 12:14:08.181426 1380 layer_factory.hpp:77] Creating layer pool1
I1213 12:14:08.181426 1380 net.cpp:100] Creating Layer pool1
I1213 12:14:08.182425 1380 net.cpp:444] pool1 <- conv1_2
I1213 12:14:08.182425 1380 net.cpp:418] pool1 -> pool1
I1213 12:14:08.182425 1380 net.cpp:150] Setting up pool1
I1213 12:14:08.183426 1380 net.cpp:157] Top shape: 1 64 211 211 (2849344)
I1213 12:14:08.183426 1380 net.cpp:165] Memory required for data: 195762432
I1213 12:14:08.183426 1380 layer_factory.hpp:77] Creating layer conv2_1
I1213 12:14:08.183926 1380 net.cpp:100] Creating Layer conv2_1
I1213 12:14:08.183926 1380 net.cpp:444] conv2_1 <- pool1
I1213 12:14:08.183926 1380 net.cpp:418] conv2_1 -> conv2_1
I1213 12:14:08.189931 1380 net.cpp:150] Setting up conv2_1
I1213 12:14:08.189931 1380 net.cpp:157] Top shape: 1 128 211 211 (5698688)
I1213 12:14:08.190433 1380 net.cpp:165] Memory required for data: 218557184
I1213 12:14:08.190932 1380 layer_factory.hpp:77] Creating layer relu2_1
I1213 12:14:08.191432 1380 net.cpp:100] Creating Layer relu2_1
I1213 12:14:08.191432 1380 net.cpp:444] relu2_1 <- conv2_1
I1213 12:14:08.191432 1380 net.cpp:405] relu2_1 -> conv2_1 (in-place)
I1213 12:14:08.192433 1380 net.cpp:150] Setting up relu2_1
I1213 12:14:08.192934 1380 net.cpp:157] Top shape: 1 128 211 211 (5698688)
I1213 12:14:08.192934 1380 net.cpp:165] Memory required for data: 241351936
I1213 12:14:08.193434 1380 layer_factory.hpp:77] Creating layer conv2_2
I1213 12:14:08.193434 1380 net.cpp:100] Creating Layer conv2_2
I1213 12:14:08.194434 1380 net.cpp:444] conv2_2 <- conv2_1
I1213 12:14:08.194434 1380 net.cpp:418] conv2_2 -> conv2_2
I1213 12:14:08.197937 1380 net.cpp:150] Setting up conv2_2
I1213 12:14:08.197937 1380 net.cpp:157] Top shape: 1 128 211 211 (5698688)
I1213 12:14:08.198437 1380 net.cpp:165] Memory required for data: 264146688
I1213 12:14:08.198437 1380 layer_factory.hpp:77] Creating layer relu2_2
I1213 12:14:08.198437 1380 net.cpp:100] Creating Layer relu2_2
I1213 12:14:08.199439 1380 net.cpp:444] relu2_2 <- conv2_2
I1213 12:14:08.199439 1380 net.cpp:405] relu2_2 -> conv2_2 (in-place)
I1213 12:14:08.199939 1380 net.cpp:150] Setting up relu2_2
I1213 12:14:08.200939 1380 net.cpp:157] Top shape: 1 128 211 211 (5698688)
I1213 12:14:08.200939 1380 net.cpp:165] Memory required for data: 286941440
I1213 12:14:08.200939 1380 layer_factory.hpp:77] Creating layer pool2
I1213 12:14:08.200939 1380 net.cpp:100] Creating Layer pool2
I1213 12:14:08.202940 1380 net.cpp:444] pool2 <- conv2_2
I1213 12:14:08.203441 1380 net.cpp:418] pool2 -> pool2
I1213 12:14:08.203441 1380 net.cpp:150] Setting up pool2
I1213 12:14:08.203441 1380 net.cpp:157] Top shape: 1 128 106 106 (1438208)
I1213 12:14:08.203441 1380 net.cpp:165] Memory required for data: 292694272
I1213 12:14:08.203941 1380 layer_factory.hpp:77] Creating layer conv3_1
I1213 12:14:08.203941 1380 net.cpp:100] Creating Layer conv3_1
I1213 12:14:08.203941 1380 net.cpp:444] conv3_1 <- pool2
I1213 12:14:08.207443 1380 net.cpp:418] conv3_1 -> conv3_1
I1213 12:14:08.214949 1380 net.cpp:150] Setting up conv3_1
I1213 12:14:08.214949 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.215450 1380 net.cpp:165] Memory required for data: 304199936
I1213 12:14:08.215450 1380 layer_factory.hpp:77] Creating layer relu3_1
I1213 12:14:08.215450 1380 net.cpp:100] Creating Layer relu3_1
I1213 12:14:08.216450 1380 net.cpp:444] relu3_1 <- conv3_1
I1213 12:14:08.216450 1380 net.cpp:405] relu3_1 -> conv3_1 (in-place)
I1213 12:14:08.217952 1380 net.cpp:150] Setting up relu3_1
I1213 12:14:08.219452 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.219452 1380 net.cpp:165] Memory required for data: 315705600
I1213 12:14:08.219452 1380 layer_factory.hpp:77] Creating layer conv3_2
I1213 12:14:08.220453 1380 net.cpp:100] Creating Layer conv3_2
I1213 12:14:08.222455 1380 net.cpp:444] conv3_2 <- conv3_1
I1213 12:14:08.222455 1380 net.cpp:418] conv3_2 -> conv3_2
I1213 12:14:08.227458 1380 net.cpp:150] Setting up conv3_2
I1213 12:14:08.227458 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.228458 1380 net.cpp:165] Memory required for data: 327211264
I1213 12:14:08.228458 1380 layer_factory.hpp:77] Creating layer relu3_2
I1213 12:14:08.228458 1380 net.cpp:100] Creating Layer relu3_2
I1213 12:14:08.228958 1380 net.cpp:444] relu3_2 <- conv3_2
I1213 12:14:08.229960 1380 net.cpp:405] relu3_2 -> conv3_2 (in-place)
I1213 12:14:08.230460 1380 net.cpp:150] Setting up relu3_2
I1213 12:14:08.230959 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.230959 1380 net.cpp:165] Memory required for data: 338716928
I1213 12:14:08.231461 1380 layer_factory.hpp:77] Creating layer conv3_3
I1213 12:14:08.231961 1380 net.cpp:100] Creating Layer conv3_3
I1213 12:14:08.231961 1380 net.cpp:444] conv3_3 <- conv3_2
I1213 12:14:08.231961 1380 net.cpp:418] conv3_3 -> conv3_3
I1213 12:14:08.240967 1380 net.cpp:150] Setting up conv3_3
I1213 12:14:08.240967 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.241467 1380 net.cpp:165] Memory required for data: 350222592
I1213 12:14:08.241467 1380 layer_factory.hpp:77] Creating layer relu3_3
I1213 12:14:08.242468 1380 net.cpp:100] Creating Layer relu3_3
I1213 12:14:08.242468 1380 net.cpp:444] relu3_3 <- conv3_3
I1213 12:14:08.242468 1380 net.cpp:405] relu3_3 -> conv3_3 (in-place)
I1213 12:14:08.243969 1380 net.cpp:150] Setting up relu3_3
I1213 12:14:08.244971 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.244971 1380 net.cpp:165] Memory required for data: 361728256
I1213 12:14:08.244971 1380 layer_factory.hpp:77] Creating layer pool3
I1213 12:14:08.245970 1380 net.cpp:100] Creating Layer pool3
I1213 12:14:08.245970 1380 net.cpp:444] pool3 <- conv3_3
I1213 12:14:08.245970 1380 net.cpp:418] pool3 -> pool3
I1213 12:14:08.245970 1380 net.cpp:150] Setting up pool3
I1213 12:14:08.246471 1380 net.cpp:157] Top shape: 1 256 53 53 (719104)
I1213 12:14:08.246471 1380 net.cpp:165] Memory required for data: 364604672
I1213 12:14:08.246471 1380 layer_factory.hpp:77] Creating layer pool3_pool3_0_split
I1213 12:14:08.246471 1380 net.cpp:100] Creating Layer pool3_pool3_0_split
I1213 12:14:08.246971 1380 net.cpp:444] pool3_pool3_0_split <- pool3
I1213 12:14:08.247473 1380 net.cpp:418] pool3_pool3_0_split -> pool3_pool3_0_split_0
I1213 12:14:08.247473 1380 net.cpp:418] pool3_pool3_0_split -> pool3_pool3_0_split_1
I1213 12:14:08.247473 1380 net.cpp:150] Setting up pool3_pool3_0_split
I1213 12:14:08.247473 1380 net.cpp:157] Top shape: 1 256 53 53 (719104)
I1213 12:14:08.247473 1380 net.cpp:157] Top shape: 1 256 53 53 (719104)
I1213 12:14:08.247473 1380 net.cpp:165] Memory required for data: 370357504
I1213 12:14:08.249974 1380 layer_factory.hpp:77] Creating layer conv4_1
I1213 12:14:08.249974 1380 net.cpp:100] Creating Layer conv4_1
I1213 12:14:08.249974 1380 net.cpp:444] conv4_1 <- pool3_pool3_0_split_0
I1213 12:14:08.249974 1380 net.cpp:418] conv4_1 -> conv4_1
I1213 12:14:08.260982 1380 net.cpp:150] Setting up conv4_1
I1213 12:14:08.261482 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.262984 1380 net.cpp:165] Memory required for data: 376110336
I1213 12:14:08.262984 1380 layer_factory.hpp:77] Creating layer relu4_1
I1213 12:14:08.266486 1380 net.cpp:100] Creating Layer relu4_1
I1213 12:14:08.266985 1380 net.cpp:444] relu4_1 <- conv4_1
I1213 12:14:08.266985 1380 net.cpp:405] relu4_1 -> conv4_1 (in-place)
I1213 12:14:08.269989 1380 net.cpp:150] Setting up relu4_1
I1213 12:14:08.269989 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.270488 1380 net.cpp:165] Memory required for data: 381863168
I1213 12:14:08.270488 1380 layer_factory.hpp:77] Creating layer conv4_2
I1213 12:14:08.270488 1380 net.cpp:100] Creating Layer conv4_2
I1213 12:14:08.272990 1380 net.cpp:444] conv4_2 <- conv4_1
I1213 12:14:08.275492 1380 net.cpp:418] conv4_2 -> conv4_2
I1213 12:14:08.287000 1380 net.cpp:150] Setting up conv4_2
I1213 12:14:08.287000 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.287500 1380 net.cpp:165] Memory required for data: 387616000
I1213 12:14:08.287500 1380 layer_factory.hpp:77] Creating layer relu4_2
I1213 12:14:08.287500 1380 net.cpp:100] Creating Layer relu4_2
I1213 12:14:08.288501 1380 net.cpp:444] relu4_2 <- conv4_2
I1213 12:14:08.288501 1380 net.cpp:405] relu4_2 -> conv4_2 (in-place)
I1213 12:14:08.289504 1380 net.cpp:150] Setting up relu4_2
I1213 12:14:08.290503 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.290503 1380 net.cpp:165] Memory required for data: 393368832
I1213 12:14:08.290503 1380 layer_factory.hpp:77] Creating layer conv4_3
I1213 12:14:08.290503 1380 net.cpp:100] Creating Layer conv4_3
I1213 12:14:08.291503 1380 net.cpp:444] conv4_3 <- conv4_2
I1213 12:14:08.291503 1380 net.cpp:418] conv4_3 -> conv4_3
I1213 12:14:08.301012 1380 net.cpp:150] Setting up conv4_3
I1213 12:14:08.301512 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.303011 1380 net.cpp:165] Memory required for data: 399121664
I1213 12:14:08.303011 1380 layer_factory.hpp:77] Creating layer relu4_3
I1213 12:14:08.306015 1380 net.cpp:100] Creating Layer relu4_3
I1213 12:14:08.307015 1380 net.cpp:444] relu4_3 <- conv4_3
I1213 12:14:08.307515 1380 net.cpp:405] relu4_3 -> conv4_3 (in-place)
I1213 12:14:08.309517 1380 net.cpp:150] Setting up relu4_3
I1213 12:14:08.312518 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.312518 1380 net.cpp:165] Memory required for data: 404874496
I1213 12:14:08.313519 1380 layer_factory.hpp:77] Creating layer pool4
I1213 12:14:08.313519 1380 net.cpp:100] Creating Layer pool4
I1213 12:14:08.313519 1380 net.cpp:444] pool4 <- conv4_3
I1213 12:14:08.313519 1380 net.cpp:418] pool4 -> pool4
I1213 12:14:08.314019 1380 net.cpp:150] Setting up pool4
I1213 12:14:08.314019 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.314019 1380 net.cpp:165] Memory required for data: 406367488
I1213 12:14:08.314019 1380 layer_factory.hpp:77] Creating layer pool4_pool4_0_split
I1213 12:14:08.314019 1380 net.cpp:100] Creating Layer pool4_pool4_0_split
I1213 12:14:08.315521 1380 net.cpp:444] pool4_pool4_0_split <- pool4
I1213 12:14:08.315521 1380 net.cpp:418] pool4_pool4_0_split -> pool4_pool4_0_split_0
I1213 12:14:08.315521 1380 net.cpp:418] pool4_pool4_0_split -> pool4_pool4_0_split_1
I1213 12:14:08.316522 1380 net.cpp:150] Setting up pool4_pool4_0_split
I1213 12:14:08.316522 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.317023 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.317023 1380 net.cpp:165] Memory required for data: 409353472
I1213 12:14:08.317023 1380 layer_factory.hpp:77] Creating layer conv5_1
I1213 12:14:08.317523 1380 net.cpp:100] Creating Layer conv5_1
I1213 12:14:08.318022 1380 net.cpp:444] conv5_1 <- pool4_pool4_0_split_0
I1213 12:14:08.318522 1380 net.cpp:418] conv5_1 -> conv5_1
I1213 12:14:08.326529 1380 net.cpp:150] Setting up conv5_1
I1213 12:14:08.327530 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.327530 1380 net.cpp:165] Memory required for data: 410846464
I1213 12:14:08.327530 1380 layer_factory.hpp:77] Creating layer relu5_1
I1213 12:14:08.327530 1380 net.cpp:100] Creating Layer relu5_1
I1213 12:14:08.327530 1380 net.cpp:444] relu5_1 <- conv5_1
I1213 12:14:08.328531 1380 net.cpp:405] relu5_1 -> conv5_1 (in-place)
I1213 12:14:08.329530 1380 net.cpp:150] Setting up relu5_1
I1213 12:14:08.330030 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.330030 1380 net.cpp:165] Memory required for data: 412339456
I1213 12:14:08.330030 1380 layer_factory.hpp:77] Creating layer conv5_2
I1213 12:14:08.330030 1380 net.cpp:100] Creating Layer conv5_2
I1213 12:14:08.331032 1380 net.cpp:444] conv5_2 <- conv5_1
I1213 12:14:08.331032 1380 net.cpp:418] conv5_2 -> conv5_2
I1213 12:14:08.339037 1380 net.cpp:150] Setting up conv5_2
I1213 12:14:08.339539 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.339539 1380 net.cpp:165] Memory required for data: 413832448
I1213 12:14:08.339539 1380 layer_factory.hpp:77] Creating layer relu5_2
I1213 12:14:08.340538 1380 net.cpp:100] Creating Layer relu5_2
I1213 12:14:08.340538 1380 net.cpp:444] relu5_2 <- conv5_2
I1213 12:14:08.340538 1380 net.cpp:405] relu5_2 -> conv5_2 (in-place)
I1213 12:14:08.341539 1380 net.cpp:150] Setting up relu5_2
I1213 12:14:08.342039 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.342039 1380 net.cpp:165] Memory required for data: 415325440
I1213 12:14:08.342039 1380 layer_factory.hpp:77] Creating layer conv5_3
I1213 12:14:08.342039 1380 net.cpp:100] Creating Layer conv5_3
I1213 12:14:08.342039 1380 net.cpp:444] conv5_3 <- conv5_2
I1213 12:14:08.342540 1380 net.cpp:418] conv5_3 -> conv5_3
I1213 12:14:08.348544 1380 net.cpp:150] Setting up conv5_3
I1213 12:14:08.348544 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.349545 1380 net.cpp:165] Memory required for data: 416818432
I1213 12:14:08.349545 1380 layer_factory.hpp:77] Creating layer relu5_3
I1213 12:14:08.349545 1380 net.cpp:100] Creating Layer relu5_3
I1213 12:14:08.350545 1380 net.cpp:444] relu5_3 <- conv5_3
I1213 12:14:08.350545 1380 net.cpp:405] relu5_3 -> conv5_3 (in-place)
I1213 12:14:08.352046 1380 net.cpp:150] Setting up relu5_3
I1213 12:14:08.352547 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.352547 1380 net.cpp:165] Memory required for data: 418311424
I1213 12:14:08.352547 1380 layer_factory.hpp:77] Creating layer pool5
I1213 12:14:08.352547 1380 net.cpp:100] Creating Layer pool5
I1213 12:14:08.353049 1380 net.cpp:444] pool5 <- conv5_3
I1213 12:14:08.353049 1380 net.cpp:418] pool5 -> pool5
I1213 12:14:08.353049 1380 net.cpp:150] Setting up pool5
I1213 12:14:08.353548 1380 net.cpp:157] Top shape: 1 512 14 14 (100352)
I1213 12:14:08.353548 1380 net.cpp:165] Memory required for data: 418712832
I1213 12:14:08.354048 1380 layer_factory.hpp:77] Creating layer fc6
I1213 12:14:08.354048 1380 net.cpp:100] Creating Layer fc6
I1213 12:14:08.354048 1380 net.cpp:444] fc6 <- pool5
I1213 12:14:08.354048 1380 net.cpp:418] fc6 -> fc6
I1213 12:14:08.565698 1380 net.cpp:150] Setting up fc6
I1213 12:14:08.566198 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:08.566699 1380 net.cpp:165] Memory required for data: 419761408
I1213 12:14:08.567199 1380 layer_factory.hpp:77] Creating layer relu6
I1213 12:14:08.567700 1380 net.cpp:100] Creating Layer relu6
I1213 12:14:08.567700 1380 net.cpp:444] relu6 <- fc6
I1213 12:14:08.568200 1380 net.cpp:405] relu6 -> fc6 (in-place)
I1213 12:14:08.568701 1380 net.cpp:150] Setting up relu6
I1213 12:14:08.569201 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:08.569701 1380 net.cpp:165] Memory required for data: 420809984
I1213 12:14:08.569701 1380 layer_factory.hpp:77] Creating layer drop6
I1213 12:14:08.569701 1380 net.cpp:100] Creating Layer drop6
I1213 12:14:08.570201 1380 net.cpp:444] drop6 <- fc6
I1213 12:14:08.570703 1380 net.cpp:405] drop6 -> fc6 (in-place)
I1213 12:14:08.571703 1380 net.cpp:150] Setting up drop6
I1213 12:14:08.572703 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:08.573204 1380 net.cpp:165] Memory required for data: 421858560
I1213 12:14:08.573204 1380 layer_factory.hpp:77] Creating layer fc7
I1213 12:14:08.573204 1380 net.cpp:100] Creating Layer fc7
I1213 12:14:08.573704 1380 net.cpp:444] fc7 <- fc6
I1213 12:14:08.574204 1380 net.cpp:418] fc7 -> fc7
I1213 12:14:08.610230 1380 net.cpp:150] Setting up fc7
I1213 12:14:08.610730 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:08.611232 1380 net.cpp:165] Memory required for data: 422907136
I1213 12:14:08.611732 1380 layer_factory.hpp:77] Creating layer relu7
I1213 12:14:08.612232 1380 net.cpp:100] Creating Layer relu7
I1213 12:14:08.612232 1380 net.cpp:444] relu7 <- fc7
I1213 12:14:08.612232 1380 net.cpp:405] relu7 -> fc7 (in-place)
I1213 12:14:08.612732 1380 net.cpp:150] Setting up relu7
I1213 12:14:08.613232 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:08.613232 1380 net.cpp:165] Memory required for data: 423955712
I1213 12:14:08.613232 1380 layer_factory.hpp:77] Creating layer drop7
I1213 12:14:08.613232 1380 net.cpp:100] Creating Layer drop7
I1213 12:14:08.613732 1380 net.cpp:444] drop7 <- fc7
I1213 12:14:08.614733 1380 net.cpp:405] drop7 -> fc7 (in-place)
I1213 12:14:08.615234 1380 net.cpp:150] Setting up drop7
I1213 12:14:08.615734 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:08.616235 1380 net.cpp:165] Memory required for data: 425004288
I1213 12:14:08.616235 1380 layer_factory.hpp:77] Creating layer score_fr
I1213 12:14:08.616235 1380 net.cpp:100] Creating Layer score_fr
I1213 12:14:08.616235 1380 net.cpp:444] score_fr <- fc7
I1213 12:14:08.617235 1380 net.cpp:418] score_fr -> score_fr
I1213 12:14:08.619237 1380 net.cpp:150] Setting up score_fr
I1213 12:14:08.619237 1380 net.cpp:157] Top shape: 1 21 8 8 (1344)
I1213 12:14:08.619237 1380 net.cpp:165] Memory required for data: 425009664
I1213 12:14:08.619237 1380 layer_factory.hpp:77] Creating layer upscore2
I1213 12:14:08.620237 1380 net.cpp:100] Creating Layer upscore2
I1213 12:14:08.620237 1380 net.cpp:444] upscore2 <- score_fr
I1213 12:14:08.620237 1380 net.cpp:418] upscore2 -> upscore2
I1213 12:14:08.621739 1380 net.cpp:150] Setting up upscore2
I1213 12:14:08.622740 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:08.622740 1380 net.cpp:165] Memory required for data: 425036880
I1213 12:14:08.622740 1380 layer_factory.hpp:77] Creating layer upscore2_upscore2_0_split
I1213 12:14:08.622740 1380 net.cpp:100] Creating Layer upscore2_upscore2_0_split
I1213 12:14:08.623240 1380 net.cpp:444] upscore2_upscore2_0_split <- upscore2
I1213 12:14:08.623740 1380 net.cpp:418] upscore2_upscore2_0_split -> upscore2_upscore2_0_split_0
I1213 12:14:08.623740 1380 net.cpp:418] upscore2_upscore2_0_split -> upscore2_upscore2_0_split_1
I1213 12:14:08.623740 1380 net.cpp:150] Setting up upscore2_upscore2_0_split
I1213 12:14:08.627243 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:08.628244 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:08.628744 1380 net.cpp:165] Memory required for data: 425091312
I1213 12:14:08.628744 1380 layer_factory.hpp:77] Creating layer score_pool4
I1213 12:14:08.630245 1380 net.cpp:100] Creating Layer score_pool4
I1213 12:14:08.631748 1380 net.cpp:444] score_pool4 <- pool4_pool4_0_split_1
I1213 12:14:08.631748 1380 net.cpp:418] score_pool4 -> score_pool4
I1213 12:14:08.634748 1380 net.cpp:150] Setting up score_pool4
I1213 12:14:08.634748 1380 net.cpp:157] Top shape: 1 21 27 27 (15309)
I1213 12:14:08.636250 1380 net.cpp:165] Memory required for data: 425152548
I1213 12:14:08.636250 1380 layer_factory.hpp:77] Creating layer score_pool4c
I1213 12:14:08.636250 1380 net.cpp:100] Creating Layer score_pool4c
I1213 12:14:08.636250 1380 net.cpp:444] score_pool4c <- score_pool4
I1213 12:14:08.637750 1380 net.cpp:444] score_pool4c <- upscore2_upscore2_0_split_0
I1213 12:14:08.637750 1380 net.cpp:418] score_pool4c -> score_pool4c
I1213 12:14:08.637750 1380 net.cpp:150] Setting up score_pool4c
I1213 12:14:08.638751 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:08.638751 1380 net.cpp:165] Memory required for data: 425179764
I1213 12:14:08.639251 1380 layer_factory.hpp:77] Creating layer fuse_pool4
I1213 12:14:08.639251 1380 net.cpp:100] Creating Layer fuse_pool4
I1213 12:14:08.640751 1380 net.cpp:444] fuse_pool4 <- upscore2_upscore2_0_split_1
I1213 12:14:08.645756 1380 net.cpp:444] fuse_pool4 <- score_pool4c
I1213 12:14:08.645756 1380 net.cpp:418] fuse_pool4 -> fuse_pool4
I1213 12:14:08.646756 1380 net.cpp:150] Setting up fuse_pool4
I1213 12:14:08.646756 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:08.646756 1380 net.cpp:165] Memory required for data: 425206980
I1213 12:14:08.646756 1380 layer_factory.hpp:77] Creating layer upscore_pool4
I1213 12:14:08.647258 1380 net.cpp:100] Creating Layer upscore_pool4
I1213 12:14:08.647258 1380 net.cpp:444] upscore_pool4 <- fuse_pool4
I1213 12:14:08.647758 1380 net.cpp:418] upscore_pool4 -> upscore_pool4
I1213 12:14:08.649258 1380 net.cpp:150] Setting up upscore_pool4
I1213 12:14:08.649760 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:08.650259 1380 net.cpp:165] Memory required for data: 425328276
I1213 12:14:08.650259 1380 layer_factory.hpp:77] Creating layer upscore_pool4_upscore_pool4_0_split
I1213 12:14:08.650259 1380 net.cpp:100] Creating Layer upscore_pool4_upscore_pool4_0_split
I1213 12:14:08.651260 1380 net.cpp:444] upscore_pool4_upscore_pool4_0_split <- upscore_pool4
I1213 12:14:08.651260 1380 net.cpp:418] upscore_pool4_upscore_pool4_0_split -> upscore_pool4_upscore_pool4_0_split_0
I1213 12:14:08.651260 1380 net.cpp:418] upscore_pool4_upscore_pool4_0_split -> upscore_pool4_upscore_pool4_0_split_1
I1213 12:14:08.652261 1380 net.cpp:150] Setting up upscore_pool4_upscore_pool4_0_split
I1213 12:14:08.652261 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:08.652261 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:08.652261 1380 net.cpp:165] Memory required for data: 425570868
I1213 12:14:08.652261 1380 layer_factory.hpp:77] Creating layer score_pool3
I1213 12:14:08.653261 1380 net.cpp:100] Creating Layer score_pool3
I1213 12:14:08.656764 1380 net.cpp:444] score_pool3 <- pool3_pool3_0_split_1
I1213 12:14:08.656764 1380 net.cpp:418] score_pool3 -> score_pool3
I1213 12:14:08.659765 1380 net.cpp:150] Setting up score_pool3
I1213 12:14:08.660266 1380 net.cpp:157] Top shape: 1 21 53 53 (58989)
I1213 12:14:08.666271 1380 net.cpp:165] Memory required for data: 425806824
I1213 12:14:08.666271 1380 layer_factory.hpp:77] Creating layer score_pool3c
I1213 12:14:08.666271 1380 net.cpp:100] Creating Layer score_pool3c
I1213 12:14:08.666771 1380 net.cpp:444] score_pool3c <- score_pool3
I1213 12:14:08.667271 1380 net.cpp:444] score_pool3c <- upscore_pool4_upscore_pool4_0_split_0
I1213 12:14:08.667271 1380 net.cpp:418] score_pool3c -> score_pool3c
I1213 12:14:08.667271 1380 net.cpp:150] Setting up score_pool3c
I1213 12:14:08.667271 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:08.667271 1380 net.cpp:165] Memory required for data: 425928120
I1213 12:14:08.667271 1380 layer_factory.hpp:77] Creating layer fuse_pool3
I1213 12:14:08.667271 1380 net.cpp:100] Creating Layer fuse_pool3
I1213 12:14:08.668272 1380 net.cpp:444] fuse_pool3 <- upscore_pool4_upscore_pool4_0_split_1
I1213 12:14:08.668772 1380 net.cpp:444] fuse_pool3 <- score_pool3c
I1213 12:14:08.669273 1380 net.cpp:418] fuse_pool3 -> fuse_pool3
I1213 12:14:08.669273 1380 net.cpp:150] Setting up fuse_pool3
I1213 12:14:08.670274 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:08.670274 1380 net.cpp:165] Memory required for data: 426049416
I1213 12:14:08.670274 1380 layer_factory.hpp:77] Creating layer upscore8
I1213 12:14:08.670274 1380 net.cpp:100] Creating Layer upscore8
I1213 12:14:08.670274 1380 net.cpp:444] upscore8 <- fuse_pool3
I1213 12:14:08.670274 1380 net.cpp:418] upscore8 -> upscore8
I1213 12:14:08.671274 1380 net.cpp:150] Setting up upscore8
I1213 12:14:08.671775 1380 net.cpp:157] Top shape: 1 21 312 312 (2044224)
I1213 12:14:08.671775 1380 net.cpp:165] Memory required for data: 434226312
I1213 12:14:08.671775 1380 layer_factory.hpp:77] Creating layer score
I1213 12:14:08.671775 1380 net.cpp:100] Creating Layer score
I1213 12:14:08.671775 1380 net.cpp:444] score <- upscore8
I1213 12:14:08.672274 1380 net.cpp:444] score <- data_data_0_split_1
I1213 12:14:08.673275 1380 net.cpp:418] score -> score
I1213 12:14:08.673275 1380 net.cpp:150] Setting up score
I1213 12:14:08.674276 1380 net.cpp:157] Top shape: 1 21 224 224 (1053696)
I1213 12:14:08.674276 1380 net.cpp:165] Memory required for data: 438441096
I1213 12:14:08.674276 1380 layer_factory.hpp:77] Creating layer loss
I1213 12:14:08.674276 1380 net.cpp:100] Creating Layer loss
I1213 12:14:08.675277 1380 net.cpp:444] loss <- score
I1213 12:14:08.675277 1380 net.cpp:444] loss <- label
I1213 12:14:08.675277 1380 net.cpp:418] loss -> loss
I1213 12:14:08.675277 1380 layer_factory.hpp:77] Creating layer loss
I1213 12:14:08.678781 1380 net.cpp:150] Setting up loss
I1213 12:14:08.679280 1380 net.cpp:157] Top shape: (1)
I1213 12:14:08.679780 1380 net.cpp:160] with loss weight 1
I1213 12:14:08.680280 1380 net.cpp:165] Memory required for data: 438441100
I1213 12:14:08.680280 1380 net.cpp:226] loss needs backward computation.
I1213 12:14:08.680280 1380 net.cpp:226] score needs backward computation.
I1213 12:14:08.680280 1380 net.cpp:226] upscore8 needs backward computation.
I1213 12:14:08.680280 1380 net.cpp:226] fuse_pool3 needs backward computation.
I1213 12:14:08.680280 1380 net.cpp:226] score_pool3c needs backward computation.
I1213 12:14:08.680280 1380 net.cpp:226] score_pool3 needs backward computation.
I1213 12:14:08.682281 1380 net.cpp:226] upscore_pool4_upscore_pool4_0_split needs backward computation.
I1213 12:14:08.682782 1380 net.cpp:226] upscore_pool4 needs backward computation.
I1213 12:14:08.682782 1380 net.cpp:226] fuse_pool4 needs backward computation.
I1213 12:14:08.682782 1380 net.cpp:226] score_pool4c needs backward computation.
I1213 12:14:08.683282 1380 net.cpp:226] score_pool4 needs backward computation.
I1213 12:14:08.683282 1380 net.cpp:226] upscore2_upscore2_0_split needs backward computation.
I1213 12:14:08.683282 1380 net.cpp:226] upscore2 needs backward computation.
I1213 12:14:08.683282 1380 net.cpp:226] score_fr needs backward computation.
I1213 12:14:08.683282 1380 net.cpp:226] drop7 needs backward computation.
I1213 12:14:08.683784 1380 net.cpp:226] relu7 needs backward computation.
I1213 12:14:08.683784 1380 net.cpp:226] fc7 needs backward computation.
I1213 12:14:08.683784 1380 net.cpp:226] drop6 needs backward computation.
I1213 12:14:08.683784 1380 net.cpp:226] relu6 needs backward computation.
I1213 12:14:08.684284 1380 net.cpp:226] fc6 needs backward computation.
I1213 12:14:08.684284 1380 net.cpp:226] pool5 needs backward computation.
I1213 12:14:08.684284 1380 net.cpp:226] relu5_3 needs backward computation.
I1213 12:14:08.684783 1380 net.cpp:226] conv5_3 needs backward computation.
I1213 12:14:08.685284 1380 net.cpp:226] relu5_2 needs backward computation.
I1213 12:14:08.685784 1380 net.cpp:226] conv5_2 needs backward computation.
I1213 12:14:08.686285 1380 net.cpp:226] relu5_1 needs backward computation.
I1213 12:14:08.686285 1380 net.cpp:226] conv5_1 needs backward computation.
I1213 12:14:08.686285 1380 net.cpp:226] pool4_pool4_0_split needs backward computation.
I1213 12:14:08.686285 1380 net.cpp:226] pool4 needs backward computation.
I1213 12:14:08.687286 1380 net.cpp:226] relu4_3 needs backward computation.
I1213 12:14:08.687286 1380 net.cpp:226] conv4_3 needs backward computation.
I1213 12:14:08.687286 1380 net.cpp:226] relu4_2 needs backward computation.
I1213 12:14:08.687286 1380 net.cpp:226] conv4_2 needs backward computation.
I1213 12:14:08.687286 1380 net.cpp:226] relu4_1 needs backward computation.
I1213 12:14:08.688787 1380 net.cpp:226] conv4_1 needs backward computation.
I1213 12:14:08.688787 1380 net.cpp:226] pool3_pool3_0_split needs backward computation.
I1213 12:14:08.688787 1380 net.cpp:226] pool3 needs backward computation.
I1213 12:14:08.689286 1380 net.cpp:226] relu3_3 needs backward computation.
I1213 12:14:08.690287 1380 net.cpp:226] conv3_3 needs backward computation.
I1213 12:14:08.690287 1380 net.cpp:226] relu3_2 needs backward computation.
I1213 12:14:08.690287 1380 net.cpp:226] conv3_2 needs backward computation.
I1213 12:14:08.690287 1380 net.cpp:226] relu3_1 needs backward computation.
I1213 12:14:08.691288 1380 net.cpp:226] conv3_1 needs backward computation.
I1213 12:14:08.691288 1380 net.cpp:226] pool2 needs backward computation.
I1213 12:14:08.691288 1380 net.cpp:226] relu2_2 needs backward computation.
I1213 12:14:08.691288 1380 net.cpp:226] conv2_2 needs backward computation.
I1213 12:14:08.691788 1380 net.cpp:226] relu2_1 needs backward computation.
I1213 12:14:08.692291 1380 net.cpp:226] conv2_1 needs backward computation.
I1213 12:14:08.692790 1380 net.cpp:226] pool1 needs backward computation.
I1213 12:14:08.692790 1380 net.cpp:226] relu1_2 needs backward computation.
I1213 12:14:08.692790 1380 net.cpp:226] conv1_2 needs backward computation.
I1213 12:14:08.693289 1380 net.cpp:226] relu1_1 needs backward computation.
I1213 12:14:08.693289 1380 net.cpp:226] conv1_1 needs backward computation.
I1213 12:14:08.693289 1380 net.cpp:228] label does not need backward computation.
I1213 12:14:08.693789 1380 net.cpp:228] data_data_0_split does not need backward computation.
I1213 12:14:08.694290 1380 net.cpp:228] data does not need backward computation.
I1213 12:14:08.694290 1380 net.cpp:270] This network produces output loss
I1213 12:14:08.694290 1380 net.cpp:283] Network initialization done.
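The "Top shape" values in the log above can be reproduced by hand. A minimal sketch, assuming Caffe's standard output-size rules (floor for Convolution, ceil for Pooling, `(in-1)*stride + kernel` for Deconvolution); the function names here are illustrative, not Caffe API:

```python
import math

# Caffe output-size rules (sketch; verify against your Caffe version):
#   Convolution:   floor((in + 2*pad - kernel) / stride) + 1
#   Pooling:       ceil((in + 2*pad - kernel) / stride) + 1
#   Deconvolution: (in - 1) * stride + kernel - 2*pad

def conv_out(size, kernel, stride=1, pad=0):
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel, stride, pad=0):
    return math.ceil((size + 2 * pad - kernel) / stride) + 1

def deconv_out(size, kernel, stride, pad=0):
    return (size - 1) * stride + kernel - 2 * pad

s = conv_out(224, 3, pad=100)  # conv1_1: 224 -> 422 (the pad: 100 trick)
for _ in range(5):             # pool1..pool5; the 3x3/pad-1 convs keep the size
    s = pool_out(s, 2, 2)      # 211, 106, 53, 27, 14 -- ceil gives 14, not 13
s = conv_out(s, 7)             # fc6 is a 7x7 convolution: 14 -> 8
up2 = deconv_out(s, 4, 2)      # upscore2: 8 -> 18
up4 = deconv_out(up2, 4, 2)    # upscore_pool4 (crop pool4's 27 to 18, fuse): 18 -> 38
up8 = deconv_out(up4, 16, 8)   # upscore8 (crop pool3's 53 to 38, fuse): 38 -> 312
print(up8)                     # 312; the final Crop (offset 31) cuts this back to 224
```

This is also why the earlier channel-mismatch error is easy to diagnose from the log: the `score` output must come out exactly 1 21 224 224 to align with the 1 1 224 224 label blob.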
I1213 12:14:08.695791 1380 solver.cpp:181] Creating test net (#0) specified by net file: train_val.prototxt
I1213 12:14:08.695791 1380 net.cpp:332] The NetState phase (1) differed from the phase (0) specified by a rule in layer data
I1213 12:14:08.698796 1380 net.cpp:332] The NetState phase (1) differed from the phase (0) specified by a rule in layer label
I1213 12:14:08.699795 1380 net.cpp:58] Initializing net from parameters:
state {
phase: TEST
}
layer {
name: "data"
type: "Data"
top: "data"
include {
phase: TEST
}
transform_param {
scale: 0.00390625
mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val_mean.binaryproto"
}
data_param {
source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val"
batch_size: 1
backend: LMDB
}
}
layer {
name: "label"
type: "Data"
top: "label"
include {
phase: TEST
}
transform_param {
scale: 0.00390625
mean_file: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val_mean.binaryproto"
}
data_param {
source: "G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val"
batch_size: 1
backend: LMDB
}
}
layer {
name: "conv1_1"
type: "Convolution"
bottom: "data"
top: "conv1_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 100
kernel_size: 3
stride: 1
}
}
layer {
name: "relu1_1"
type: "ReLU"
bottom: "conv1_1"
top: "conv1_1"
}
layer {
name: "conv1_2"
type: "Convolution"
bottom: "conv1_1"
top: "conv1_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu1_2"
type: "ReLU"
bottom: "conv1_2"
top: "conv1_2"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1_2"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv2_1"
type: "Convolution"
bottom: "pool1"
top: "conv2_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu2_1"
type: "ReLU"
bottom: "conv2_1"
top: "conv2_1"
}
layer {
name: "conv2_2"
type: "Convolution"
bottom: "conv2_1"
top: "conv2_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu2_2"
type: "ReLU"
bottom: "conv2_2"
top: "conv2_2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2_2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv3_1"
type: "Convolution"
bottom: "pool2"
top: "conv3_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu3_1"
type: "ReLU"
bottom: "conv3_1"
top: "conv3_1"
}
layer {
name: "conv3_2"
type: "Convolution"
bottom: "conv3_1"
top: "conv3_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu3_2"
type: "ReLU"
bottom: "conv3_2"
top: "conv3_2"
}
layer {
name: "conv3_3"
type: "Convolution"
bottom: "conv3_2"
top: "conv3_3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu3_3"
type: "ReLU"
bottom: "conv3_3"
top: "conv3_3"
}
layer {
name: "pool3"
type: "Pooling"
bottom: "conv3_3"
top: "pool3"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv4_1"
type: "Convolution"
bottom: "pool3"
top: "conv4_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu4_1"
type: "ReLU"
bottom: "conv4_1"
top: "conv4_1"
}
layer {
name: "conv4_2"
type: "Convolution"
bottom: "conv4_1"
top: "conv4_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu4_2"
type: "ReLU"
bottom: "conv4_2"
top: "conv4_2"
}
layer {
name: "conv4_3"
type: "Convolution"
bottom: "conv4_2"
top: "conv4_3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu4_3"
type: "ReLU"
bottom: "conv4_3"
top: "conv4_3"
}
layer {
name: "pool4"
type: "Pooling"
bottom: "conv4_3"
top: "pool4"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv5_1"
type: "Convolution"
bottom: "pool4"
top: "conv5_1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu5_1"
type: "ReLU"
bottom: "conv5_1"
top: "conv5_1"
}
layer {
name: "conv5_2"
type: "Convolution"
bottom: "conv5_1"
top: "conv5_2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu5_2"
type: "ReLU"
bottom: "conv5_2"
top: "conv5_2"
}
layer {
name: "conv5_3"
type: "Convolution"
bottom: "conv5_2"
top: "conv5_3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
}
}
layer {
name: "relu5_3"
type: "ReLU"
bottom: "conv5_3"
top: "conv5_3"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5_3"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "fc6"
type: "Convolution"
bottom: "pool5"
top: "fc6"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 4096
pad: 0
kernel_size: 7
stride: 1
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "Convolution"
bottom: "fc6"
top: "fc7"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 4096
pad: 0
kernel_size: 1
stride: 1
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "score_fr"
type: "Convolution"
bottom: "fc7"
top: "score_fr"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "upscore2"
type: "Deconvolution"
bottom: "score_fr"
top: "upscore2"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 4
stride: 2
}
}
layer {
name: "score_pool4"
type: "Convolution"
bottom: "pool4"
top: "score_pool4"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "score_pool4c"
type: "Crop"
bottom: "score_pool4"
bottom: "upscore2"
top: "score_pool4c"
crop_param {
axis: 2
offset: 5
}
}
layer {
name: "fuse_pool4"
type: "Eltwise"
bottom: "upscore2"
bottom: "score_pool4c"
top: "fuse_pool4"
eltwise_param {
operation: SUM
}
}
layer {
name: "upscore_pool4"
type: "Deconvolution"
bottom: "fuse_pool4"
top: "upscore_pool4"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 4
stride: 2
}
}
layer {
name: "score_pool3"
type: "Convolution"
bottom: "pool3"
top: "score_pool3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "score_pool3c"
type: "Crop"
bottom: "score_pool3"
bottom: "upscore_pool4"
top: "score_pool3c"
crop_param {
axis: 2
offset: 9
}
}
layer {
name: "fuse_pool3"
type: "Eltwise"
bottom: "upscore_pool4"
bottom: "score_pool3c"
top: "fuse_pool3"
eltwise_param {
operation: SUM
}
}
layer {
name: "upscore8"
type: "Deconvolution"
bottom: "fuse_pool3"
top: "upscore8"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 16
stride: 8
}
}
layer {
name: "score"
type: "Crop"
bottom: "upscore8"
bottom: "data"
top: "score"
crop_param {
axis: 2
offset: 31
}
}
layer {
name: "accuracy"
type: "Accuracy"
bottom: "score"
bottom: "label"
top: "accuracy"
include {
phase: TEST
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "score"
bottom: "label"
top: "loss"
loss_param {
ignore_label: 255
normalize: false
}
}
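Note that every Deconvolution layer above has `lr_mult: 0` and `bias_term: false`: the upsampling kernels are frozen, so they must be filled with sensible weights before training. The official FCN code does this with a bilinear interpolation kernel (the `upsample_filt` routine in its `surgery.py`); a self-contained sketch of that kernel generator:

```python
import numpy as np

def upsample_filt(size):
    """2-D bilinear interpolation kernel of shape (size, size),
    as used by the official FCN surgery.py to initialize the
    frozen (lr_mult: 0, bias_term: false) Deconvolution weights."""
    factor = (size + 1) // 2
    center = factor - 0.5 if size % 2 == 0 else factor - 1
    og = np.ogrid[:size, :size]
    return ((1 - abs(og[0] - center) / factor) *
            (1 - abs(og[1] - center) / factor))

# The 4x4 kernel for the stride-2 layers (upscore2, upscore_pool4);
# upscore8 would use upsample_filt(16) for its 16x16/stride-8 kernel.
print(upsample_filt(4))
```

If the deconvolution weights are instead left at zero, the frozen layers output all zeros and the network cannot learn, so this net-surgery step (or copying weights from a pretrained FCN model) is essential.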
I1213 12:14:08.702296 1380 layer_factory.hpp:77] Creating layer data
I1213 12:14:08.703297 1380 net.cpp:100] Creating Layer data
I1213 12:14:08.704798 1380 net.cpp:418] data -> data
I1213 12:14:08.705298 7864 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:08.706300 1380 data_transformer.cpp:25] Loading mean file from: G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val_mean.binaryproto
I1213 12:14:08.707300 7864 db_lmdb.cpp:40] Opened lmdb G:/interest_of_imags_for_recognation/VOC2007/Resize224/Img_val
I1213 12:14:08.709803 1380 data_layer.cpp:41] output data size: 1,3,224,224
I1213 12:14:08.715806 1380 net.cpp:150] Setting up data
I1213 12:14:08.716306 1380 net.cpp:157] Top shape: 1 3 224 224 (150528)
I1213 12:14:08.716807 1380 net.cpp:165] Memory required for data: 602112
I1213 12:14:08.716807 1380 layer_factory.hpp:77] Creating layer data_data_0_split
I1213 12:14:08.717808 13228 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:08.718808 1380 net.cpp:100] Creating Layer data_data_0_split
I1213 12:14:08.720309 1380 net.cpp:444] data_data_0_split <- data
I1213 12:14:08.722811 1380 net.cpp:418] data_data_0_split -> data_data_0_split_0
I1213 12:14:08.723311 1380 net.cpp:418] data_data_0_split -> data_data_0_split_1
I1213 12:14:08.723311 1380 net.cpp:150] Setting up data_data_0_split
I1213 12:14:08.723811 1380 net.cpp:157] Top shape: 1 3 224 224 (150528)
I1213 12:14:08.724812 1380 net.cpp:157] Top shape: 1 3 224 224 (150528)
I1213 12:14:08.724812 1380 net.cpp:165] Memory required for data: 1806336
I1213 12:14:08.724812 1380 layer_factory.hpp:77] Creating layer label
I1213 12:14:08.725312 1380 net.cpp:100] Creating Layer label
I1213 12:14:08.725813 1380 net.cpp:418] label -> label
I1213 12:14:08.727314 1380 data_transformer.cpp:25] Loading mean file from: G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val_mean.binaryproto
I1213 12:14:08.727814 2624 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:08.730816 2624 db_lmdb.cpp:40] Opened lmdb G:/interest_of_imags_for_recognation/VOC2007/Resize224/Label_val
I1213 12:14:08.731317 1380 data_layer.cpp:41] output data size: 1,1,224,224
I1213 12:14:08.733319 1380 net.cpp:150] Setting up label
I1213 12:14:08.733319 1380 net.cpp:157] Top shape: 1 1 224 224 (50176)
I1213 12:14:08.734318 5184 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I1213 12:14:08.734819 1380 net.cpp:165] Memory required for data: 2007040
I1213 12:14:08.736820 1380 layer_factory.hpp:77] Creating layer label_label_0_split
I1213 12:14:08.737321 1380 net.cpp:100] Creating Layer label_label_0_split
I1213 12:14:08.738822 1380 net.cpp:444] label_label_0_split <- label
I1213 12:14:08.739823 1380 net.cpp:418] label_label_0_split -> label_label_0_split_0
I1213 12:14:08.739823 1380 net.cpp:418] label_label_0_split -> label_label_0_split_1
I1213 12:14:08.740324 1380 net.cpp:150] Setting up label_label_0_split
I1213 12:14:08.740823 1380 net.cpp:157] Top shape: 1 1 224 224 (50176)
I1213 12:14:08.741324 1380 net.cpp:157] Top shape: 1 1 224 224 (50176)
I1213 12:14:08.742324 1380 net.cpp:165] Memory required for data: 2408448
I1213 12:14:08.743825 1380 layer_factory.hpp:77] Creating layer conv1_1
I1213 12:14:08.744827 1380 net.cpp:100] Creating Layer conv1_1
I1213 12:14:08.745326 1380 net.cpp:444] conv1_1 <- data_data_0_split_0
I1213 12:14:08.746327 1380 net.cpp:418] conv1_1 -> conv1_1
I1213 12:14:08.749830 1380 net.cpp:150] Setting up conv1_1
I1213 12:14:08.749830 1380 net.cpp:157] Top shape: 1 64 422 422 (11397376)
I1213 12:14:08.750830 1380 net.cpp:165] Memory required for data: 47997952
I1213 12:14:08.751332 1380 layer_factory.hpp:77] Creating layer relu1_1
I1213 12:14:08.751832 1380 net.cpp:100] Creating Layer relu1_1
I1213 12:14:08.752832 1380 net.cpp:444] relu1_1 <- conv1_1
I1213 12:14:08.753332 1380 net.cpp:405] relu1_1 -> conv1_1 (in-place)
I1213 12:14:08.756836 1380 net.cpp:150] Setting up relu1_1
I1213 12:14:08.757336 1380 net.cpp:157] Top shape: 1 64 422 422 (11397376)
I1213 12:14:08.757835 1380 net.cpp:165] Memory required for data: 93587456
I1213 12:14:08.760339 1380 layer_factory.hpp:77] Creating layer conv1_2
I1213 12:14:08.761338 1380 net.cpp:100] Creating Layer conv1_2
I1213 12:14:08.761838 1380 net.cpp:444] conv1_2 <- conv1_1
I1213 12:14:08.762339 1380 net.cpp:418] conv1_2 -> conv1_2
I1213 12:14:08.767343 1380 net.cpp:150] Setting up conv1_2
I1213 12:14:08.767843 1380 net.cpp:157] Top shape: 1 64 422 422 (11397376)
I1213 12:14:08.768343 1380 net.cpp:165] Memory required for data: 139176960
I1213 12:14:08.769345 1380 layer_factory.hpp:77] Creating layer relu1_2
I1213 12:14:08.769845 1380 net.cpp:100] Creating Layer relu1_2
I1213 12:14:08.771845 1380 net.cpp:444] relu1_2 <- conv1_2
I1213 12:14:08.772346 1380 net.cpp:405] relu1_2 -> conv1_2 (in-place)
I1213 12:14:08.775348 1380 net.cpp:150] Setting up relu1_2
I1213 12:14:08.775849 1380 net.cpp:157] Top shape: 1 64 422 422 (11397376)
I1213 12:14:08.776350 1380 net.cpp:165] Memory required for data: 184766464
I1213 12:14:08.777349 1380 layer_factory.hpp:77] Creating layer pool1
I1213 12:14:08.778350 1380 net.cpp:100] Creating Layer pool1
I1213 12:14:08.778851 1380 net.cpp:444] pool1 <- conv1_2
I1213 12:14:08.779851 1380 net.cpp:418] pool1 -> pool1
I1213 12:14:08.780853 1380 net.cpp:150] Setting up pool1
I1213 12:14:08.781352 1380 net.cpp:157] Top shape: 1 64 211 211 (2849344)
I1213 12:14:08.782353 1380 net.cpp:165] Memory required for data: 196163840
I1213 12:14:08.782853 1380 layer_factory.hpp:77] Creating layer conv2_1
I1213 12:14:08.783854 1380 net.cpp:100] Creating Layer conv2_1
I1213 12:14:08.784854 1380 net.cpp:444] conv2_1 <- pool1
I1213 12:14:08.785356 1380 net.cpp:418] conv2_1 -> conv2_1
I1213 12:14:08.791860 1380 net.cpp:150] Setting up conv2_1
I1213 12:14:08.791860 1380 net.cpp:157] Top shape: 1 128 211 211 (5698688)
I1213 12:14:08.792861 1380 net.cpp:165] Memory required for data: 218958592
I1213 12:14:08.793861 1380 layer_factory.hpp:77] Creating layer relu2_1
I1213 12:14:08.794363 1380 net.cpp:100] Creating Layer relu2_1
I1213 12:14:08.794862 1380 net.cpp:444] relu2_1 <- conv2_1
I1213 12:14:08.795362 1380 net.cpp:405] relu2_1 -> conv2_1 (in-place)
I1213 12:14:08.796363 1380 net.cpp:150] Setting up relu2_1
I1213 12:14:08.796363 1380 net.cpp:157] Top shape: 1 128 211 211 (5698688)
I1213 12:14:08.796864 1380 net.cpp:165] Memory required for data: 241753344
I1213 12:14:08.797363 1380 layer_factory.hpp:77] Creating layer conv2_2
I1213 12:14:08.797864 1380 net.cpp:100] Creating Layer conv2_2
I1213 12:14:08.798364 1380 net.cpp:444] conv2_2 <- conv2_1
I1213 12:14:08.798864 1380 net.cpp:418] conv2_2 -> conv2_2
I1213 12:14:08.802367 1380 net.cpp:150] Setting up conv2_2
I1213 12:14:08.802367 1380 net.cpp:157] Top shape: 1 128 211 211 (5698688)
I1213 12:14:08.803367 1380 net.cpp:165] Memory required for data: 264548096
I1213 12:14:08.803869 1380 layer_factory.hpp:77] Creating layer relu2_2
I1213 12:14:08.804869 1380 net.cpp:100] Creating Layer relu2_2
I1213 12:14:08.807371 1380 net.cpp:444] relu2_2 <- conv2_2
I1213 12:14:08.808372 1380 net.cpp:405] relu2_2 -> conv2_2 (in-place)
I1213 12:14:08.809875 1380 net.cpp:150] Setting up relu2_2
I1213 12:14:08.810374 1380 net.cpp:157] Top shape: 1 128 211 211 (5698688)
I1213 12:14:08.810874 1380 net.cpp:165] Memory required for data: 287342848
I1213 12:14:08.811373 1380 layer_factory.hpp:77] Creating layer pool2
I1213 12:14:08.811874 1380 net.cpp:100] Creating Layer pool2
I1213 12:14:08.812374 1380 net.cpp:444] pool2 <- conv2_2
I1213 12:14:08.812875 1380 net.cpp:418] pool2 -> pool2
I1213 12:14:08.813375 1380 net.cpp:150] Setting up pool2
I1213 12:14:08.813875 1380 net.cpp:157] Top shape: 1 128 106 106 (1438208)
I1213 12:14:08.814376 1380 net.cpp:165] Memory required for data: 293095680
I1213 12:14:08.814877 1380 layer_factory.hpp:77] Creating layer conv3_1
I1213 12:14:08.815376 1380 net.cpp:100] Creating Layer conv3_1
I1213 12:14:08.815877 1380 net.cpp:444] conv3_1 <- pool2
I1213 12:14:08.816377 1380 net.cpp:418] conv3_1 -> conv3_1
I1213 12:14:08.819380 1380 net.cpp:150] Setting up conv3_1
I1213 12:14:08.819380 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.819880 1380 net.cpp:165] Memory required for data: 304601344
I1213 12:14:08.822382 1380 layer_factory.hpp:77] Creating layer relu3_1
I1213 12:14:08.823384 1380 net.cpp:100] Creating Layer relu3_1
I1213 12:14:08.823884 1380 net.cpp:444] relu3_1 <- conv3_1
I1213 12:14:08.824383 1380 net.cpp:405] relu3_1 -> conv3_1 (in-place)
I1213 12:14:08.826386 1380 net.cpp:150] Setting up relu3_1
I1213 12:14:08.826886 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.827386 1380 net.cpp:165] Memory required for data: 316107008
I1213 12:14:08.828387 1380 layer_factory.hpp:77] Creating layer conv3_2
I1213 12:14:08.828887 1380 net.cpp:100] Creating Layer conv3_2
I1213 12:14:08.829887 1380 net.cpp:444] conv3_2 <- conv3_1
I1213 12:14:08.830387 1380 net.cpp:418] conv3_2 -> conv3_2
I1213 12:14:08.838393 1380 net.cpp:150] Setting up conv3_2
I1213 12:14:08.838393 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.838893 1380 net.cpp:165] Memory required for data: 327612672
I1213 12:14:08.840395 1380 layer_factory.hpp:77] Creating layer relu3_2
I1213 12:14:08.840894 1380 net.cpp:100] Creating Layer relu3_2
I1213 12:14:08.841395 1380 net.cpp:444] relu3_2 <- conv3_2
I1213 12:14:08.841895 1380 net.cpp:405] relu3_2 -> conv3_2 (in-place)
I1213 12:14:08.842896 1380 net.cpp:150] Setting up relu3_2
I1213 12:14:08.842896 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.843397 1380 net.cpp:165] Memory required for data: 339118336
I1213 12:14:08.844398 1380 layer_factory.hpp:77] Creating layer conv3_3
I1213 12:14:08.844898 1380 net.cpp:100] Creating Layer conv3_3
I1213 12:14:08.845397 1380 net.cpp:444] conv3_3 <- conv3_2
I1213 12:14:08.845899 1380 net.cpp:418] conv3_3 -> conv3_3
I1213 12:14:08.850401 1380 net.cpp:150] Setting up conv3_3
I1213 12:14:08.850401 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.851402 1380 net.cpp:165] Memory required for data: 350624000
I1213 12:14:08.851903 1380 layer_factory.hpp:77] Creating layer relu3_3
I1213 12:14:08.852403 1380 net.cpp:100] Creating Layer relu3_3
I1213 12:14:08.852903 1380 net.cpp:444] relu3_3 <- conv3_3
I1213 12:14:08.853404 1380 net.cpp:405] relu3_3 -> conv3_3 (in-place)
I1213 12:14:08.854964 1380 net.cpp:150] Setting up relu3_3
I1213 12:14:08.855406 1380 net.cpp:157] Top shape: 1 256 106 106 (2876416)
I1213 12:14:08.855906 1380 net.cpp:165] Memory required for data: 362129664
I1213 12:14:08.856405 1380 layer_factory.hpp:77] Creating layer pool3
I1213 12:14:08.856906 1380 net.cpp:100] Creating Layer pool3
I1213 12:14:08.857406 1380 net.cpp:444] pool3 <- conv3_3
I1213 12:14:08.857906 1380 net.cpp:418] pool3 -> pool3
I1213 12:14:08.858907 1380 net.cpp:150] Setting up pool3
I1213 12:14:08.858907 1380 net.cpp:157] Top shape: 1 256 53 53 (719104)
I1213 12:14:08.859908 1380 net.cpp:165] Memory required for data: 365006080
I1213 12:14:08.860409 1380 layer_factory.hpp:77] Creating layer pool3_pool3_0_split
I1213 12:14:08.860909 1380 net.cpp:100] Creating Layer pool3_pool3_0_split
I1213 12:14:08.860909 1380 net.cpp:444] pool3_pool3_0_split <- pool3
I1213 12:14:08.861409 1380 net.cpp:418] pool3_pool3_0_split -> pool3_pool3_0_split_0
I1213 12:14:08.861910 1380 net.cpp:418] pool3_pool3_0_split -> pool3_pool3_0_split_1
I1213 12:14:08.862910 1380 net.cpp:150] Setting up pool3_pool3_0_split
I1213 12:14:08.862910 1380 net.cpp:157] Top shape: 1 256 53 53 (719104)
I1213 12:14:08.863410 1380 net.cpp:157] Top shape: 1 256 53 53 (719104)
I1213 12:14:08.863910 1380 net.cpp:165] Memory required for data: 370758912
I1213 12:14:08.864411 1380 layer_factory.hpp:77] Creating layer conv4_1
I1213 12:14:08.864912 1380 net.cpp:100] Creating Layer conv4_1
I1213 12:14:08.865412 1380 net.cpp:444] conv4_1 <- pool3_pool3_0_split_0
I1213 12:14:08.865913 1380 net.cpp:418] conv4_1 -> conv4_1
I1213 12:14:08.871917 1380 net.cpp:150] Setting up conv4_1
I1213 12:14:08.871917 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.872918 1380 net.cpp:165] Memory required for data: 376511744
I1213 12:14:08.873417 1380 layer_factory.hpp:77] Creating layer relu4_1
I1213 12:14:08.874919 1380 net.cpp:100] Creating Layer relu4_1
I1213 12:14:08.875419 1380 net.cpp:444] relu4_1 <- conv4_1
I1213 12:14:08.875921 1380 net.cpp:405] relu4_1 -> conv4_1 (in-place)
I1213 12:14:08.877421 1380 net.cpp:150] Setting up relu4_1
I1213 12:14:08.877421 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.877921 1380 net.cpp:165] Memory required for data: 382264576
I1213 12:14:08.878422 1380 layer_factory.hpp:77] Creating layer conv4_2
I1213 12:14:08.878922 1380 net.cpp:100] Creating Layer conv4_2
I1213 12:14:08.879422 1380 net.cpp:444] conv4_2 <- conv4_1
I1213 12:14:08.879923 1380 net.cpp:418] conv4_2 -> conv4_2
I1213 12:14:08.885927 1380 net.cpp:150] Setting up conv4_2
I1213 12:14:08.885927 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.886929 1380 net.cpp:165] Memory required for data: 388017408
I1213 12:14:08.886929 1380 layer_factory.hpp:77] Creating layer relu4_2
I1213 12:14:08.886929 1380 net.cpp:100] Creating Layer relu4_2
I1213 12:14:08.886929 1380 net.cpp:444] relu4_2 <- conv4_2
I1213 12:14:08.886929 1380 net.cpp:405] relu4_2 -> conv4_2 (in-place)
I1213 12:14:08.887929 1380 net.cpp:150] Setting up relu4_2
I1213 12:14:08.888429 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.888929 1380 net.cpp:165] Memory required for data: 393770240
I1213 12:14:08.890933 1380 layer_factory.hpp:77] Creating layer conv4_3
I1213 12:14:08.891433 1380 net.cpp:100] Creating Layer conv4_3
I1213 12:14:08.891433 1380 net.cpp:444] conv4_3 <- conv4_2
I1213 12:14:08.891433 1380 net.cpp:418] conv4_3 -> conv4_3
I1213 12:14:08.897935 1380 net.cpp:150] Setting up conv4_3
I1213 12:14:08.897935 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.898437 1380 net.cpp:165] Memory required for data: 399523072
I1213 12:14:08.898936 1380 layer_factory.hpp:77] Creating layer relu4_3
I1213 12:14:08.899437 1380 net.cpp:100] Creating Layer relu4_3
I1213 12:14:08.899937 1380 net.cpp:444] relu4_3 <- conv4_3
I1213 12:14:08.900437 1380 net.cpp:405] relu4_3 -> conv4_3 (in-place)
I1213 12:14:08.901938 1380 net.cpp:150] Setting up relu4_3
I1213 12:14:08.902438 1380 net.cpp:157] Top shape: 1 512 53 53 (1438208)
I1213 12:14:08.902938 1380 net.cpp:165] Memory required for data: 405275904
I1213 12:14:08.903439 1380 layer_factory.hpp:77] Creating layer pool4
I1213 12:14:08.903939 1380 net.cpp:100] Creating Layer pool4
I1213 12:14:08.904940 1380 net.cpp:444] pool4 <- conv4_3
I1213 12:14:08.907443 1380 net.cpp:418] pool4 -> pool4
I1213 12:14:08.907443 1380 net.cpp:150] Setting up pool4
I1213 12:14:08.907943 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.908443 1380 net.cpp:165] Memory required for data: 406768896
I1213 12:14:08.908443 1380 layer_factory.hpp:77] Creating layer pool4_pool4_0_split
I1213 12:14:08.908443 1380 net.cpp:100] Creating Layer pool4_pool4_0_split
I1213 12:14:08.908943 1380 net.cpp:444] pool4_pool4_0_split <- pool4
I1213 12:14:08.909443 1380 net.cpp:418] pool4_pool4_0_split -> pool4_pool4_0_split_0
I1213 12:14:08.909945 1380 net.cpp:418] pool4_pool4_0_split -> pool4_pool4_0_split_1
I1213 12:14:08.910444 1380 net.cpp:150] Setting up pool4_pool4_0_split
I1213 12:14:08.910944 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.911445 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.911445 1380 net.cpp:165] Memory required for data: 409754880
I1213 12:14:08.911445 1380 layer_factory.hpp:77] Creating layer conv5_1
I1213 12:14:08.911945 1380 net.cpp:100] Creating Layer conv5_1
I1213 12:14:08.912446 1380 net.cpp:444] conv5_1 <- pool4_pool4_0_split_0
I1213 12:14:08.912946 1380 net.cpp:418] conv5_1 -> conv5_1
I1213 12:14:08.919451 1380 net.cpp:150] Setting up conv5_1
I1213 12:14:08.919451 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.919951 1380 net.cpp:165] Memory required for data: 411247872
I1213 12:14:08.922454 1380 layer_factory.hpp:77] Creating layer relu5_1
I1213 12:14:08.922953 1380 net.cpp:100] Creating Layer relu5_1
I1213 12:14:08.923954 1380 net.cpp:444] relu5_1 <- conv5_1
I1213 12:14:08.923954 1380 net.cpp:405] relu5_1 -> conv5_1 (in-place)
I1213 12:14:08.924454 1380 net.cpp:150] Setting up relu5_1
I1213 12:14:08.924955 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.925456 1380 net.cpp:165] Memory required for data: 412740864
I1213 12:14:08.925956 1380 layer_factory.hpp:77] Creating layer conv5_2
I1213 12:14:08.926456 1380 net.cpp:100] Creating Layer conv5_2
I1213 12:14:08.926956 1380 net.cpp:444] conv5_2 <- conv5_1
I1213 12:14:08.927458 1380 net.cpp:418] conv5_2 -> conv5_2
I1213 12:14:08.933961 1380 net.cpp:150] Setting up conv5_2
I1213 12:14:08.933961 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.934463 1380 net.cpp:165] Memory required for data: 414233856
I1213 12:14:08.934962 1380 layer_factory.hpp:77] Creating layer relu5_2
I1213 12:14:08.935462 1380 net.cpp:100] Creating Layer relu5_2
I1213 12:14:08.938464 1380 net.cpp:444] relu5_2 <- conv5_2
I1213 12:14:08.938464 1380 net.cpp:405] relu5_2 -> conv5_2 (in-place)
I1213 12:14:08.939966 1380 net.cpp:150] Setting up relu5_2
I1213 12:14:08.940466 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.940966 1380 net.cpp:165] Memory required for data: 415726848
I1213 12:14:08.941467 1380 layer_factory.hpp:77] Creating layer conv5_3
I1213 12:14:08.942467 1380 net.cpp:100] Creating Layer conv5_3
I1213 12:14:08.942467 1380 net.cpp:444] conv5_3 <- conv5_2
I1213 12:14:08.942467 1380 net.cpp:418] conv5_3 -> conv5_3
I1213 12:14:08.948472 1380 net.cpp:150] Setting up conv5_3
I1213 12:14:08.948472 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.948973 1380 net.cpp:165] Memory required for data: 417219840
I1213 12:14:08.949973 1380 layer_factory.hpp:77] Creating layer relu5_3
I1213 12:14:08.950474 1380 net.cpp:100] Creating Layer relu5_3
I1213 12:14:08.950973 1380 net.cpp:444] relu5_3 <- conv5_3
I1213 12:14:08.950973 1380 net.cpp:405] relu5_3 -> conv5_3 (in-place)
I1213 12:14:08.951975 1380 net.cpp:150] Setting up relu5_3
I1213 12:14:08.952976 1380 net.cpp:157] Top shape: 1 512 27 27 (373248)
I1213 12:14:08.952976 1380 net.cpp:165] Memory required for data: 418712832
I1213 12:14:08.952976 1380 layer_factory.hpp:77] Creating layer pool5
I1213 12:14:08.952976 1380 net.cpp:100] Creating Layer pool5
I1213 12:14:08.952976 1380 net.cpp:444] pool5 <- conv5_3
I1213 12:14:08.953476 1380 net.cpp:418] pool5 -> pool5
I1213 12:14:08.953476 1380 net.cpp:150] Setting up pool5
I1213 12:14:08.954977 1380 net.cpp:157] Top shape: 1 512 14 14 (100352)
I1213 12:14:08.955476 1380 net.cpp:165] Memory required for data: 419114240
I1213 12:14:08.955977 1380 layer_factory.hpp:77] Creating layer fc6
I1213 12:14:08.956979 1380 net.cpp:100] Creating Layer fc6
I1213 12:14:08.957479 1380 net.cpp:444] fc6 <- pool5
I1213 12:14:08.957979 1380 net.cpp:418] fc6 -> fc6
I1213 12:14:09.144121 1380 net.cpp:150] Setting up fc6
I1213 12:14:09.144121 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:09.144611 1380 net.cpp:165] Memory required for data: 420162816
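The spatial sizes in the log (pool1..pool5 giving 211, 106, 53, 27, 14, then fc6 giving 8×8) can be reproduced with Caffe's output-size formulas. Convolution rounds down, pooling rounds up, and the 422→211 jump implies the standard FCN trick of `pad: 100` on `conv1_1` (an assumption taken from the reference FCN prototxt, but consistent with every shape printed above). A minimal sketch:

```python
import math

def conv_out(n, k, pad=0, stride=1):
    # Caffe Convolution: floor((n + 2*pad - k) / stride) + 1
    return (n + 2 * pad - k) // stride + 1

def pool_out(n, k, pad=0, stride=1):
    # Caffe Pooling rounds UP, so border pixels are still pooled
    return math.ceil((n + 2 * pad - k) / stride) + 1

n = 224                        # resized input, as in this experiment
n = conv_out(n, 3, pad=100)    # conv1_1 with pad 100 -> 422
n = conv_out(n, 3, pad=1)      # conv1_2 (pad 1 keeps the size)
sizes = []
for _ in range(5):             # pool1..pool5; conv2_x..conv5_x keep size via pad 1
    n = pool_out(n, 2, stride=2)
    sizes.append(n)
n = conv_out(n, 7)             # fc6 rewritten as a 7x7 convolution
sizes.append(n)
print(sizes)                   # [211, 106, 53, 27, 14, 8]
```

The printed sizes match the `Top shape` lines of the log exactly, which is a quick way to sanity-check a modified prototxt before training.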
I1213 12:14:09.145612 1380 layer_factory.hpp:77] Creating layer relu6
I1213 12:14:09.146113 1380 net.cpp:100] Creating Layer relu6
I1213 12:14:09.146613 1380 net.cpp:444] relu6 <- fc6
I1213 12:14:09.147114 1380 net.cpp:405] relu6 -> fc6 (in-place)
I1213 12:14:09.148114 1380 net.cpp:150] Setting up relu6
I1213 12:14:09.148114 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:09.148614 1380 net.cpp:165] Memory required for data: 421211392
I1213 12:14:09.149114 1380 layer_factory.hpp:77] Creating layer drop6
I1213 12:14:09.149616 1380 net.cpp:100] Creating Layer drop6
I1213 12:14:09.150615 1380 net.cpp:444] drop6 <- fc6
I1213 12:14:09.151116 1380 net.cpp:405] drop6 -> fc6 (in-place)
I1213 12:14:09.151617 1380 net.cpp:150] Setting up drop6
I1213 12:14:09.152117 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:09.153617 1380 net.cpp:165] Memory required for data: 422259968
I1213 12:14:09.154119 1380 layer_factory.hpp:77] Creating layer fc7
I1213 12:14:09.154618 1380 net.cpp:100] Creating Layer fc7
I1213 12:14:09.155119 1380 net.cpp:444] fc7 <- fc6
I1213 12:14:09.155619 1380 net.cpp:418] fc7 -> fc7
I1213 12:14:09.190145 1380 net.cpp:150] Setting up fc7
I1213 12:14:09.190145 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:09.191145 1380 net.cpp:165] Memory required for data: 423308544
I1213 12:14:09.191645 1380 layer_factory.hpp:77] Creating layer relu7
I1213 12:14:09.192145 1380 net.cpp:100] Creating Layer relu7
I1213 12:14:09.192646 1380 net.cpp:444] relu7 <- fc7
I1213 12:14:09.193145 1380 net.cpp:405] relu7 -> fc7 (in-place)
I1213 12:14:09.194146 1380 net.cpp:150] Setting up relu7
I1213 12:14:09.194146 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:09.194648 1380 net.cpp:165] Memory required for data: 424357120
I1213 12:14:09.195147 1380 layer_factory.hpp:77] Creating layer drop7
I1213 12:14:09.195647 1380 net.cpp:100] Creating Layer drop7
I1213 12:14:09.196148 1380 net.cpp:444] drop7 <- fc7
I1213 12:14:09.196648 1380 net.cpp:405] drop7 -> fc7 (in-place)
I1213 12:14:09.197149 1380 net.cpp:150] Setting up drop7
I1213 12:14:09.197649 1380 net.cpp:157] Top shape: 1 4096 8 8 (262144)
I1213 12:14:09.198149 1380 net.cpp:165] Memory required for data: 425405696
I1213 12:14:09.198650 1380 layer_factory.hpp:77] Creating layer score_fr
I1213 12:14:09.200150 1380 net.cpp:100] Creating Layer score_fr
I1213 12:14:09.200651 1380 net.cpp:444] score_fr <- fc7
I1213 12:14:09.201653 1380 net.cpp:418] score_fr -> score_fr
I1213 12:14:09.203654 1380 net.cpp:150] Setting up score_fr
I1213 12:14:09.204154 1380 net.cpp:157] Top shape: 1 21 8 8 (1344)
I1213 12:14:09.204654 1380 net.cpp:165] Memory required for data: 425411072
I1213 12:14:09.205155 1380 layer_factory.hpp:77] Creating layer upscore2
I1213 12:14:09.205656 1380 net.cpp:100] Creating Layer upscore2
I1213 12:14:09.206156 1380 net.cpp:444] upscore2 <- score_fr
I1213 12:14:09.207156 1380 net.cpp:418] upscore2 -> upscore2
I1213 12:14:09.207656 1380 net.cpp:150] Setting up upscore2
I1213 12:14:09.208156 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:09.208657 1380 net.cpp:165] Memory required for data: 425438288
I1213 12:14:09.209157 1380 layer_factory.hpp:77] Creating layer upscore2_upscore2_0_split
I1213 12:14:09.209657 1380 net.cpp:100] Creating Layer upscore2_upscore2_0_split
I1213 12:14:09.210157 1380 net.cpp:444] upscore2_upscore2_0_split <- upscore2
I1213 12:14:09.210659 1380 net.cpp:418] upscore2_upscore2_0_split -> upscore2_upscore2_0_split_0
I1213 12:14:09.211158 1380 net.cpp:418] upscore2_upscore2_0_split -> upscore2_upscore2_0_split_1
I1213 12:14:09.211659 1380 net.cpp:150] Setting up upscore2_upscore2_0_split
I1213 12:14:09.212159 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:09.212661 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:09.213160 1380 net.cpp:165] Memory required for data: 425492720
I1213 12:14:09.213660 1380 layer_factory.hpp:77] Creating layer score_pool4
I1213 12:14:09.214160 1380 net.cpp:100] Creating Layer score_pool4
I1213 12:14:09.216163 1380 net.cpp:444] score_pool4 <- pool4_pool4_0_split_1
I1213 12:14:09.216663 1380 net.cpp:418] score_pool4 -> score_pool4
I1213 12:14:09.219164 1380 net.cpp:150] Setting up score_pool4
I1213 12:14:09.219666 1380 net.cpp:157] Top shape: 1 21 27 27 (15309)
I1213 12:14:09.220165 1380 net.cpp:165] Memory required for data: 425553956
I1213 12:14:09.220665 1380 layer_factory.hpp:77] Creating layer score_pool4c
I1213 12:14:09.221166 1380 net.cpp:100] Creating Layer score_pool4c
I1213 12:14:09.221667 1380 net.cpp:444] score_pool4c <- score_pool4
I1213 12:14:09.222167 1380 net.cpp:444] score_pool4c <- upscore2_upscore2_0_split_0
I1213 12:14:09.222667 1380 net.cpp:418] score_pool4c -> score_pool4c
I1213 12:14:09.223167 1380 net.cpp:150] Setting up score_pool4c
I1213 12:14:09.223667 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:09.224169 1380 net.cpp:165] Memory required for data: 425581172
I1213 12:14:09.224668 1380 layer_factory.hpp:77] Creating layer fuse_pool4
I1213 12:14:09.225168 1380 net.cpp:100] Creating Layer fuse_pool4
I1213 12:14:09.225668 1380 net.cpp:444] fuse_pool4 <- upscore2_upscore2_0_split_1
I1213 12:14:09.226169 1380 net.cpp:444] fuse_pool4 <- score_pool4c
I1213 12:14:09.226670 1380 net.cpp:418] fuse_pool4 -> fuse_pool4
I1213 12:14:09.227170 1380 net.cpp:150] Setting up fuse_pool4
I1213 12:14:09.227670 1380 net.cpp:157] Top shape: 1 21 18 18 (6804)
I1213 12:14:09.228171 1380 net.cpp:165] Memory required for data: 425608388
I1213 12:14:09.228672 1380 layer_factory.hpp:77] Creating layer upscore_pool4
I1213 12:14:09.229171 1380 net.cpp:100] Creating Layer upscore_pool4
I1213 12:14:09.229672 1380 net.cpp:444] upscore_pool4 <- fuse_pool4
I1213 12:14:09.231673 1380 net.cpp:418] upscore_pool4 -> upscore_pool4
I1213 12:14:09.233175 1380 net.cpp:150] Setting up upscore_pool4
I1213 12:14:09.233175 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:09.233675 1380 net.cpp:165] Memory required for data: 425729684
I1213 12:14:09.234175 1380 layer_factory.hpp:77] Creating layer upscore_pool4_upscore_pool4_0_split
I1213 12:14:09.234676 1380 net.cpp:100] Creating Layer upscore_pool4_upscore_pool4_0_split
I1213 12:14:09.235177 1380 net.cpp:444] upscore_pool4_upscore_pool4_0_split <- upscore_pool4
I1213 12:14:09.235677 1380 net.cpp:418] upscore_pool4_upscore_pool4_0_split -> upscore_pool4_upscore_pool4_0_split_0
I1213 12:14:09.236176 1380 net.cpp:418] upscore_pool4_upscore_pool4_0_split -> upscore_pool4_upscore_pool4_0_split_1
I1213 12:14:09.236677 1380 net.cpp:150] Setting up upscore_pool4_upscore_pool4_0_split
I1213 12:14:09.237177 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:09.238178 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:09.238679 1380 net.cpp:165] Memory required for data: 425972276
I1213 12:14:09.239179 1380 layer_factory.hpp:77] Creating layer score_pool3
I1213 12:14:09.239680 1380 net.cpp:100] Creating Layer score_pool3
I1213 12:14:09.240180 1380 net.cpp:444] score_pool3 <- pool3_pool3_0_split_1
I1213 12:14:09.240680 1380 net.cpp:418] score_pool3 -> score_pool3
I1213 12:14:09.243181 1380 net.cpp:150] Setting up score_pool3
I1213 12:14:09.243682 1380 net.cpp:157] Top shape: 1 21 53 53 (58989)
I1213 12:14:09.244182 1380 net.cpp:165] Memory required for data: 426208232
I1213 12:14:09.244684 1380 layer_factory.hpp:77] Creating layer score_pool3c
I1213 12:14:09.245184 1380 net.cpp:100] Creating Layer score_pool3c
I1213 12:14:09.246685 1380 net.cpp:444] score_pool3c <- score_pool3
I1213 12:14:09.247186 1380 net.cpp:444] score_pool3c <- upscore_pool4_upscore_pool4_0_split_0
I1213 12:14:09.247687 1380 net.cpp:418] score_pool3c -> score_pool3c
I1213 12:14:09.248687 1380 net.cpp:150] Setting up score_pool3c
I1213 12:14:09.248687 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:09.249187 1380 net.cpp:165] Memory required for data: 426329528
I1213 12:14:09.250187 1380 layer_factory.hpp:77] Creating layer fuse_pool3
I1213 12:14:09.250687 1380 net.cpp:100] Creating Layer fuse_pool3
I1213 12:14:09.251188 1380 net.cpp:444] fuse_pool3 <- upscore_pool4_upscore_pool4_0_split_1
I1213 12:14:09.251688 1380 net.cpp:444] fuse_pool3 <- score_pool3c
I1213 12:14:09.252189 1380 net.cpp:418] fuse_pool3 -> fuse_pool3
I1213 12:14:09.252689 1380 net.cpp:150] Setting up fuse_pool3
I1213 12:14:09.253190 1380 net.cpp:157] Top shape: 1 21 38 38 (30324)
I1213 12:14:09.253690 1380 net.cpp:165] Memory required for data: 426450824
I1213 12:14:09.254191 1380 layer_factory.hpp:77] Creating layer upscore8
I1213 12:14:09.255190 1380 net.cpp:100] Creating Layer upscore8
I1213 12:14:09.255691 1380 net.cpp:444] upscore8 <- fuse_pool3
I1213 12:14:09.256192 1380 net.cpp:418] upscore8 -> upscore8
I1213 12:14:09.257701 1380 net.cpp:150] Setting up upscore8
I1213 12:14:09.258193 1380 net.cpp:157] Top shape: 1 21 312 312 (2044224)
I1213 12:14:09.258694 1380 net.cpp:165] Memory required for data: 434627720
I1213 12:14:09.259194 1380 layer_factory.hpp:77] Creating layer score
I1213 12:14:09.259694 1380 net.cpp:100] Creating Layer score
I1213 12:14:09.260195 1380 net.cpp:444] score <- upscore8
I1213 12:14:09.260695 1380 net.cpp:444] score <- data_data_0_split_1
I1213 12:14:09.262195 1380 net.cpp:418] score -> score
I1213 12:14:09.263198 1380 net.cpp:150] Setting up score
I1213 12:14:09.263696 1380 net.cpp:157] Top shape: 1 21 224 224 (1053696)
I1213 12:14:09.264197 1380 net.cpp:165] Memory required for data: 438842504
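The decoder sizes (upscore2: 8→18, upscore_pool4: 18→38, upscore8: 38→312, then `score` cropped to 224) follow Caffe's Deconvolution formula. The kernel/stride pairs below (4/2 for the 2× steps, 16/8 for the 8× step) are taken from the standard FCN-8s prototxt, which is an assumption here, but they reproduce the logged shapes:

```python
def deconv_out(n, k, stride, pad=0):
    # Caffe Deconvolution: stride * (n - 1) + k - 2 * pad
    return stride * (n - 1) + k - 2 * pad

h1 = deconv_out(8, 4, 2)        # upscore2:      8  -> 18
h2 = deconv_out(h1, 4, 2)       # upscore_pool4: 18 -> 38
h3 = deconv_out(h2, 16, 8)      # upscore8:      38 -> 312
print(h1, h2, h3)               # 18 38 312
# Crop layers then trim the larger blob to the smaller one's size:
# score_pool4c: 27 -> 18, score_pool3c: 53 -> 38, score: 312 -> 224,
# which removes the border introduced by the pad-100 on conv1_1.
```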
I1213 12:14:09.264698 1380 layer_factory.hpp:77] Creating layer score_score_0_split
I1213 12:14:09.265198 1380 net.cpp:100] Creating Layer score_score_0_split
I1213 12:14:09.265699 1380 net.cpp:444] score_score_0_split <- score
I1213 12:14:09.266199 1380 net.cpp:418] score_score_0_split -> score_score_0_split_0
I1213 12:14:09.266700 1380 net.cpp:418] score_score_0_split -> score_score_0_split_1
I1213 12:14:09.267200 1380 net.cpp:150] Setting up score_score_0_split
I1213 12:14:09.267700 1380 net.cpp:157] Top shape: 1 21 224 224 (1053696)
I1213 12:14:09.268200 1380 net.cpp:157] Top shape: 1 21 224 224 (1053696)
I1213 12:14:09.268700 1380 net.cpp:165] Memory required for data: 447272072
I1213 12:14:09.269201 1380 layer_factory.hpp:77] Creating layer accuracy
I1213 12:14:09.269701 1380 net.cpp:100] Creating Layer accuracy
I1213 12:14:09.270202 1380 net.cpp:444] accuracy <- score_score_0_split_0
I1213 12:14:09.270702 1380 net.cpp:444] accuracy <- label_label_0_split_0
I1213 12:14:09.271703 1380 net.cpp:418] accuracy -> accuracy
I1213 12:14:09.272202 1380 net.cpp:150] Setting up accuracy
I1213 12:14:09.272703 1380 net.cpp:157] Top shape: (1)
I1213 12:14:09.273203 1380 net.cpp:165] Memory required for data: 447272076
I1213 12:14:09.273704 1380 layer_factory.hpp:77] Creating layer loss
I1213 12:14:09.274204 1380 net.cpp:100] Creating Layer loss
I1213 12:14:09.274704 1380 net.cpp:444] loss <- score_score_0_split_1
I1213 12:14:09.275205 1380 net.cpp:444] loss <- label_label_0_split_1
I1213 12:14:09.275707 1380 net.cpp:418] loss -> loss
I1213 12:14:09.276206 1380 layer_factory.hpp:77] Creating layer loss
I1213 12:14:09.279708 1380 net.cpp:150] Setting up loss
I1213 12:14:09.280208 1380 net.cpp:157] Top shape: (1)
I1213 12:14:09.280709 1380 net.cpp:160] with loss weight 1
I1213 12:14:09.281208 1380 net.cpp:165] Memory required for data: 447272080
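Each `Memory required for data` line is simply a running total: the previous total plus the element count of every newly created top blob times 4 bytes (float32); split layers count twice because they produce two tops. Two steps from the log above check out:

```python
# Verify the "Memory required for data" accumulator against the log:
total = 287342848            # logged after conv2_2 (1 x 128 x 211 x 211)
total += 1438208 * 4         # pool2 top: 1 x 128 x 106 x 106
assert total == 293095680    # matches the pool2 log line
total += 2876416 * 4         # conv3_1 top: 1 x 256 x 106 x 106
assert total == 304601344    # matches the conv3_1 log line
print(total)
```

Note this counts only the forward activations, not the weights or the diffs kept for backpropagation, so actual GPU usage is higher.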
I1213 12:14:09.281708 1380 net.cpp:226] loss needs backward computation.
I1213 12:14:09.282209 1380 net.cpp:228] accuracy does not need backward computation.
I1213 12:14:09.282709 1380 net.cpp:226] score_score_0_split needs backward computation.
I1213 12:14:09.283210 1380 net.cpp:226] score needs backward computation.
I1213 12:14:09.283710 1380 net.cpp:226] upscore8 needs backward computation.
I1213 12:14:09.284210 1380 net.cpp:226] fuse_pool3 needs backward computation.
I1213 12:14:09.284711 1380 net.cpp:226] score_pool3c needs backward computation.
I1213 12:14:09.285212 1380 net.cpp:226] score_pool3 needs backward computation.
I1213 12:14:09.285712 1380 net.cpp:226] upscore_pool4_upscore_pool4_0_split needs backward computation.
I1213 12:14:09.286212 1380 net.cpp:226] upscore_pool4 needs backward computation.
I1213 12:14:09.286712 1380 net.cpp:226] fuse_pool4 needs backward computation.
I1213 12:14:09.287214 1380 net.cpp:226] score_pool4c needs backward computation.
I1213 12:14:09.287714 1380 net.cpp:226] score_pool4 needs backward computation.
I1213 12:14:09.288714 1380 net.cpp:226] upscore2_upscore2_0_split needs backward computation.
I1213 12:14:09.289214 1380 net.cpp:226] upscore2 needs backward computation.
I1213 12:14:09.289715 1380 net.cpp:226] score_fr needs backward computation.
I1213 12:14:09.290215 1380 net.cpp:226] drop7 needs backward computation.
I1213 12:14:09.290715 1380 net.cpp:226] relu7 needs backward computation.
I1213 12:14:09.291216 1380 net.cpp:226] fc7 needs backward computation.
I1213 12:14:09.291716 1380 net.cpp:226] drop6 needs backward computation.
I1213 12:14:09.293217 1380 net.cpp:226] relu6 needs backward computation.
I1213 12:14:09.294219 1380 net.cpp:226] fc6 needs backward computation.
I1213 12:14:09.294718 1380 net.cpp:226] pool5 needs backward computation.
I1213 12:14:09.295218 1380 net.cpp:226] relu5_3 needs backward computation.
I1213 12:14:09.295719 1380 net.cpp:226] conv5_3 needs backward computation.
I1213 12:14:09.296221 1380 net.cpp:226] relu5_2 needs backward computation.
I1213 12:14:09.297220 1380 net.cpp:226] conv5_2 needs backward computation.
I1213 12:14:09.297721 1380 net.cpp:226] relu5_1 needs backward computation.
I1213 12:14:09.298221 1380 net.cpp:226] conv5_1 needs backward computation.
I1213 12:14:09.298722 1380 net.cpp:226] pool4_pool4_0_split needs backward computation.
I1213 12:14:09.299222 1380 net.cpp:226] pool4 needs backward computation.
I1213 12:14:09.299722 1380 net.cpp:226] relu4_3 needs backward computation.
I1213 12:14:09.300222 1380 net.cpp:226] conv4_3 needs backward computation.
I1213 12:14:09.300724 1380 net.cpp:226] relu4_2 needs backward computation.
I1213 12:14:09.301223 1380 net.cpp:226] conv4_2 needs backward computation.
I1213 12:14:09.301725 1380 net.cpp:226] relu4_1 needs backward computation.
I1213 12:14:09.302224 1380 net.cpp:226] conv4_1 needs backward computation.
I1213 12:14:09.302724 1380 net.cpp:226] pool3_pool3_0_split needs backward computation.
I1213 12:14:09.303225 1380 net.cpp:226] pool3 needs backward computation.
I1213 12:14:09.303725 1380 net.cpp:226] relu3_3 needs backward computation.
I1213 12:14:09.304225 1380 net.cpp:226] conv3_3 needs backward computation.
I1213 12:14:09.305227 1380 net.cpp:226] relu3_2 needs backward computation.
I1213 12:14:09.305727 1380 net.cpp:226] conv3_2 needs backward computation.
I1213 12:14:09.306226 1380 net.cpp:226] relu3_1 needs backward computation.
I1213 12:14:09.306726 1380 net.cpp:226] conv3_1 needs backward computation.
I1213 12:14:09.307227 1380 net.cpp:226] pool2 needs backward computation.
I1213 12:14:09.309229 1380 net.cpp:226] relu2_2 needs backward computation.
I1213 12:14:09.310231 1380 net.cpp:226] conv2_2 needs backward computation.
I1213 12:14:09.310731 1380 net.cpp:226] relu2_1 needs backward computation.
I1213 12:14:09.311230 1380 net.cpp:226] conv2_1 needs backward computation.
I1213 12:14:09.311731 1380 net.cpp:226] pool1 needs backward computation.
I1213 12:14:09.312232 1380 net.cpp:226] relu1_2 needs backward computation.
I1213 12:14:09.312731 1380 net.cpp:226] conv1_2 needs backward computation.
I1213 12:14:09.313232 1380 net.cpp:226] relu1_1 needs backward computation.
I1213 12:14:09.313732 1380 net.cpp:226] conv1_1 needs backward computation.
I1213 12:14:09.314234 1380 net.cpp:228] label_label_0_split does not need backward computation.
I1213 12:14:09.314733 1380 net.cpp:228] label does not need backward computation.
I1213 12:14:09.315233 1380 net.cpp:228] data_data_0_split does not need backward computation.
I1213 12:14:09.315734 1380 net.cpp:228] data does not need backward computation.
I1213 12:14:09.316234 1380 net.cpp:270] This network produces output accuracy
I1213 12:14:09.316735 1380 net.cpp:270] This network produces output loss
I1213 12:14:09.317235 1380 net.cpp:283] Network initialization done.
I1213 12:14:09.318235 1380 solver.cpp:60] Solver scaffolding done.
I1213 12:14:09.320236 1380 caffe.cpp:155] Finetuning from fcn8s-heavy-pascal.caffemodel
I1213 12:14:13.756211 1380 net.cpp:774] Copying source layer data
I1213 12:14:13.756702 1380 net.cpp:774] Copying source layer data_data_0_split
I1213 12:14:13.757203 1380 net.cpp:774] Copying source layer conv1_1
I1213 12:14:13.757203 1380 net.cpp:774] Copying source layer relu1_1
I1213 12:14:13.757704 1380 net.cpp:774] Copying source layer conv1_2
I1213 12:14:13.757704 1380 net.cpp:774] Copying source layer relu1_2
I1213 12:14:13.758203 1380 net.cpp:774] Copying source layer pool1
I1213 12:14:13.758203 1380 net.cpp:774] Copying source layer conv2_1
I1213 12:14:13.758703 1380 net.cpp:774] Copying source layer relu2_1
I1213 12:14:13.758703 1380 net.cpp:774] Copying source layer conv2_2
I1213 12:14:13.759204 1380 net.cpp:774] Copying source layer relu2_2
I1213 12:14:13.759704 1380 net.cpp:774] Copying source layer pool2
I1213 12:14:13.759704 1380 net.cpp:774] Copying source layer conv3_1
I1213 12:14:13.760705 1380 net.cpp:774] Copying source layer relu3_1
I1213 12:14:13.760705 1380 net.cpp:774] Copying source layer conv3_2
I1213 12:14:13.762207 1380 net.cpp:774] Copying source layer relu3_2
I1213 12:14:13.762207 1380 net.cpp:774] Copying source layer conv3_3
I1213 12:14:13.763208 1380 net.cpp:774] Copying source layer relu3_3
I1213 12:14:13.763707 1380 net.cpp:774] Copying source layer pool3
I1213 12:14:13.763707 1380 net.cpp:774] Copying source layer pool3_pool3_0_split
I1213 12:14:13.764207 1380 net.cpp:774] Copying source layer conv4_1
I1213 12:14:13.766211 1380 net.cpp:774] Copying source layer relu4_1
I1213 12:14:13.767215 1380 net.cpp:774] Copying source layer conv4_2
I1213 12:14:13.771214 1380 net.cpp:774] Copying source layer relu4_2
I1213 12:14:13.771714 1380 net.cpp:774] Copying source layer conv4_3
I1213 12:14:13.774719 1380 net.cpp:774] Copying source layer relu4_3
I1213 12:14:13.775215 1380 net.cpp:774] Copying source layer pool4
I1213 12:14:13.775215 1380 net.cpp:774] Copying source layer pool4_pool4_0_split
I1213 12:14:13.775717 1380 net.cpp:774] Copying source layer conv5_1
I1213 12:14:13.779729 1380 net.cpp:774] Copying source layer relu5_1
I1213 12:14:13.779729 1380 net.cpp:774] Copying source layer conv5_2
I1213 12:14:13.784723 1380 net.cpp:774] Copying source layer relu5_2
I1213 12:14:13.784723 1380 net.cpp:774] Copying source layer conv5_3
I1213 12:14:13.789227 1380 net.cpp:774] Copying source layer relu5_3
I1213 12:14:13.789227 1380 net.cpp:774] Copying source layer pool5
I1213 12:14:13.789726 1380 net.cpp:774] Copying source layer fc6
I1213 12:14:13.927826 1380 net.cpp:774] Copying source layer relu6
I1213 12:14:13.928326 1380 net.cpp:774] Copying source layer drop6
I1213 12:14:13.928825 1380 net.cpp:774] Copying source layer fc7
I1213 12:14:13.949340 1380 net.cpp:774] Copying source layer relu7
I1213 12:14:13.949340 1380 net.cpp:774] Copying source layer drop7
I1213 12:14:13.949841 1380 net.cpp:774] Copying source layer score_fr
I1213 12:14:13.950340 1380 net.cpp:774] Copying source layer upscore2
I1213 12:14:13.950340 1380 net.cpp:774] Copying source layer upscore2_upscore2_0_split
I1213 12:14:13.950840 1380 net.cpp:774] Copying source layer score_pool4
I1213 12:14:13.950840 1380 net.cpp:774] Copying source layer score_pool4c
I1213 12:14:13.951341 1380 net.cpp:774] Copying source layer fuse_pool4
I1213 12:14:13.951341 1380 net.cpp:774] Copying source layer upscore_pool4
I1213 12:14:13.952844 1380 net.cpp:774] Copying source layer upscore_pool4_upscore_pool4_0_split
I1213 12:14:13.953343 1380 net.cpp:774] Copying source layer score_pool3
I1213 12:14:13.953842 1380 net.cpp:774] Copying source layer score_pool3c
I1213 12:14:13.953842 1380 net.cpp:774] Copying source layer fuse_pool3
I1213 12:14:13.954344 1380 net.cpp:774] Copying source layer upscore8
I1213 12:14:13.954843 1380 net.cpp:774] Copying source layer score
I1213 12:14:13.954843 1380 net.cpp:774] Copying source layer loss
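The long run of `Copying source layer` lines comes from finetuning from `fcn8s-heavy-pascal.caffemodel`: Caffe matches pretrained weights to the new net purely by layer name, so a renamed layer is silently skipped and keeps its random initialization, while parameter-free layers (ReLU, pooling, dropout, splits) copy nothing. A hypothetical Python sketch of that behavior (not the actual `net.cpp` code):

```python
# Name-based weight copying, as Caffe does when finetuning.
# Layers are dicts name -> list of parameter arrays (here plain lists).
def copy_from(target_layers, source_layers):
    copied = []
    for name, params in source_layers.items():
        if name in target_layers:
            # Shapes must also match, or Caffe aborts with a check failure.
            target_layers[name] = [p.copy() for p in params]
            copied.append(name)
    return copied

src = {"conv1_1": [[0.1, 0.2]], "score_fr_voc": [[0.3]]}   # hypothetical names
dst = {"conv1_1": [[0.0, 0.0]], "score_fr": [[0.0]]}
print(copy_from(dst, src))   # ['conv1_1'] -- the renamed layer is not copied
```

This is why, when changing the number of classes, people rename the `score_fr`/`upscore` layers: the mismatched pretrained weights are then skipped instead of causing a shape error.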
I1213 12:14:14.854532 1380 net.cpp:774] Copying source layer data
I1213 12:14:14.855533 1380 net.cpp:774] Copying source layer data_data_0_split
I1213 12:14:14.856040 1380 net.cpp:774] Copying source layer conv1_1
I1213 12:14:14.856040 1380 net.cpp:774] Copying source layer relu1_1
I1213 12:14:14.856533 1380 net.cpp:774] Copying source layer conv1_2
I1213 12:14:14.857034 1380 net.cpp:774] Copying source layer relu1_2
I1213 12:14:14.857034 1380 net.cpp:774] Copying source layer pool1
I1213 12:14:14.857533 1380 net.cpp:774] Copying source layer conv2_1
I1213 12:14:14.857533 1380 net.cpp:774] Copying source layer relu2_1
I1213 12:14:14.858036 1380 net.cpp:774] Copying source layer conv2_2
I1213 12:14:14.858536 1380 net.cpp:774] Copying source layer relu2_2
I1213 12:14:14.858536 1380 net.cpp:774] Copying source layer pool2
I1213 12:14:14.858536 1380 net.cpp:774] Copying source layer conv3_1
I1213 12:14:14.859539 1380 net.cpp:774] Copying source layer relu3_1
I1213 12:14:14.859539 1380 net.cpp:774] Copying source layer conv3_2
I1213 12:14:14.860539 1380 net.cpp:774] Copying source layer relu3_2
I1213 12:14:14.860539 1380 net.cpp:774] Copying source layer conv3_3
I1213 12:14:14.861539 1380 net.cpp:774] Copying source layer relu3_3
I1213 12:14:14.861539 1380 net.cpp:774] Copying source layer pool3
I1213 12:14:14.863039 1380 net.cpp:774] Copying source layer pool3_pool3_0_split
I1213 12:14:14.864039 1380 net.cpp:774] Copying source layer conv4_1
I1213 12:14:14.865545 1380 net.cpp:774] Copying source layer relu4_1
I1213 12:14:14.866041 1380 net.cpp:774] Copying source layer conv4_2
I1213 12:14:14.869045 1380 net.cpp:774] Copying source layer relu4_2
I1213 12:14:14.869045 1380 net.cpp:774] Copying source layer conv4_3
I1213 12:14:14.873046 1380 net.cpp:774] Copying source layer relu4_3
I1213 12:14:14.873545 1380 net.cpp:774] Copying source layer pool4
I1213 12:14:14.874047 1380 net.cpp:774] Copying source layer pool4_pool4_0_split
I1213 12:14:14.875052 1380 net.cpp:774] Copying source layer conv5_1
I1213 12:14:14.878060 1380 net.cpp:774] Copying source layer relu5_1
I1213 12:14:14.878548 1380 net.cpp:774] Copying source layer conv5_2
I1213 12:14:14.882055 1380 net.cpp:774] Copying source layer relu5_2
I1213 12:14:14.883080 1380 net.cpp:774] Copying source layer conv5_3
I1213 12:14:14.886059 1380 net.cpp:774] Copying source layer relu5_3
I1213 12:14:14.886555 1380 net.cpp:774] Copying source layer pool5
I1213 12:14:14.887054 1380 net.cpp:774] Copying source layer fc6
I1213 12:14:15.006645 1380 net.cpp:774] Copying source layer relu6
I1213 12:14:15.006645 1380 net.cpp:774] Copying source layer drop6
I1213 12:14:15.007140 1380 net.cpp:774] Copying source layer fc7
I1213 12:14:15.030658 1380 net.cpp:774] Copying source layer relu7
I1213 12:14:15.031158 1380 net.cpp:774] Copying source layer drop7
I1213 12:14:15.032158 1380 net.cpp:774] Copying source layer score_fr
I1213 12:14:15.034660 1380 net.cpp:774] Copying source layer upscore2
I1213 12:14:15.035661 1380 net.cpp:774] Copying source layer upscore2_upscore2_0_split
I1213 12:14:15.036164 1380 net.cpp:774] Copying source layer score_pool4
I1213 12:14:15.036666 1380 net.cpp:774] Copying source layer score_pool4c
I1213 12:14:15.036666 1380 net.cpp:774] Copying source layer fuse_pool4
I1213 12:14:15.037163 1380 net.cpp:774] Copying source layer upscore_pool4
I1213 12:14:15.037163 1380 net.cpp:774] Copying source layer upscore_pool4_upscore_pool4_0_split
I1213 12:14:15.038663 1380 net.cpp:774] Copying source layer score_pool3
I1213 12:14:15.038663 1380 net.cpp:774] Copying source layer score_pool3c
I1213 12:14:15.039165 1380 net.cpp:774] Copying source layer fuse_pool3
I1213 12:14:15.039664 1380 net.cpp:774] Copying source layer upscore8
I1213 12:14:15.040163 1380 net.cpp:774] Copying source layer score
I1213 12:14:15.042166 1380 net.cpp:774] Copying source layer loss
I1213 12:14:15.088698 1380 caffe.cpp:252] Starting Optimization
I1213 12:14:15.089200 1380 solver.cpp:279] Solving
I1213 12:14:15.090199 1380 solver.cpp:280] Learning Rate Policy: fixed
I1213 12:14:15.136232 1380 solver.cpp:337] Iteration 0, Testing net (#0)
I1213 12:14:26.184625 1380 solver.cpp:404] Test net output #0: accuracy = 1
I1213 12:14:26.184625 1380 solver.cpp:404] Test net output #1: loss = 5.40403 (* 1 = 5.40403 loss)
I1213 12:14:26.345742 1380 solver.cpp:228] Iteration 0, loss = 10.3912
I1213 12:14:26.346242 1380 solver.cpp:244] Train net output #0: loss = 10.3912 (* 1 = 10.3912 loss)
I1213 12:14:26.346741 1380 sgd_solver.cpp:106] Iteration 0, lr = 1e-014
I1213 12:14:36.910781 1380 solver.cpp:228] Iteration 20, loss = 2.21905
I1213 12:14:36.910781 1380 solver.cpp:244] Train net output #0: loss = 2.21906 (* 1 = 2.21906 loss)
I1213 12:14:36.911775 1380 sgd_solver.cpp:106] Iteration 20, lr = 1e-014
I1213 12:14:47.624909 1380 solver.cpp:228] Iteration 40, loss = 4.72848
I1213 12:14:47.625411 1380 solver.cpp:244] Train net output #0: loss = 4.72848 (* 1 = 4.72848 loss)
I1213 12:14:47.625911 1380 sgd_solver.cpp:106] Iteration 40, lr = 1e-014
I1213 12:14:58.325038 1380 solver.cpp:228] Iteration 60, loss = 2.64817
I1213 12:14:58.325539 1380 solver.cpp:244] Train net output #0: loss = 2.64817 (* 1 = 2.64817 loss)
I1213 12:14:58.329041 1380 sgd_solver.cpp:106] Iteration 60, lr = 1e-014
I1213 12:15:08.973148 1380 solver.cpp:228] Iteration 80, loss = 2.92758
I1213 12:15:08.973649 1380 solver.cpp:244] Train net output #0: loss = 2.92758 (* 1 = 2.92758 loss)
I1213 12:15:08.973649 1380 sgd_solver.cpp:106] Iteration 80, lr = 1e-014
I1213 12:15:19.647307 1380 solver.cpp:228] Iteration 100, loss = 2.62991
I1213 12:15:19.647807 1380 solver.cpp:244] Train net output #0: loss = 2.62992 (* 1 = 2.62992 loss)
I1213 12:15:19.648298 1380 sgd_solver.cpp:106] Iteration 100, lr = 1e-014
I1213 12:15:30.322911 1380 solver.cpp:228] Iteration 120, loss = 4.65416
I1213 12:15:30.322911 1380 solver.cpp:244] Train net output #0: loss = 4.65416 (* 1 = 4.65416 loss)
I1213 12:15:30.323411 1380 sgd_solver.cpp:106] Iteration 120, lr = 1e-014
I1213 12:15:40.963016 1380 solver.cpp:228] Iteration 140, loss = 4.19446
I1213 12:15:40.963515 1380 solver.cpp:244] Train net output #0: loss = 4.19446 (* 1 = 4.19446 loss)
I1213 12:15:40.964017 1380 sgd_solver.cpp:106] Iteration 140, lr = 1e-014
I1213 12:15:51.605111 1380 solver.cpp:228] Iteration 160, loss = 3.29427
I1213 12:15:51.605111 1380 solver.cpp:244] Train net output #0: loss = 3.29427 (* 1 = 3.29427 loss)
I1213 12:15:51.605613 1380 sgd_solver.cpp:106] Iteration 160, lr = 1e-014
I1213 12:16:02.237191 1380 solver.cpp:228] Iteration 180, loss = 5.40818
I1213 12:16:02.237690 1380 solver.cpp:244] Train net output #0: loss = 5.40819 (* 1 = 5.40819 loss)
I1213 12:16:02.237690 1380 sgd_solver.cpp:106] Iteration 180, lr = 1e-014
I1213 12:16:12.995949 1380 solver.cpp:228] Iteration 200, loss = 7.47552
I1213 12:16:12.996439 1380 solver.cpp:244] Train net output #0: loss = 7.47552 (* 1 = 7.47552 loss)
I1213 12:16:12.996948 1380 sgd_solver.cpp:106] Iteration 200, lr = 1e-014
I1213 12:16:23.705600 1380 solver.cpp:228] Iteration 220, loss = 5.60802
I1213 12:16:23.705600 1380 solver.cpp:244] Train net output #0: loss = 5.60802 (* 1 = 5.60802 loss)
I1213 12:16:23.706099 1380 sgd_solver.cpp:106] Iteration 220, lr = 1e-014
I1213 12:16:34.293746 1380 solver.cpp:228] Iteration 240, loss = 2.95836
I1213 12:16:34.294245 1380 solver.cpp:244] Train net output #0: loss = 2.95836 (* 1 = 2.95836 loss)
I1213 12:16:34.294746 1380 sgd_solver.cpp:106] Iteration 240, lr = 1e-014
I1213 12:16:44.911836 1380 solver.cpp:228] Iteration 260, loss = 3.27787
I1213 12:16:44.912346 1380 solver.cpp:244] Train net output #0: loss = 3.27787 (* 1 = 3.27787 loss)
I1213 12:16:44.912838 1380 sgd_solver.cpp:106] Iteration 260, lr = 1e-014
I1213 12:16:55.712539 1380 solver.cpp:228] Iteration 280, loss = 3.04385
I1213 12:16:55.713040 1380 solver.cpp:244] Train net output #0: loss = 3.04386 (* 1 = 3.04386 loss)
I1213 12:16:55.713040 1380 sgd_solver.cpp:106] Iteration 280, lr = 1e-014
I1213 12:17:06.361131 1380 solver.cpp:228] Iteration 300, loss = 3.75074
I1213 12:17:06.361637 1380 solver.cpp:244] Train net output #0: loss = 3.75075 (* 1 = 3.75075 loss)
I1213 12:17:06.362130 1380 sgd_solver.cpp:106] Iteration 300, lr = 1e-014
I1213 12:17:17.130873 1380 solver.cpp:228] Iteration 320, loss = 2.53425
I1213 12:17:17.131374 1380 solver.cpp:244] Train net output #0: loss = 2.53425 (* 1 = 2.53425 loss)
I1213 12:17:17.131873 1380 sgd_solver.cpp:106] Iteration 320, lr = 1e-014
I1213 12:17:26.238867 1380 solver.cpp:337] Iteration 338, Testing net (#0)
I1213 12:17:37.551930 1380 solver.cpp:404] Test net output #0: accuracy = 1
I1213 12:17:37.552439 1380 solver.cpp:404] Test net output #1: loss = 5.40402 (* 1 = 5.40402 loss)
I1213 12:17:38.718762 1380 solver.cpp:228] Iteration 340, loss = 3.97111
I1213 12:17:38.718762 1380 solver.cpp:244] Train net output #0: loss = 3.97111 (* 1 = 3.97111 loss)
I1213 12:17:38.719262 1380 sgd_solver.cpp:106] Iteration 340, lr = 1e-014
I1213 12:17:49.461891 1380 solver.cpp:228] Iteration 360, loss = 3.32952
I1213 12:17:49.462391 1380 solver.cpp:244] Train net output #0: loss = 3.32952 (* 1 = 3.32952 loss)
I1213 12:17:49.462893 1380 sgd_solver.cpp:106] Iteration 360, lr = 1e-014
I1213 12:18:00.222075 1380 solver.cpp:228] Iteration 380, loss = 5.11817
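A long log like the one above is easier to judge as a loss curve than as raw text. Below is a minimal sketch of pulling `(iteration, loss)` pairs out of the `solver.cpp:228` train-loss lines; the regex is an assumption based on the exact format printed here, so adjust it if your Caffe build logs differently.

```python
import re

# Matches the tail of Caffe train-loss lines, e.g.
#   "... solver.cpp:228] Iteration 20, loss = 2.21905"
# This pattern is inferred from the log above, not from Caffe's source.
TRAIN_RE = re.compile(r"Iteration (\d+), loss = ([\d.]+)")

def parse_train_loss(lines):
    """Return (iteration, loss) tuples from solver.cpp:228 log lines."""
    points = []
    for line in lines:
        # Only the per-iteration summary lines, not the per-output ones.
        if "solver.cpp:228" not in line:
            continue
        m = TRAIN_RE.search(line)
        if m:
            points.append((int(m.group(1)), float(m.group(2))))
    return points

# Two sample lines copied from the log above.
log = [
    "I1213 12:14:26.345742 1380 solver.cpp:228] Iteration 0, loss = 10.3912",
    "I1213 12:14:36.910781 1380 solver.cpp:228] Iteration 20, loss = 2.21905",
]
print(parse_train_loss(log))  # [(0, 10.3912), (20, 2.21905)]
```

The resulting list can be fed straight into a plotting library to see whether the loss is actually decreasing; note that with `lr = 1e-014` as in this run, the loss fluctuates rather than converges, which the curve makes obvious at a glance.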