Using Caffe from MATLAB: An Example

Environment: Ubuntu 12.04, MATLAB 2013b

1. First, set the MATLAB_DIR entry in Makefile.config as follows:

MATLAB_DIR := /u01/MATLAB/R2013b
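If you prefer to script this edit, the same change can be made with sed. A minimal sketch, shown here against a sample line rather than the real file; note that /u01/MATLAB/R2013b is this machine's MATLAB root, so substitute your own install path:

```shell
# Sketch: the substitution that sed -i would apply to Makefile.config.
# /u01/MATLAB/R2013b is this machine's MATLAB root; substitute your own.
sed 's|^MATLAB_DIR :=.*|MATLAB_DIR := /u01/MATLAB/R2013b|' <<'EOF'
MATLAB_DIR := /usr/local/MATLAB/R2012a
EOF
# To edit the real file in place, from the caffe source root:
#   sed -i 's|^MATLAB_DIR :=.*|MATLAB_DIR := /u01/MATLAB/R2013b|' Makefile.config
```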

2. Build Caffe's MATLAB interface:

make matcaffe

3. Change to /u01/caffe/examples/imagenet and run ./get_caffe_reference_imagenet_model.sh to download the pretrained reference model.

4. Change to /u01/caffe/matlab/caffe and run the MATLAB demo that calls Caffe:

matlab -nodisplay

>> run('matcaffe_demo.m')

......

layers {
  bottom: "conv4"
  top: "conv4"
  name: "relu4"
  type: RELU
}
layers {
  bottom: "conv4"
  top: "conv5"
  name: "conv5"
  type: CONVOLUTION
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layers {
  bottom: "conv5"
  top: "conv5"
  name: "relu5"
  type: RELU
}
layers {
  bottom: "conv5"
  top: "pool5"
  name: "pool5"
  type: POOLING
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layers {
  bottom: "pool5"
  top: "fc6"
  name: "fc6"
  type: INNER_PRODUCT
  inner_product_param {
    num_output: 4096
  }
}
layers {
  bottom: "fc6"
  top: "fc6"
  name: "relu6"
  type: RELU
}
layers {
  bottom: "fc6"
  top: "fc6"
  name: "drop6"
  type: DROPOUT
  dropout_param {
    dropout_ratio: 0.5
  }
}
layers {
  bottom: "fc6"
  top: "fc7"
  name: "fc7"
  type: INNER_PRODUCT
  inner_product_param {
    num_output: 4096
  }
}
layers {
  bottom: "fc7"
  top: "fc7"
  name: "relu7"
  type: RELU
}
layers {
  bottom: "fc7"
  top: "fc7"
  name: "drop7"
  type: DROPOUT
  dropout_param {
    dropout_ratio: 0.5
  }
}
layers {
  bottom: "fc7"
  top: "fc8"
  name: "fc8"
  type: INNER_PRODUCT
  inner_product_param {
    num_output: 1000
  }
}
layers {
  bottom: "fc8"
  top: "prob"
  name: "prob"
  type: SOFTMAX
}
input: "data"
input_dim: 10
input_dim: 3
input_dim: 227
input_dim: 227
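The four input_dim values above define a 10 × 3 × 227 × 227 input blob: ten 227×227 RGB crops per forward pass (the demo's oversampling batch). As a quick sanity check of the element count:

```shell
# Input blob element count: num * channels * height * width
echo $((10 * 3 * 227 * 227))   # prints 1545870
```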

I0912 18:22:26.956653 11968 net.cpp:292] Input 0 -> data
I0912 18:22:26.956778 11968 net.cpp:66] Creating Layer conv1
I0912 18:22:26.956809 11968 net.cpp:329] conv1
I0912 18:22:26.956889 11968 net.cpp:290] conv1 -> conv1
I0912 18:22:26.957068 11968 net.cpp:83] Top shape: 10 96 55 55 (2904000)
I0912 18:22:26.957139 11968 net.cpp:125] conv1 needs backward computation.
I0912 18:22:26.957207 11968 net.cpp:66] Creating Layer relu1
I0912 18:22:26.957243 11968 net.cpp:329] relu1
I0912 18:22:26.957279 11968 net.cpp:280] relu1 -> conv1 (in-place)
I0912 18:22:26.957347 11968 net.cpp:83] Top shape: 10 96 55 55 (2904000)
I0912 18:22:26.957382 11968 net.cpp:125] relu1 needs backward computation.
I0912 18:22:26.957422 11968 net.cpp:66] Creating Layer pool1
I0912 18:22:26.957458 11968 net.cpp:329] pool1
I0912 18:22:26.957496 11968 net.cpp:290] pool1 -> pool1
I0912 18:22:26.957548 11968 net.cpp:83] Top shape: 10 96 27 27 (699840)
I0912 18:22:26.957583 11968 net.cpp:125] pool1 needs backward computation.
I0912 18:22:26.957619 11968 net.cpp:66] Creating Layer norm1
I0912 18:22:26.957681 11968 net.cpp:329] norm1
I0912 18:22:26.957728 11968 net.cpp:290] norm1 -> norm1
I0912 18:22:26.957774 11968 net.cpp:83] Top shape: 10 96 27 27 (699840)
I0912 18:22:26.957809 11968 net.cpp:125] norm1 needs backward computation.
I0912 18:22:26.958052 11968 net.cpp:66] Creating Layer conv2
I0912 18:22:26.958092 11968 net.cpp:329] conv2
I0912 18:22:26.960306 11968 net.cpp:290] conv2 -> conv2
I0912 18:22:26.961231 11968 net.cpp:83] Top shape: 10 256 27 27 (1866240)
I0912 18:22:26.961369 11968 net.cpp:125] conv2 needs backward computation.
I0912 18:22:26.961398 11968 net.cpp:66] Creating Layer relu2
I0912 18:22:26.961436 11968 net.cpp:329] relu2
I0912 18:22:26.961468 11968 net.cpp:280] relu2 -> conv2 (in-place)
I0912 18:22:26.961496 11968 net.cpp:83] Top shape: 10 256 27 27 (1866240)
I0912 18:22:26.961516 11968 net.cpp:125] relu2 needs backward computation.
I0912 18:22:26.961539 11968 net.cpp:66] Creating Layer pool2
I0912 18:22:26.961593 11968 net.cpp:329] pool2
I0912 18:22:26.961629 11968 net.cpp:290] pool2 -> pool2
I0912 18:22:26.961676 11968 net.cpp:83] Top shape: 10 256 13 13 (432640)
I0912 18:22:26.961710 11968 net.cpp:125] pool2 needs backward computation.
I0912 18:22:26.961805 11968 net.cpp:66] Creating Layer norm2
I0912 18:22:26.961841 11968 net.cpp:329] norm2
I0912 18:22:26.961875 11968 net.cpp:290] norm2 -> norm2
I0912 18:22:26.961913 11968 net.cpp:83] Top shape: 10 256 13 13 (432640)
I0912 18:22:26.961969 11968 net.cpp:125] norm2 needs backward computation.
I0912 18:22:26.962023 11968 net.cpp:66] Creating Layer conv3
I0912 18:22:26.962059 11968 net.cpp:329] conv3
I0912 18:22:26.962096 11968 net.cpp:290] conv3 -> conv3
I0912 18:22:26.965011 11968 net.cpp:83] Top shape: 10 384 13 13 (648960)
I0912 18:22:26.965140 11968 net.cpp:125] conv3 needs backward computation.
I0912 18:22:26.965181 11968 net.cpp:66] Creating Layer relu3
I0912 18:22:26.965258 11968 net.cpp:329] relu3
I0912 18:22:26.965299 11968 net.cpp:280] relu3 -> conv3 (in-place)
I0912 18:22:26.965338 11968 net.cpp:83] Top shape: 10 384 13 13 (648960)
I0912 18:22:26.965479 11968 net.cpp:125] relu3 needs backward computation.
I0912 18:22:26.965520 11968 net.cpp:66] Creating Layer conv4
I0912 18:22:26.965555 11968 net.cpp:329] conv4
I0912 18:22:26.965634 11968 net.cpp:290] conv4 -> conv4
I0912 18:22:26.968613 11968 net.cpp:83] Top shape: 10 384 13 13 (648960)
I0912 18:22:26.968745 11968 net.cpp:125] conv4 needs backward computation.
I0912 18:22:26.968781 11968 net.cpp:66] Creating Layer relu4
I0912 18:22:26.968819 11968 net.cpp:329] relu4
I0912 18:22:26.968873 11968 net.cpp:280] relu4 -> conv4 (in-place)
I0912 18:22:26.968919 11968 net.cpp:83] Top shape: 10 384 13 13 (648960)
I0912 18:22:26.968992 11968 net.cpp:125] relu4 needs backward computation.
I0912 18:22:26.969028 11968 net.cpp:66] Creating Layer conv5
I0912 18:22:26.969066 11968 net.cpp:329] conv5
I0912 18:22:26.969108 11968 net.cpp:290] conv5 -> conv5
I0912 18:22:26.970634 11968 net.cpp:83] Top shape: 10 256 13 13 (432640)
I0912 18:22:26.970749 11968 net.cpp:125] conv5 needs backward computation.
I0912 18:22:26.970780 11968 net.cpp:66] Creating Layer relu5
I0912 18:22:26.970803 11968 net.cpp:329] relu5
I0912 18:22:26.970827 11968 net.cpp:280] relu5 -> conv5 (in-place)
I0912 18:22:26.970918 11968 net.cpp:83] Top shape: 10 256 13 13 (432640)
I0912 18:22:26.970952 11968 net.cpp:125] relu5 needs backward computation.
I0912 18:22:26.970988 11968 net.cpp:66] Creating Layer pool5
I0912 18:22:26.971233 11968 net.cpp:329] pool5
I0912 18:22:26.971282 11968 net.cpp:290] pool5 -> pool5
I0912 18:22:26.971361 11968 net.cpp:83] Top shape: 10 256 6 6 (92160)
I0912 18:22:26.971397 11968 net.cpp:125] pool5 needs backward computation.
I0912 18:22:26.971434 11968 net.cpp:66] Creating Layer fc6
I0912 18:22:26.971470 11968 net.cpp:329] fc6
I0912 18:22:26.971559 11968 net.cpp:290] fc6 -> fc6
I0912 18:22:27.069502 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.069640 11968 net.cpp:125] fc6 needs backward computation.
I0912 18:22:27.069672 11968 net.cpp:66] Creating Layer relu6
I0912 18:22:27.069694 11968 net.cpp:329] relu6
I0912 18:22:27.069718 11968 net.cpp:280] relu6 -> fc6 (in-place)
I0912 18:22:27.069743 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.069763 11968 net.cpp:125] relu6 needs backward computation.
I0912 18:22:27.069792 11968 net.cpp:66] Creating Layer drop6
I0912 18:22:27.069824 11968 net.cpp:329] drop6
I0912 18:22:27.069875 11968 net.cpp:280] drop6 -> fc6 (in-place)
I0912 18:22:27.069954 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.069990 11968 net.cpp:125] drop6 needs backward computation.
I0912 18:22:27.070144 11968 net.cpp:66] Creating Layer fc7
I0912 18:22:27.070173 11968 net.cpp:329] fc7
I0912 18:22:27.070199 11968 net.cpp:290] fc7 -> fc7
I0912 18:22:27.111870 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.111963 11968 net.cpp:125] fc7 needs backward computation.
I0912 18:22:27.111991 11968 net.cpp:66] Creating Layer relu7
I0912 18:22:27.112015 11968 net.cpp:329] relu7
I0912 18:22:27.112040 11968 net.cpp:280] relu7 -> fc7 (in-place)
I0912 18:22:27.112068 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.112139 11968 net.cpp:125] relu7 needs backward computation.
I0912 18:22:27.112164 11968 net.cpp:66] Creating Layer drop7
I0912 18:22:27.112184 11968 net.cpp:329] drop7
I0912 18:22:27.112213 11968 net.cpp:280] drop7 -> fc7 (in-place)
I0912 18:22:27.112242 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.112263 11968 net.cpp:125] drop7 needs backward computation.
I0912 18:22:27.112285 11968 net.cpp:66] Creating Layer fc8
I0912 18:22:27.112305 11968 net.cpp:329] fc8
I0912 18:22:27.112334 11968 net.cpp:290] fc8 -> fc8
I0912 18:22:27.122274 11968 net.cpp:83] Top shape: 10 1000 1 1 (10000)
I0912 18:22:27.122380 11968 net.cpp:125] fc8 needs backward computation.
I0912 18:22:27.122421 11968 net.cpp:66] Creating Layer prob
I0912 18:22:27.122503 11968 net.cpp:329] prob
I0912 18:22:27.122547 11968 net.cpp:290] prob -> prob
I0912 18:22:27.122660 11968 net.cpp:83] Top shape: 10 1000 1 1 (10000)
I0912 18:22:27.122688 11968 net.cpp:125] prob needs backward computation.
I0912 18:22:27.122706 11968 net.cpp:156] This network produces output prob
I0912 18:22:27.122745 11968 net.cpp:402] Collecting Learning Rate and Weight Decay.
I0912 18:22:27.122769 11968 net.cpp:167] Network initialization done.
I0912 18:22:27.122788 11968 net.cpp:168] Memory required for data: 6183480

Done with init
Using CPU Mode
Done with set_mode
Done with set_phase_test
Elapsed time is 0.579487 seconds.
Elapsed time is 3.748376 seconds.

ans =

     1           1        1000          10
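The demo's return value is reported in matcaffe's width × height × channels × num layout, so 1 × 1 × 1000 × 10 is one 1000-class score vector for each of the 10 input crops: the same 10000 elements the log reports for the prob blob, with the dimensions reversed relative to Caffe's internal num × channels × height × width order. A quick check that the two layouts describe the same element count:

```shell
# prob blob in the log: "Top shape: 10 1000 1 1 (10000)"  (num c h w)
# matcaffe output size:  1 1 1000 10                      (w h c num)
echo $((10 * 1000 * 1 * 1)) $((1 * 1 * 1000 * 10))   # prints 10000 10000
```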

From the ITPUB blog: http://blog.itpub.net/16582684/viewspace-1268749/. Please credit the source when reposting; otherwise legal liability may be pursued.
