Modifying the LeNet-5 Network

In the previous post, we drew the LeNet-5 model for MNIST: http://blog.csdn.net/ture_dream/article/details/53007803

Today we make a small modification to that model and turn it into a logistic regression (LR) classifier.

1. Copy examples/mnist/lenet_train_test.prototxt, rename it to lenet_lr.prototxt, and modify it as follows:

name: "LeNet"
layer {
  name: "mnist"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "examples/mnist/mnist_train_lmdb"
    batch_size: 64
    backend: LMDB
  }
}
layer {
  name: "mnist"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "examples/mnist/mnist_test_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "ip"
  type: "InnerProduct"
  bottom: "data"
  top: "ip"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "ip"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip"
  bottom: "label"
  top: "loss"
}

That is, the two convolutional layers (and their pooling layers) are removed and only a single fully connected layer is kept, mapping the input pixels directly to the 10 class scores.
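To see why this is exactly (multinomial) logistic regression: the InnerProduct layer computes W x + b over the flattened pixels, and SoftmaxWithLoss turns those 10 scores into class probabilities and a cross-entropy loss. A minimal numpy sketch of the forward computation (illustration only, not Caffe code; the random weights are placeholders):

import numpy as np

# Illustration only (not Caffe code): what the simplified net computes.
def lr_forward(W, b, x):
    # W: (10, 784), b: (10,), x: flattened 28x28 image already scaled by 1/256
    z = W.dot(x) + b              # the "ip" InnerProduct layer
    e = np.exp(z - z.max())       # softmax, numerically stabilized
    return e / e.sum()            # 10 class probabilities

# Placeholder weights and input, just to show the shapes involved.
W = 0.01 * np.random.randn(10, 784)
b = np.zeros(10)
x = np.random.rand(784) * 0.00390625
print(lr_forward(W, b, x).sum())  # -> 1.0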

2. Copy examples/mnist/lenet_solver.prototxt, rename it to lenet_lr_solver.prototxt, and modify it as follows:

# The train/test net protocol buffer definition
net: "examples/mnist/lenet_lr.prototxt"
# test_iter specifies how many forward passes the test should carry out.
# In the case of MNIST, we have test batch size 100 and 100 test iterations,
# covering the full 10,000 testing images.
test_iter: 100
# Carry out testing every 500 training iterations.
test_interval: 500
# The base learning rate, momentum and the weight decay of the network.
base_lr: 0.01
momentum: 0.9
weight_decay: 0.0005
# The learning rate policy
lr_policy: "inv"
gamma: 0.0001
power: 0.75
# Display every 100 iterations
display: 100
# The maximum number of iterations
max_iter: 10000
# snapshot intermediate results
snapshot: 5000
snapshot_prefix: "examples/mnist/lenet"
# solver mode: CPU or GPU
solver_mode: GPU
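One detail worth noting: with lr_policy "inv", Caffe decays the learning rate as base_lr * (1 + gamma * iter)^(-power). A quick Python sketch of the schedule defined by the settings above:

# Learning-rate schedule implied by the solver settings above:
# lr(iter) = base_lr * (1 + gamma * iter) ** (-power)
base_lr, gamma, power = 0.01, 0.0001, 0.75
for it in (0, 1000, 5000, 10000):
    print(it, base_lr * (1 + gamma * it) ** -power)
# roughly 0.0100, 0.0093, 0.0074, 0.0059 over the 10,000 iterations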

3. Train the model: from the Caffe root directory, enter at the command line:

./build/tools/caffe train --solver=examples/mnist/lenet_lr_solver.prototxt
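The same training can also be driven from the Python interface instead of the caffe binary; a minimal sketch, assuming pycaffe is built and the script is run from the Caffe root:

import caffe

caffe.set_mode_gpu()   # or caffe.set_mode_cpu()
solver = caffe.SGDSolver('examples/mnist/lenet_lr_solver.prototxt')
solver.solve()         # runs the full max_iter schedule from the solver file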



Do you remember the accuracy of the original LeNet-5? If not, no problem: the data is still there, so run it again:

./build/tools/caffe train --solver=examples/mnist/lenet_solver.prototxt



Both models are trained for 10,000 iterations; the accuracy drops from about 0.99 to about 0.92.

This is because the simplified model has fewer parameters and fewer layers, so its representational capacity is weaker.
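A rough parameter count makes this concrete (my own back-of-the-envelope numbers based on the standard Caffe LeNet definition, not figures from the training logs):

# Back-of-the-envelope parameter counts (weights + biases).
lr_params = 784 * 10 + 10          # single InnerProduct: 7,850

lenet_params = (
    (20 * 1 * 5 * 5 + 20)          # conv1: 20 filters of 1x5x5
    + (50 * 20 * 5 * 5 + 50)       # conv2: 50 filters of 20x5x5
    + (500 * 50 * 4 * 4 + 500)     # ip1: conv2+pool2 output is 50x4x4 = 800
    + (10 * 500 + 10)              # ip2: 500 -> 10
)                                  # roughly 431,000 in total

print(lr_params, lenet_params)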

4. Structure diagrams.

The previous structure diagram is in the post linked above. To draw the diagram for the new network, enter at the command line:

sudo python ./python/draw_net.py ./examples/mnist/lenet_lr.prototxt ./examples/mnist/lenet_lr.jpg

The resulting diagram (examples/mnist/lenet_lr.jpg) has the same overall structure as before, just with fewer layers.



Along the way, because I had not updated the ip1 layer, the drawing came out with layers left dangling. That is clearly wrong: a Caffe Net must be a DAG (directed acyclic graph), with every layer's bottom blob produced as the top of some earlier layer.
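A quick way to catch this kind of wiring mistake before drawing anything is simply to try to construct the net in pycaffe: if a bottom name (for example a leftover "ip1") is not produced by any layer, Caffe refuses to build the net and reports the unknown bottom blob instead of silently producing a disconnected graph. A minimal sketch, assuming pycaffe is built and the paths above:

import caffe

caffe.set_mode_cpu()
# Construction fails if any layer's bottom blob is not produced by another layer.
net = caffe.Net('examples/mnist/lenet_lr.prototxt', caffe.TEST)
print(list(net.blobs.keys()))   # for the corrected net: data, label, ip, ...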
