Caffe training flow


1. caffe.cpp

2. caffe.cpp

train()

3. caffe.cpp

shared_ptr<caffe::Solver<float> >
      solver(caffe::SolverRegistry<float>::CreateSolver(solver_param));

4. solver.cpp

Solver<Dtype>::Solver(const SolverParameter& param, const Solver* root_solver)
    : net_(), callbacks_(), root_solver_(root_solver),
      requested_early_exit_(false) {
  Init(param);
}



The constructor calls Solver::Init(). Inside Init(), two calls are worth noting: InitTrainNet() and InitTestNets(), which initialize the training network and the test networks respectively.

template <typename Dtype>
void Solver<Dtype>::Init(const SolverParameter& param) {
  CHECK(Caffe::root_solver() || root_solver_)
      << "root_solver_ needs to be set for all non-root solvers";
  LOG_IF(INFO, Caffe::root_solver()) << "Initializing solver from parameters: "
    << std::endl << param.DebugString();
  param_ = param;
  CHECK_GE(param_.average_loss(), 1) << "average_loss should be non-negative.";
  CheckSnapshotWritePermissions();
  if (Caffe::root_solver() && param_.random_seed() >= 0) {
    Caffe::set_random_seed(param_.random_seed());
  }
  // Scaffolding code
  InitTrainNet();
  if (Caffe::root_solver()) {
    InitTestNets();
    LOG(INFO) << "Solver scaffolding done.";
  }
  iter_ = 0;
  current_step_ = 0;
}




template <typename Dtype>
void Solver<Dtype>::InitTrainNet() {
  ...
  else if (param_.has_train_net()) {
    LOG_IF(INFO, Caffe::root_solver())
        << "Creating training net from train_net file: " << param_.train_net();
    ReadNetParamsFromTextFileOrDie(param_.train_net(), &net_param);
  }
  ...
}



First, ReadNetParamsFromTextFileOrDie(param_.train_net(), &net_param) reads the network definition from the file named by param_.train_net() (here, examples/mnist/lenet_train_test.prototxt) into net_param.
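For reference, a train_net file is a NetParameter message in protobuf text format. A minimal, illustrative fragment (not the full lenet_train_test.prototxt) looks like:

```protobuf
name: "LeNet"
layer {
  name: "mnist"
  type: "Data"
  top: "data"
  top: "label"
  data_param {
    source: "examples/mnist/mnist_train_lmdb"
    batch_size: 64
    backend: LMDB
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "data"
  top: "ip1"
  inner_product_param { num_output: 500 }
}
```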

Next, net_.reset(new Net<Dtype>(net_param)) builds the network by invoking the Net constructor:

if (Caffe::root_solver()) {
    net_.reset(new Net<Dtype>(net_param));
  } else {
    net_.reset(new Net<Dtype>(net_param, root_solver_->net_.get()));
  }


5. net.cpp

The constructor then runs Net::Init(), which actually builds the network. Its main code is as follows:

template <typename Dtype>
void Net<Dtype>::Init(const NetParameter& in_param) {
  ...
  for (int layer_id = 0; layer_id < param.layer_size(); ++layer_id) {
    // Setup layer.
    const LayerParameter& layer_param = param.layer(layer_id);

    // The layer itself is created here
    layers_.push_back(LayerRegistry<Dtype>::CreateLayer(layer_param));

    // Figure out this layer's input and output
    for (int bottom_id = 0; bottom_id < layer_param.bottom_size(); ++bottom_id) {
      const int blob_id = AppendBottom(param, layer_id, bottom_id,
                                       &available_blobs, &blob_name_to_idx);
      // If a blob needs backward, this layer should provide it.
      need_backward |= blob_need_backward_[blob_id];
    }
    int num_top = layer_param.top_size();
    for (int top_id = 0; top_id < num_top; ++top_id) {
      AppendTop(param, layer_id, top_id, &available_blobs, &blob_name_to_idx);
    }
    ...

    // The layer is configured here
    layers_[layer_id]->SetUp(bottom_vecs_[layer_id], top_vecs_[layer_id]);
    ...

    for (int param_id = 0; param_id < num_param_blobs; ++param_id) {
      AppendParam(param, layer_id, param_id);
    }
    ...
  }
  ...
}




Notes:

  1. LeNet-5 has 9 layers in Caffe, i.e. param.layer_size() == 9; each iteration of the for loop above creates one layer.
  2. Each layer is created via LayerRegistry<Dtype>::CreateLayer(), analogous to how the Solver is created.
  3. Net::AppendBottom(): for layer layer_id, takes existing blobs out of Net::blobs_ and puts them into that layer's bottom_vecs_[layer_id].
  4. Net::AppendTop(): for layer layer_id, creates a blob (not yet holding data) and puts it into Net::blobs_.
  5. AppendParam() binds each layer's trainable parameters to the net-level learnable_params_. In LeNet, only the conv1, conv2, ip1 and ip2 layers have parameters, and each of these has a weight blob and a bias blob, so learnable_params_ has size 8.


