Caffe code walkthrough - MNIST example - backward computation

There is a data hand-off point between the forward and backward passes: how does the loss produced by the forward computation get passed to the backward computation?

Net contains the following member variables:

  /// bottom_vecs stores the vectors containing the input for each layer.
  /// They don't actually host the blobs (blobs_ does), so we simply store
  /// pointers.
  vector<vector<Blob<Dtype>*> > bottom_vecs_;
  vector<vector<int> > bottom_id_vecs_;
  vector<vector<bool> > bottom_need_backward_;
  /// top_vecs stores the vectors containing the output for each layer
  vector<vector<Blob<Dtype>*> > top_vecs_;
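
These per-layer vectors only alias blobs that blobs_ owns: the same Blob object that appears in layer i's top_vecs_[i] is handed to layer i+1 as part of bottom_vecs_[i+1]. A conceptual sketch of the wiring set up in Net::Init (the snippet below is illustrative, not the actual Caffe code):

// Illustrative only: one shared Blob serves as layer i's output and
// layer i+1's input. Its data carries activations forward; its diff
// carries gradients backward through the very same memory.
Blob<Dtype>* shared = blobs_[blob_id].get();
top_vecs_[i].push_back(shared);         // layer i writes data, reads diff
bottom_vecs_[i + 1].push_back(shared);  // layer i+1 reads data, writes diff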

The call stack when Backward runs:

(lldb) bt
* thread #1: tid = 0xdba216, 0x00000001002ae785 libcaffe.so.1.0.0-rc3`caffe::SoftmaxWithLossLayer<float>::Backward_cpu(this=0x0000000101717cb0, top=0x0000000101615070, propagate_down=0x0000000101615210, bottom=0x00000001017009d0) + 53 at softmax_loss_layer.cpp:120, queue = 'com.apple.main-thread', stop reason = breakpoint 15.1
  * frame #0: 0x00000001002ae785 libcaffe.so.1.0.0-rc3`caffe::SoftmaxWithLossLayer<float>::Backward_cpu(this=0x0000000101717cb0, top=0x0000000101615070, propagate_down=0x0000000101615210, bottom=0x00000001017009d0) + 53 at softmax_loss_layer.cpp:120
    frame #1: 0x000000010019f1f3 libcaffe.so.1.0.0-rc3`caffe::Layer<float>::Backward(this=0x0000000101717cb0, top=0x0000000101615070, propagate_down=0x0000000101615210, bottom=0x00000001017009d0) + 131 at layer.hpp:495
    frame #2: 0x00000001002f32db libcaffe.so.1.0.0-rc3`caffe::Net<float>::BackwardFromTo(this=0x0000000101625a50, start=2, end=0) + 1003 at net.cpp:595
    frame #3: 0x00000001002f2b94 libcaffe.so.1.0.0-rc3`caffe::Net<float>::Backward(this=0x0000000101625a50) + 68 at net.cpp:725
    frame #4: 0x00000001002f3d4a libcaffe.so.1.0.0-rc3`caffe::Net<float>::ForwardBackward(this=0x0000000101625a50) + 42 at net.hpp:89
    frame #5: 0x00000001003356eb libcaffe.so.1.0.0-rc3`caffe::Solver<float>::Step(this=0x00000001017003f0, iters=10000) + 1179 at solver.cpp:222
    frame #6: 0x0000000100334a31 libcaffe.so.1.0.0-rc3`caffe::Solver<float>::Solve(this=0x00000001017003f0, resume_file=0x0000000000000000) + 881 at solver.cpp:293
    frame #7: 0x0000000100004df8 caffe.bin`train() + 6840 at caffe.cpp:252
    frame #8: 0x0000000100009038 caffe.bin`main(argc=2, argv=0x00007fff5fbffb08) + 600 at caffe.cpp:443
    frame #9: 0x00007fff8d0585ad libdyld.dylib`start + 1
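
Frame #2 is where those member vectors come into play. A simplified sketch of Net<Dtype>::BackwardFromTo from net.cpp (checks and debug output stripped):

// Simplified: walk the layers in reverse, handing each its cached
// top/bottom blob vectors. Note that no loss scalar appears here.
template <typename Dtype>
void Net<Dtype>::BackwardFromTo(int start, int end) {
  for (int i = start; i >= end; --i) {
    if (layer_need_backward_[i]) {
      layers_[i]->Backward(
          top_vecs_[i], bottom_need_backward_[i], bottom_vecs_[i]);
    }
  }
}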


In net.hpp:

  Dtype ForwardBackward() {
    Dtype loss;
    Forward(&loss);
    Backward();
    return loss;
  }

Why isn't the loss passed to Backward()?

top_vecs_ isn't passed as an argument either; it doesn't need to be, since it is a member of Net, and BackwardFromTo (frame #2 above) reads top_vecs_[i] and bottom_vecs_[i] directly.
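
The actual hand-off hides in the loss layer's own top blob. During setup, Layer<Dtype>::SetLossWeights (layer.hpp) fills that blob's diff with the configured loss_weight (1 by default for a loss layer); this constant is the seed gradient that Backward_cpu later reads back as top[0]->cpu_diff()[0]. A simplified sketch (error checks omitted):

// Simplified from layer.hpp: pre-fill the diff of each loss-producing
// top blob with its loss weight. The loss *value* is never stored here.
template <typename Dtype>
inline void Layer<Dtype>::SetLossWeights(const vector<Blob<Dtype>*>& top) {
  const int num_loss_weights = layer_param_.loss_weight_size();
  for (int top_id = 0; top_id < num_loss_weights; ++top_id) {
    const Dtype loss_weight = layer_param_.loss_weight(top_id);
    if (loss_weight == Dtype(0)) { continue; }
    this->set_loss(top_id, loss_weight);
    Dtype* loss_multiplier = top[top_id]->mutable_cpu_diff();
    caffe_set(top[top_id]->count(), loss_weight, loss_multiplier);
  }
}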

Reference:

http://deeplearning.stanford.edu/wiki/index.php/Softmax回归

From the derivation there, it is clear that the diff does not actually depend on the loss value, so the loss does not need to be passed.
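
To make this concrete, here is the standard softmax regression derivation (as in the Stanford wiki above), with logits z, probabilities p, and true label y:

\[
p_k = \frac{e^{z_k}}{\sum_j e^{z_j}}, \qquad
L = -\log p_y, \qquad
\frac{\partial L}{\partial z_k} = p_k - \mathbf{1}\{k = y\}
\]

The gradient uses only the probabilities p_k (cached in prob_ by the forward pass) and the label; the scalar L never appears in it.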

See also https://www.zhihu.com/question/28927103: for the SoftmaxWithLoss layer, the backward computation does not involve the parameters θ, only the input z (the logits feeding the softmax).
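
That is exactly what frame #0 computes. A simplified sketch of SoftmaxWithLossLayer<Dtype>::Backward_cpu from softmax_loss_layer.cpp (ignore-label handling and the normalization helper omitted):

// Simplified: diff = p, then subtract 1 at the true class, giving
// p_k - 1{k = y}; finally scale by the seed gradient stored in the
// top blob's diff (the loss weight), not by the loss value.
template <typename Dtype>
void SoftmaxWithLossLayer<Dtype>::Backward_cpu(
    const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down,
    const vector<Blob<Dtype>*>& bottom) {
  if (propagate_down[0]) {
    Dtype* bottom_diff = bottom[0]->mutable_cpu_diff();
    const Dtype* prob_data = prob_.cpu_data();  // p, cached by Forward
    caffe_copy(prob_.count(), prob_data, bottom_diff);
    const Dtype* label = bottom[1]->cpu_data();
    int dim = prob_.count() / outer_num_;
    for (int i = 0; i < outer_num_; ++i) {
      for (int j = 0; j < inner_num_; ++j) {
        const int label_value = static_cast<int>(label[i * inner_num_ + j]);
        bottom_diff[i * dim + label_value * inner_num_ + j] -= 1;
      }
    }
    // Seed gradient: the loss weight that SetLossWeights wrote into the
    // top blob's diff (normalization simplified to the batch size here).
    Dtype loss_weight = top[0]->cpu_diff()[0] / outer_num_;
    caffe_scal(prob_.count(), loss_weight, bottom_diff);
  }
}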

