How does Caffe store loss_weight in its loss layers?

If a network contains multiple loss layers, each loss layer can be given a loss_weight parameter; if it is omitted, it defaults to 1.0.
But where is loss_weight actually stored?
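For context, loss_weight is set in the layer's prototxt definition. A sketch (the layer and blob names here are made up for illustration):

```protobuf
layer {
  name: "loss1"
  type: "SoftmaxWithLoss"
  bottom: "fc1"
  bottom: "label"
  top: "loss1"
  loss_weight: 0.4   # omitted => defaults to 1.0 for a loss layer
}
```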

I first noticed the answer in ContrastiveLossLayer::Backward_cpu:

const Dtype sign = (i == 0) ? 1 : -1;
const Dtype alpha = sign * top[0]->cpu_diff()[0] /
      static_cast<Dtype>(bottom[i]->num());

Here top[0]->cpu_diff()[0] holds exactly this layer's loss_weight.

During training, the call sequence is as follows:

(figure: training-time call stack; the original image is not preserved)

In layer.hpp, the base class of all layers, SetUp performs the following steps:

void SetUp(const vector<Blob<Dtype>*>& bottom,
      const vector<Blob<Dtype>*>& top) {
    InitMutex();
    CheckBlobCounts(bottom, top);
    LayerSetUp(bottom, top);
    Reshape(bottom, top);
    SetLossWeights(top);
  }

After LayerSetUp and Reshape finish their initialization, SetUp calls SetLossWeights, in which caffe_set(count, loss_weight, loss_multiplier); writes loss_weight into every element of top[0]->cpu_diff():

/**
 * Called by SetUp to initialize the weights associated with any top blobs in
 * the loss function. Store non-zero loss weights in the diff blob.
 */
inline void SetLossWeights(const vector<Blob<Dtype>*>& top) {
  const int num_loss_weights = layer_param_.loss_weight_size();
  if (num_loss_weights) {
    CHECK_EQ(top.size(), num_loss_weights) << "loss_weight must be "
        "unspecified or specified once per top blob.";
    for (int top_id = 0; top_id < top.size(); ++top_id) {
      const Dtype loss_weight = layer_param_.loss_weight(top_id);
      if (loss_weight == Dtype(0)) { continue; }
      this->set_loss(top_id, loss_weight);
      const int count = top[top_id]->count();
      Dtype* loss_multiplier = top[top_id]->mutable_cpu_diff();
      caffe_set(count, loss_weight, loss_multiplier);
    }
  }
}

From const Dtype loss_weight = layer_param_.loss_weight(top_id); we can also see that loss_weight is read directly from layer_param_, i.e. from the layer's prototxt definition.
