TensorRT softmax layer


Today I ran into an issue with a classification problem.


If you attach a softmax directly after the fc layer, the softmax is computed globally over the whole tensor.

You need a reshape first. For a 2-class problem, reshape the output to something like [1, 1, 2], and then apply softmax along axis 2.

After searching around, the fix is to first use an IShuffleLayer to change the dimensions:

IShuffleLayer *shuffleLayer1 = network->addShuffle(input);
assert(shuffleLayer1);
shuffleLayer1->setReshapeDimensions(Dims3(1, -1, c));  // -1 lets TensorRT infer that dimension

Then add the softmax layer:

ISoftMaxLayer *softmax = network->addSoftMax(*shuffleLayer1->getOutput(0));
assert(softmax);
softmax->setAxes(1<<2);

Pay special attention to softmax->setAxes(1<<2); here.

If you follow the PyTorch convention and write softmax->setAxes(2); directly, every output comes back as 1, which is not what we expect.
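
To spell out the difference (a minimal sketch reusing the softmax layer from above):

// Wrong: the literal 2 equals (1 << 1), so it selects axis 1, which has
// size 1 after our reshape -- softmax over a single element is always 1.
softmax->setAxes(2);

// Right: set bit 2 to select axis 2, the class dimension of size c.
softmax->setAxes(1 << 2);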

Take a look at the comment in the official header:

//!
    //! \brief Set the axis along which softmax is computed. Currently, only one axis can be set.
    //!
    //! The axis is specified by setting the bit corresponding to the axis to 1.
    //! Let's say we have an NCHW tensor as input (three non-batch dimensions).
    //!
    //! In implicit mode :
    //! Bit 0 corresponds to the C dimension boolean.
    //! Bit 1 corresponds to the H dimension boolean.
    //! Bit 2 corresponds to the W dimension boolean.
    //! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
    //! there are fewer than 3 non-batch axes. For example, if the input is NCHW, the default axis is C. If the input
    //! is NHW, then the default axis is H.
    //!
    //! In explicit mode :
    //! Bit 0 corresponds to the N dimension boolean.
    //! Bit 1 corresponds to the C dimension boolean.
    //! Bit 2 corresponds to the H dimension boolean.
    //! Bit 3 corresponds to the W dimension boolean.
    //! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
    //! there are fewer than 3 axes. For example, if the input is NCHW, the default axis is C. If the input
    //! is NHW, then the default axis is N.
    //!
    //! For example, to perform softmax on axis R of a NPQRCHW input, set bit 2 with implicit batch mode,
    //! set bit 3 with explicit batch mode.
    //!
    //! \param axes The axis along which softmax is computed.
    //!        Here axes is a bitmap. For example, when doing softmax along axis 0, bit 0 is set to 1, axes = 1 << axis = 1.
    //!

According to an explanation found online:

Take NCHW as an example. To apply softmax along the H dimension, the mask written in NCHW order is 0010; in bitmap notation (bit 0 first) that is 0100, which as a shift is (1<<2).

To operate on the C dimension, the mask is 0100; in bit notation that is 0010, i.e. the shift (1<<1).

In my code, the tensor fed to the softmax has dimensions (1, 1, 2) (the fourth value printed by the debug line is just an unused slot of the Dims struct).

To operate on axis 2, the mask is 0010; in bit notation that is 0100, i.e. the shift (1<<2).
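
To make the axis-to-mask conversion mechanical, a tiny helper like the following can be used. This is my own sketch (axisToMask is a made-up name), not something from TensorRT:

#include <cstdint>

// Hypothetical helper: convert a PyTorch-style axis index into the
// one-hot bitmask that ISoftMaxLayer::setAxes expects.
inline uint32_t axisToMask(int axis) {
    return 1u << axis;   // e.g. axis 2 -> 0b100 == (1 << 2)
}

// softmax->setAxes(axisToMask(2));  // same as softmax->setAxes(1 << 2)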

Full code:

// softmax layer
ILayer* reshapeSoftmax(INetworkDefinition *network, ITensor &input, int c) {
    // Reshape to (1, 1, c) so the class scores sit on their own axis.
    IShuffleLayer *shuffleLayer1 = network->addShuffle(input);
    assert(shuffleLayer1);
    shuffleLayer1->setReshapeDimensions(Dims3(1, -1, c));

    // Sanity check: print the reshaped dimensions (should be 1 x 1 x c).
    Dims dim0 = shuffleLayer1->getOutput(0)->getDimensions();
    cout << "softmax input dims";
    for (int i = 0; i < dim0.nbDims; ++i) cout << " " << dim0.d[i];
    cout << endl;

    ISoftMaxLayer *softmax = network->addSoftMax(*shuffleLayer1->getOutput(0));
    assert(softmax);
    softmax->setAxes(1 << 2);   // bit 2 -> axis 2, the class dimension

    // Flatten back to a 1-D array.
    Dims dim_{};
    dim_.nbDims = 1;
    dim_.d[0] = -1;

    IShuffleLayer *shuffleLayer2 = network->addShuffle(*softmax->getOutput(0));
    assert(shuffleLayer2);
    shuffleLayer2->setReshapeDimensions(dim_);

    return shuffleLayer2;
}
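
For completeness, here is how the function might be hooked up. A hypothetical usage sketch: fc stands for the preceding fully connected layer and is my assumption, not code from the post:

// Hypothetical usage: attach reshape + softmax after the fc layer,
// then mark the flattened probabilities as the network output.
ILayer *prob = reshapeSoftmax(network, *fc->getOutput(0), 2);
prob->getOutput(0)->setName("prob");
network->markOutput(*prob->getOutput(0));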

For reference:

https://www.cnblogs.com/yanghailin/p/14486077.html

https://github.com/wang-xinyu/tensorrtx/blob/18fa419ae35bfcbd27248b3eb9329f415f604366/retinafaceAntiCov/retinafaceAntiCov.cpp
