Loss Functions: SoftMarginLoss

SoftMarginLoss

Used for binary classification tasks.

For a batch $D(x, y)$ of $N$ samples, where $x$ is the model output and $y$ contains the true class labels with values in $\{1, -1\}$, the $loss$ is computed as:

$$loss = \frac{\sum_{i} \log(1+\exp(-y[i] * x[i]))}{\text{x.nelement}()}$$

$\text{x.nelement}()$ is the number of elements in $x$.

If each sample corresponds to a single binary classification, then $\text{x.nelement}() = N$;
if each sample corresponds to $M$ binary classifications, then $\text{x.nelement}() = M * N$.
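As a quick check of the denominator, `nelement()` simply counts every entry of the tensor; the shapes below are illustrative:

```python
import torch

# Hypothetical batch: N = 2 samples, each with M = 4 binary decisions
x = torch.randn(2, 4)
print(x.nelement())  # 2 * 4 = 8 — the denominator in the loss formula
```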

  • When $x[i]$ and $y[i]$ have the same sign, i.e. the prediction is correct, $\exp(-y[i] * x[i]) < 1$, so $\log(1+\exp(-y[i] * x[i])) < \log 2$, which is small. Moreover, the larger the product $y[i] * x[i]$, the more confident the classification and the smaller the loss;

  • When $x[i]$ and $y[i]$ have opposite signs, i.e. the prediction is wrong, $\log(1+\exp(-y[i] * x[i]))$ is large;
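The two cases above can be checked numerically: larger margins $y[i] * x[i]$ drive the per-element term toward 0, while negative margins (wrong predictions) drive it up past $\log 2$. The margin values below are illustrative:

```python
import math

# Per-element SoftMarginLoss term log(1 + exp(-m)) as a function of the margin m = y*x
terms = {m: math.log(1 + math.exp(-m)) for m in [-2.0, -0.5, 0.5, 2.0]}
for margin, term in terms.items():
    print(f"margin={margin:+.1f}  term={term:.4f}")
```

The term decreases monotonically in the margin, crossing $\log 2 \approx 0.6931$ exactly at margin 0.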

The loss uses $\log(1+\exp(-y[i] * x[i]))$ rather than $\log(\exp(-y[i] * x[i]))$ to avoid negative loss values: the latter simplifies to $-y[i] * x[i]$, which is negative whenever the prediction is correct, whereas the former is always positive.
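A tiny numeric sketch of why the $+1$ matters, using an illustrative correct prediction:

```python
import math

y, x = 1.0, 2.0  # a correct, confident prediction (same sign, large product)
with_one = math.log(1 + math.exp(-y * x))  # SoftMarginLoss term: always > 0
without_one = math.log(math.exp(-y * x))   # simplifies to -y*x: negative here
print(with_one, without_one)
```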

Example:

import torch
import torch.nn as nn
import math

def validate_SoftMarginLoss(input, target):
    # Manually average log(1 + exp(-y * x)) over every element
    val = 0
    for li_x, li_y in zip(input, target):
        for x, y in zip(li_x, li_y):
            loss_val = math.log(1 + math.exp(-y * x))
            val += loss_val
    return val / input.nelement()

    
x = torch.FloatTensor([[0.1, 0.2, 0.4, 0.8], [0.1, 0.2, 0.4, 0.8]])
print(x.size())
y = torch.FloatTensor([[1, -1, 1, 1], [1, -1, 1, 1]])
print(y.size())

loss = nn.SoftMarginLoss(reduction="none")
loss_val = loss(x, y)
print(loss_val)

loss = nn.SoftMarginLoss(reduction="sum")
loss_val = loss(x, y)
print(loss_val.item())
print(loss_val.item() / x.nelement())

loss = nn.SoftMarginLoss(reduction="mean")
loss_val = loss(x, y)
print(loss_val.item())

valid_loss_val = validate_SoftMarginLoss(x, y)
print(valid_loss_val)

Output:

torch.Size([2, 4])
torch.Size([2, 4])
tensor([[0.6444, 0.7981, 0.5130, 0.3711],
        [0.6444, 0.7981, 0.5130, 0.3711]])
4.653303146362305
0.5816628932952881
0.5816628932952881
0.5816628606614725
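As a side note, $\log(1+\exp(-y * x))$ is exactly the softplus function applied to $-y * x$, so the same mean-reduced result can be reproduced with `torch.nn.functional.softplus` (a sketch, not part of the original example):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.FloatTensor([[0.1, 0.2, 0.4, 0.8], [0.1, 0.2, 0.4, 0.8]])
y = torch.FloatTensor([[1, -1, 1, 1], [1, -1, 1, 1]])

manual = F.softplus(-y * x).mean()               # log(1 + exp(-y*x)), averaged
builtin = nn.SoftMarginLoss(reduction="mean")(x, y)
print(torch.allclose(manual, builtin))  # True
```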