A Summary of Common Loss Functions in NLP

I. All loss functions listed in the torch documentation

  1. torch.nn.AdaptiveLogSoftmaxWithLoss (Python class, in torch.nn)
  2. torch.nn.BCELoss (Python class, in torch.nn)
  3. torch.nn.BCEWithLogitsLoss (Python class, in torch.nn)
  4. torch.nn.CosineEmbeddingLoss (Python class, in torch.nn)
  5. torch.nn.CrossEntropyLoss (Python class, in torch.nn)
  6. torch.nn.CTCLoss (Python class, in torch.nn)
  7. torch.nn.HingeEmbeddingLoss (Python class, in torch.nn)
  8. torch.nn.KLDivLoss (Python class, in torch.nn)
  9. torch.nn.L1Loss (Python class, in torch.nn)
  10. torch.nn.MarginRankingLoss (Python class, in torch.nn)
  11. torch.nn.MSELoss (Python class, in torch.nn)
  12. torch.nn.MultiLabelMarginLoss (Python class, in torch.nn)
  13. torch.nn.MultiLabelSoftMarginLoss (Python class, in torch.nn)
  14. torch.nn.MultiMarginLoss (Python class, in torch.nn)
  15. torch.nn.NLLLoss (Python class, in torch.nn)
  16. torch.nn.PoissonNLLLoss (Python class, in torch.nn)
  17. torch.nn.SmoothL1Loss (Python class, in torch.nn)
  18. torch.nn.SoftMarginLoss (Python class, in torch.nn)
  19. torch.nn.TripletMarginLoss (Python class, in torch.nn)

  1. torch.nn.functional.cosine_embedding_loss (Python function, in torch.nn.functional)
  2. torch.nn.functional.ctc_loss (Python function, in torch.nn.functional)
  3. torch.nn.functional.hinge_embedding_loss (Python function, in torch.nn.functional)
  4. torch.nn.functional.l1_loss (Python function, in torch.nn.functional)
  5. torch.nn.functional.margin_ranking_loss (Python function, in torch.nn.functional)
  6. torch.nn.functional.mse_loss (Python function, in torch.nn.functional)
  7. torch.nn.functional.multi_margin_loss (Python function, in torch.nn.functional)
  8. torch.nn.functional.multilabel_margin_loss (Python function, in torch.nn.functional)
  9. torch.nn.functional.multilabel_soft_margin_loss (Python function, in torch.nn.functional)
  10. torch.nn.functional.nll_loss (Python function, in torch.nn.functional)
  11. torch.nn.functional.poisson_nll_loss (Python function, in torch.nn.functional)
  12. torch.nn.functional.smooth_l1_loss (Python function, in torch.nn.functional)
  13. torch.nn.functional.soft_margin_loss (Python function, in torch.nn.functional)
  14. torch.nn.functional.triplet_margin_loss (Python function, in torch.nn.functional)

Meaning of each loss:

  1. torch.nn.AdaptiveLogSoftmaxWithLoss (Python class, in torch.nn)
  2. torch.nn.BCELoss (Python class, in torch.nn)
  3. torch.nn.BCEWithLogitsLoss (Python class, in torch.nn)
  4. torch.nn.CosineEmbeddingLoss (Python class, in torch.nn) cosine embedding loss
  5. torch.nn.CrossEntropyLoss (Python class, in torch.nn)
  6. torch.nn.CTCLoss (Python class, in torch.nn) sequence classification; see https://blog.csdn.net/yifen4234/article/details/80334516
  7. torch.nn.HingeEmbeddingLoss (Python class, in torch.nn) hinge loss
  8. torch.nn.KLDivLoss (Python class, in torch.nn) KL-divergence loss
  9. torch.nn.L1Loss (Python class, in torch.nn)     L1 loss; see https://pytorch.org/docs/stable/nn.html?highlight=loss
  10. torch.nn.MarginRankingLoss (Python class, in torch.nn)
  11. torch.nn.MSELoss (Python class, in torch.nn) mean squared error (squared L2 norm) loss
  12. torch.nn.MultiLabelMarginLoss (Python class, in torch.nn) multi-label hinge loss
  13. torch.nn.MultiLabelSoftMarginLoss (Python class, in torch.nn) multi-label one-versus-all loss based on max entropy
  14. torch.nn.MultiMarginLoss (Python class, in torch.nn)
  15. torch.nn.NLLLoss (Python class, in torch.nn)
  16. torch.nn.PoissonNLLLoss (Python class, in torch.nn) negative log-likelihood with a Poisson-distributed target
  17. torch.nn.SmoothL1Loss (Python class, in torch.nn)    
  18. torch.nn.SoftMarginLoss (Python class, in torch.nn)
  19. torch.nn.TripletMarginLoss (Python class, in torch.nn)

 

 

II. Loss functions commonly used in NLP tasks

 

1. nn.CrossEntropyLoss() 

Suitable for multi-class classification (e.g., classification over a vocabulary).

Input - input x: shape (N, C), where C = num_classes is the total number of classes.

Input - target y: shape (N), where each value satisfies 0 ≤ targets[i] ≤ C − 1.
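A minimal usage sketch (shapes and values below are made up for illustration):

```python
import torch
import torch.nn as nn

# N = 3 samples, C = 5 classes; the input is raw logits (no softmax needed)
logits = torch.randn(3, 5)
targets = torch.tensor([1, 0, 4])   # class indices in [0, C - 1]

loss = nn.CrossEntropyLoss()(logits, targets)  # scalar, averaged over the batch
```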

 

2. nn.BCELoss()

Input - input x: shape (N, *), predicted probabilities for the binary (0/1) classes.

Input - target y: shape (N, *), where each value is a number between 0 and 1.

[More on cross entropy] link: https://cloud.tencent.com/developer/article/1126921
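A minimal sketch. Note that BCELoss expects probabilities, so the input is usually passed through a sigmoid first; BCEWithLogitsLoss fuses the sigmoid into the loss and is more numerically stable:

```python
import torch
import torch.nn as nn

logits = torch.randn(4)
targets = torch.tensor([0., 1., 1., 0.])   # float targets in [0, 1]

# BCELoss: the input must already be probabilities in (0, 1)
loss_bce = nn.BCELoss()(torch.sigmoid(logits), targets)

# BCEWithLogitsLoss: takes raw logits and applies the sigmoid internally
loss_bce_logits = nn.BCEWithLogitsLoss()(logits, targets)
```

The two calls compute the same value; the fused version avoids overflow for large-magnitude logits.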

 

3. nn.NLLLoss() negative log-likelihood (NLL) loss

Input: (N, C) where C = number of classes, or (N, C, d1, d2, ..., dK) with K ≥ 2 in the case of K-dimensional loss.

Target: (N) where each value satisfies 0 ≤ targets[i] ≤ C − 1, or (N, d1, d2, ..., dK) with K ≥ 2 in the case of K-dimensional loss.

The only difference between [1] and [3] is that [1] applies the log-softmax for us.

That is, CrossEntropyLoss() = log_softmax() + NLLLoss().

Note:

(a) The input to nn.NLLLoss() must be the output of F.log_softmax(); if you pass the output of F.softmax() instead, the loss becomes negative.

(b) Computing nn.NLLLoss() by hand: a good reference: http://www.pianshen.com/article/4916145350/
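Both the identity and the log_softmax caveat can be checked directly (random inputs for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(3, 5)
targets = torch.tensor([2, 0, 4])

# NLLLoss over log-probabilities == CrossEntropyLoss over raw logits
nll = nn.NLLLoss()(F.log_softmax(logits, dim=1), targets)
ce = nn.CrossEntropyLoss()(logits, targets)

# Passing softmax (not log_softmax) output yields a negative "loss",
# since NLLLoss just negates and gathers the target entries
wrong = nn.NLLLoss()(F.softmax(logits, dim=1), targets)
```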

 

4. nn.MSELoss()

Squared-error loss, commonly used with linear/regression models.
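A small hand-checkable sketch (values made up):

```python
import torch
import torch.nn as nn

pred = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([1.5, 2.0, 2.0])

# mean of squared differences: (0.5^2 + 0^2 + 1^2) / 3 = 1.25 / 3
loss = nn.MSELoss()(pred, target)
```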

 

5. nn.KLDivLoss()

KL divergence, also known as relative entropy, measures the distance between two distributions; the more similar they are, the closer the value is to zero.

Note that the input here must be log-probabilities (while the target is plain probabilities); at first I thought the API had it wrong.
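A sketch of the log-probability convention (tensor names are made up):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

p = F.softmax(torch.randn(2, 5), dim=1)          # target: plain probabilities
log_q = F.log_softmax(torch.randn(2, 5), dim=1)  # input: LOG-probabilities

kl = nn.KLDivLoss(reduction='batchmean')(log_q, p)

# KL(p || p) = 0: identical distributions give zero divergence
zero = nn.KLDivLoss(reduction='batchmean')(torch.log(p), p)
```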

 

6. nn.MarginRankingLoss()

A loss for evaluating similarity/ranking.

All three inputs here are scalars (per sample), and y can only be 1 or -1: y = 1 means x1 should be larger than x2, and vice versa. The margin parameter requires the two values to be at least margin apart; otherwise the loss is positive. margin defaults to 0.
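A hand-checkable sketch of the formula loss = max(0, -y * (x1 - x2) + margin):

```python
import torch
import torch.nn as nn

x1 = torch.tensor([1.0])
x2 = torch.tensor([0.2])
y = torch.tensor([1.0])   # y = 1: x1 should be larger than x2

# gap 0.8 >= margin 0.5, so loss = max(0, -0.8 + 0.5) = 0
ok = nn.MarginRankingLoss(margin=0.5)(x1, x2, y)

# gap 0.8 < margin 1.0, so loss = max(0, -0.8 + 1.0) = 0.2
short = nn.MarginRankingLoss(margin=1.0)(x1, x2, y)
```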

 

7. nn.MultiMarginLoss()

Multi-class hinge loss.

Here y denotes the label; the exponent p defaults to 1 and margin defaults to 1, though other values can be used. See the derivation of the SVM loss in the cs231n assignments.
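A hand-checkable sketch: with p = 1 and margin = 1, the per-sample loss is the sum over i ≠ y of max(0, margin − x[y] + x[i]), divided by the number of classes C:

```python
import torch
import torch.nn as nn

x = torch.tensor([[0.1, 0.2, 0.7]])   # scores for C = 3 classes
y = torch.tensor([2])                  # correct class index

loss = nn.MultiMarginLoss(p=1, margin=1.0)(x, y)
# (max(0, 1 - 0.7 + 0.1) + max(0, 1 - 0.7 + 0.2)) / 3 = (0.4 + 0.5) / 3 = 0.3
```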

 

 

 

Reference: http://www.mamicode.com/info-detail-2580425.html
