一、All loss functions provided in the torch documentation
- torch.nn.AdaptiveLogSoftmaxWithLoss (Python class, in torch.nn)
- torch.nn.BCELoss (Python class, in torch.nn)
- torch.nn.BCEWithLogitsLoss (Python class, in torch.nn)
- torch.nn.CosineEmbeddingLoss (Python class, in torch.nn)
- torch.nn.CrossEntropyLoss (Python class, in torch.nn)
- torch.nn.CTCLoss (Python class, in torch.nn)
- torch.nn.HingeEmbeddingLoss (Python class, in torch.nn)
- torch.nn.KLDivLoss (Python class, in torch.nn)
- torch.nn.L1Loss (Python class, in torch.nn)
- torch.nn.MarginRankingLoss (Python class, in torch.nn)
- torch.nn.MSELoss (Python class, in torch.nn)
- torch.nn.MultiLabelMarginLoss (Python class, in torch.nn)
- torch.nn.MultiLabelSoftMarginLoss (Python class, in torch.nn)
- torch.nn.MultiMarginLoss (Python class, in torch.nn)
- torch.nn.NLLLoss (Python class, in torch.nn)
- torch.nn.PoissonNLLLoss (Python class, in torch.nn)
- torch.nn.SmoothL1Loss (Python class, in torch.nn)
- torch.nn.SoftMarginLoss (Python class, in torch.nn)
- torch.nn.TripletMarginLoss (Python class, in torch.nn)
or, as their functional counterparts:
- torch.nn.functional.cosine_embedding_loss (Python function, in torch.nn.functional)
- torch.nn.functional.ctc_loss (Python function, in torch.nn.functional)
- torch.nn.functional.hinge_embedding_loss (Python function, in torch.nn.functional)
- torch.nn.functional.l1_loss (Python function, in torch.nn.functional)
- torch.nn.functional.margin_ranking_loss (Python function, in torch.nn.functional)
- torch.nn.functional.mse_loss (Python function, in torch.nn.functional)
- torch.nn.functional.multi_margin_loss (Python function, in torch.nn.functional)
- torch.nn.functional.multilabel_margin_loss (Python function, in torch.nn.functional)
- torch.nn.functional.multilabel_soft_margin_loss (Python function, in torch.nn.functional)
- torch.nn.functional.nll_loss (Python function, in torch.nn.functional)
- torch.nn.functional.poisson_nll_loss (Python function, in torch.nn.functional)
- torch.nn.functional.smooth_l1_loss (Python function, in torch.nn.functional)
- torch.nn.functional.soft_margin_loss (Python function, in torch.nn.functional)
- torch.nn.functional.triplet_margin_loss (Python function, in torch.nn.functional)
Loss meanings:
- torch.nn.AdaptiveLogSoftmaxWithLoss (Python class, in torch.nn) adaptive softmax, an efficient approximation for very large output spaces
- torch.nn.BCELoss (Python class, in torch.nn) binary cross-entropy loss
- torch.nn.BCEWithLogitsLoss (Python class, in torch.nn) binary cross-entropy with a built-in sigmoid
- torch.nn.CosineEmbeddingLoss (Python class, in torch.nn) cosine loss
- torch.nn.CrossEntropyLoss (Python class, in torch.nn) cross-entropy loss for multi-class classification
- torch.nn.CTCLoss (Python class, in torch.nn) sequence labeling (Connectionist Temporal Classification); see https://blog.csdn.net/yifen4234/article/details/80334516
- torch.nn.HingeEmbeddingLoss (Python class, in torch.nn) hinge loss
- torch.nn.KLDivLoss (Python class, in torch.nn) KL-divergence loss
- torch.nn.L1Loss (Python class, in torch.nn) L1 loss; see https://pytorch.org/docs/stable/nn.html?highlight=loss
- torch.nn.MarginRankingLoss (Python class, in torch.nn) ranking loss
- torch.nn.MSELoss (Python class, in torch.nn) mean squared error (squared L2) loss
- torch.nn.MultiLabelMarginLoss (Python class, in torch.nn) multi-class hinge loss for multi-label targets
- torch.nn.MultiLabelSoftMarginLoss (Python class, in torch.nn) multi-label one-versus-all loss based on max-entropy
- torch.nn.MultiMarginLoss (Python class, in torch.nn) multi-class hinge loss
- torch.nn.NLLLoss (Python class, in torch.nn) negative log-likelihood loss
- torch.nn.PoissonNLLLoss (Python class, in torch.nn) negative log-likelihood loss with a Poisson-distributed target
- torch.nn.SmoothL1Loss (Python class, in torch.nn) smooth L1 (Huber-style) loss
- torch.nn.SoftMarginLoss (Python class, in torch.nn) two-class logistic loss
- torch.nn.TripletMarginLoss (Python class, in torch.nn) triplet margin loss for metric learning
二、Loss functions commonly used in NLP tasks
1. nn.CrossEntropyLoss()
Suited to multi-class problems (e.g. classification over a vocabulary).
Input x: shape (N, C), where C = num_classes, the total number of classes.
Target y: shape (N), where each value satisfies 0 ≤ targets[i] ≤ C−1.
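A minimal sketch of these shapes; the sizes N=3, C=5 and the target indices are made up for illustration:

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
logits = torch.randn(3, 5)         # (N, C): raw scores, no softmax applied beforehand
targets = torch.tensor([1, 0, 4])  # (N): class indices in [0, C-1]
loss = criterion(logits, targets)  # scalar (reduction='mean' by default)
```

Note that the input is raw logits, not probabilities; the softmax is applied internally.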
2. nn.BCELoss()
Input x: shape (N, *), each value a probability in [0, 1] (e.g. a sigmoid output).
Target y: shape (N, *), each value between 0 and 1 (typically the class label 0 or 1).
[More on cross entropy] link: https://cloud.tencent.com/developer/article/1126921
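A small sketch (tensors chosen arbitrarily) contrasting BCELoss, which expects probabilities, with BCEWithLogitsLoss, which takes raw logits and fuses the sigmoid:

```python
import torch
import torch.nn as nn

logits = torch.randn(4)
labels = torch.tensor([1., 0., 1., 0.])  # targets in [0, 1]

loss_a = nn.BCELoss()(torch.sigmoid(logits), labels)  # inputs must be probabilities
loss_b = nn.BCEWithLogitsLoss()(logits, labels)       # fused sigmoid, numerically safer
```

The two losses agree; BCEWithLogitsLoss is generally preferred for numerical stability.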
3. nn.NLLLoss() negative log-likelihood loss
Input: (N, C) where C = number of classes, or (N, C, d1, d2, ..., dK) with K ≥ 2 in the case of a K-dimensional loss.
Target: (N) where each value is 0 ≤ targets[i] ≤ C−1, or (N, d1, d2, ..., dK) with K ≥ 2 in the case of a K-dimensional loss.
The only difference between [1] and [3] is that [1] applies the log-softmax for us,
i.e. CrossEntropyLoss() = log_softmax() + NLLLoss()
******note*****:
(a): the input to nn.NLLLoss() must be the output of F.log_softmax(); if you feed it the output of F.softmax() instead, the loss comes out negative.
(b): computing nn.NLLLoss() by hand: a good reference: http://www.pianshen.com/article/4916145350/
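The equivalence above can be checked directly (random tensors for illustration):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(3, 5)
targets = torch.tensor([2, 0, 4])

ce = F.cross_entropy(logits, targets)
nll = F.nll_loss(F.log_softmax(logits, dim=1), targets)  # same value as ce
```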
4. nn.MSELoss()
Squared loss, commonly used for regression with linear models.
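With hand-picked values (chosen here for illustration), the mean of the squared differences is easy to verify:

```python
import torch
import torch.nn as nn

pred = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([1.0, 2.0, 5.0])
loss = nn.MSELoss()(pred, target)  # (0^2 + 0^2 + 2^2) / 3 = 4/3
```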
5. nn.KLDivLoss()
KL divergence, also called relative entropy, measures the distance between two distributions; the more similar they are, the closer the loss is to zero.
Note that the input here must be log-probabilities (while the target is plain probabilities); at first I thought the API had it wrong.
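A quick sketch of the log-probability convention (the distributions are chosen arbitrarily); passing the target distribution itself gives a divergence of zero:

```python
import torch
import torch.nn.functional as F

p = torch.tensor([[0.4, 0.6]])  # target: plain probabilities
q = torch.tensor([[0.5, 0.5]])
kl = F.kl_div(q.log(), p, reduction='batchmean')    # input must be log-probabilities
zero = F.kl_div(p.log(), p, reduction='batchmean')  # identical distributions -> 0
```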
6. nn.MarginRankingLoss()
A loss for evaluating relative ranking/similarity.
All three inputs here are scalars (per element); y can only take 1 or −1: 1 means x1 should rank higher than x2, and −1 the reverse. The margin parameter requires the two scores to be at least margin apart; otherwise the loss is positive. margin defaults to 0.
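A sketch of the two cases (the values are made up), using loss = max(0, −y·(x1 − x2) + margin):

```python
import torch
import torch.nn as nn

rank = nn.MarginRankingLoss(margin=0.5)
x1 = torch.tensor([1.0])
x2 = torch.tensor([0.2])

loss_ok = rank(x1, x2, torch.tensor([1.0]))    # max(0, -(1.0-0.2)+0.5) = 0
loss_bad = rank(x1, x2, torch.tensor([-1.0]))  # max(0, (1.0-0.2)+0.5) = 1.3
```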
7. nn.MultiMarginLoss()
The multi-class Hinge loss,
where y is the target label, p defaults to 1, and margin defaults to 1; other values can also be used. See the derivation of the SVM loss in the cs231n assignments.
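A sketch with p=1 and margin=1 (the defaults; the scores are made up), matching loss = Σ_{j≠y} max(0, margin − x[y] + x[j]) / C:

```python
import torch
import torch.nn as nn

mm = nn.MultiMarginLoss(p=1, margin=1.0)  # the default settings, written out
x = torch.tensor([[0.1, 0.2, 0.7]])       # scores for C=3 classes
y = torch.tensor([2])                     # correct class
loss = mm(x, y)  # (max(0, 1-0.7+0.1) + max(0, 1-0.7+0.2)) / 3 = 0.3
```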