1. torch.nn.L1Loss
Mean absolute error: the element-wise absolute differences between the output and the target form a vector; with reduction='mean' the loss is the mean of that vector, and with reduction='sum' it is the sum.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.L1Loss.html#torch.nn.L1Loss
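A minimal, hand-checked sketch of the two reductions (values are made up):

```python
import torch
import torch.nn as nn

pred = torch.tensor([1.0, 2.0, 5.0])
target = torch.tensor([1.0, 4.0, 1.0])
# element-wise |pred - target| = [0, 2, 4]
loss_mean = nn.L1Loss(reduction='mean')(pred, target)  # mean of [0, 2, 4] -> 2.0
loss_sum = nn.L1Loss(reduction='sum')(pred, target)    # sum of [0, 2, 4] -> 6.0
```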
2. torch.nn.MSELoss
Mean squared error: the element-wise squared differences between the output and the target form a vector; with reduction='mean' the loss is the mean of that vector, and with reduction='sum' it is the sum.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.MSELoss.html#torch.nn.MSELoss
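A minimal sketch with made-up values, analogous to the L1 case but squaring each difference:

```python
import torch
import torch.nn as nn

pred = torch.tensor([0.0, 3.0])
target = torch.tensor([0.0, 1.0])
# element-wise (pred - target)^2 = [0, 4]
loss_mean = nn.MSELoss(reduction='mean')(pred, target)  # mean of [0, 4] -> 2.0
loss_sum = nn.MSELoss(reduction='sum')(pred, target)    # sum of [0, 4] -> 4.0
```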
3. torch.nn.CrossEntropyLoss
Cross entropy between predicted class scores (logits) and target classes; well suited to classification.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss
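A minimal sketch with illustrative logits; note that the input is raw scores, not probabilities (the softmax is applied internally):

```python
import torch
import torch.nn as nn

logits = torch.tensor([[2.0, 0.5, 0.1]])  # raw scores for 3 classes, batch of 1
target = torch.tensor([0])                # correct class index
loss = nn.CrossEntropyLoss()(logits, target)
# equivalent to -log_softmax(logits)[0, target]
```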
4. torch.nn.CTCLoss
Connectionist Temporal Classification loss; used in speech recognition, text recognition, and other sequence tasks where no frame-level alignment between input and target is available.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.CTCLoss.html#torch.nn.CTCLoss
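A minimal sketch of the expected tensor shapes, using random log-probabilities (the dimensions and class count are made up for illustration):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
T, N, C = 10, 2, 5  # time steps, batch size, classes (index 0 = blank)
log_probs = torch.randn(T, N, C).log_softmax(2)       # (T, N, C) log-probabilities
targets = torch.tensor([[1, 2], [3, 4]])              # label sequences per batch item
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.tensor([2, 2])
loss = nn.CTCLoss(blank=0)(log_probs, targets, input_lengths, target_lengths)
# the loss is a negative log likelihood, so it is non-negative
```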
5. torch.nn.NLLLoss
Negative log likelihood loss; suited to classification. It expects log-probabilities as input, so it is typically preceded by a log_softmax.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.NLLLoss.html#torch.nn.NLLLoss
https://blog.csdn.net/qq_22210253/article/details/85229988
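A minimal sketch showing that log_softmax followed by NLLLoss is the same computation as CrossEntropyLoss (illustrative values):

```python
import torch
import torch.nn as nn

logits = torch.tensor([[2.0, 0.5, 0.1]])
target = torch.tensor([0])
# NLLLoss expects log-probabilities, so apply log_softmax first
log_probs = torch.log_softmax(logits, dim=1)
loss = nn.NLLLoss()(log_probs, target)
```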
6. torch.nn.PoissonNLLLoss
Negative log likelihood loss with a Poisson distribution of the target.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.PoissonNLLLoss.html#torch.nn.PoissonNLLLoss
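A minimal sketch with made-up values; with the default log_input=True the per-element loss is exp(input) − target·input (the Stirling approximation term is omitted by default):

```python
import torch
import torch.nn as nn

log_rate = torch.tensor([0.5, 1.0])   # model output, interpreted as log of the rate
target = torch.tensor([1.0, 2.0])     # observed counts
loss = nn.PoissonNLLLoss(log_input=True)(log_rate, target)
# mean over elements of exp(log_rate) - target * log_rate
```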
7. torch.nn.GaussianNLLLoss
Gaussian negative log likelihood loss; the model predicts both a mean and a variance for each target.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.GaussianNLLLoss.html#torch.nn.GaussianNLLLoss
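A minimal sketch with unit variance (values made up); the per-element loss is 0.5·(log var + (pred − target)²/var), so with var = 1 only the squared-error term survives:

```python
import torch
import torch.nn as nn

pred = torch.tensor([0.0, 1.0])     # predicted means
target = torch.tensor([0.5, 1.5])
var = torch.ones(2)                 # predicted variances (here fixed to 1)
loss = nn.GaussianNLLLoss()(pred, target, var)
# 0.5 * mean([0.25, 0.25]) = 0.125
```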
8. torch.nn.KLDivLoss
The Kullback-Leibler divergence loss; measures how one probability distribution diverges from another.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.KLDivLoss.html#torch.nn.KLDivLoss
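A minimal sketch (distributions are made up); note the argument order: the input is the log-probabilities of the approximating distribution q, the target is the reference distribution p, giving KL(p ‖ q):

```python
import torch
import torch.nn as nn

p = torch.tensor([[0.25, 0.75]])                 # target distribution
q_log = torch.log(torch.tensor([[0.5, 0.5]]))    # input: log-probabilities
loss = nn.KLDivLoss(reduction='batchmean')(q_log, p)
# KL(p || q) = sum p * (log p - log q), averaged over the batch
```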
9. torch.nn.BCELoss
Measures the binary cross entropy between the target and the output; the input must already be probabilities in [0, 1].
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.BCELoss.html#torch.nn.BCELoss
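A minimal sketch with made-up probabilities; the per-element loss is −(y·log p + (1−y)·log(1−p)):

```python
import torch
import torch.nn as nn

probs = torch.tensor([0.9, 0.2])    # already passed through a sigmoid
target = torch.tensor([1.0, 0.0])
loss = nn.BCELoss()(probs, target)
```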
10. torch.nn.BCEWithLogitsLoss
Combines a sigmoid layer and BCELoss in a single class; numerically more stable than applying them separately.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.BCEWithLogitsLoss.html#torch.nn.BCEWithLogitsLoss
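A minimal sketch showing the equivalence with sigmoid + BCELoss (illustrative logits):

```python
import torch
import torch.nn as nn

logits = torch.tensor([2.0, -1.0])  # raw scores, no sigmoid applied
target = torch.tensor([1.0, 0.0])
loss = nn.BCEWithLogitsLoss()(logits, target)
```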
11. torch.nn.MarginRankingLoss
Used in GANs, ranking tasks, and similar settings where one input should score higher than another.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.MarginRankingLoss.html#torch.nn.MarginRankingLoss
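A minimal, hand-checked sketch (values made up); the per-element loss is max(0, −y·(x1 − x2) + margin), where y = 1 means x1 should rank higher:

```python
import torch
import torch.nn as nn

x1 = torch.tensor([1.0, 0.2])
x2 = torch.tensor([0.5, 0.8])
y = torch.tensor([1.0, 1.0])  # x1 should outrank x2 in both pairs
loss = nn.MarginRankingLoss(margin=0.0)(x1, x2, y)
# per-element: [max(0, -0.5), max(0, 0.6)] = [0, 0.6]; mean = 0.3
```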
12. torch.nn.HuberLoss
More robust to outliers than MSE: quadratic for small errors, linear for large ones (switching at delta).
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.HuberLoss.html#torch.nn.HuberLoss
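A minimal, hand-checked sketch (values made up) showing both regimes: |error| ≤ delta gives 0.5·e², |error| > delta gives delta·(|e| − 0.5·delta):

```python
import torch
import torch.nn as nn

pred = torch.tensor([0.0, 0.0])
target = torch.tensor([0.5, 3.0])   # one small error, one outlier
loss = nn.HuberLoss(delta=1.0)(pred, target)
# per-element: [0.5 * 0.25, 1 * (3 - 0.5)] = [0.125, 2.5]; mean = 1.3125
```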
13. torch.nn.SmoothL1Loss
Used in Fast R-CNN for bounding-box regression; a beta-parameterized variant of the Huber loss.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.SmoothL1Loss.html#torch.nn.SmoothL1Loss
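A minimal sketch (same made-up values as the Huber example); with beta = 1 SmoothL1Loss produces the same value as HuberLoss with delta = 1:

```python
import torch
import torch.nn as nn

pred = torch.tensor([0.0, 0.0])
target = torch.tensor([0.5, 3.0])
loss = nn.SmoothL1Loss(beta=1.0)(pred, target)
# for beta = delta = 1 this coincides with HuberLoss
```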
14. torch.nn.SoftMarginLoss
Two-class classification logistic loss, with labels in {-1, +1}.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.SoftMarginLoss.html#torch.nn.SoftMarginLoss
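A minimal sketch (values made up); the per-element loss is log(1 + exp(−y·x)):

```python
import torch
import torch.nn as nn

x = torch.tensor([2.0, -1.0])       # raw scores
y = torch.tensor([1.0, -1.0])       # labels in {-1, +1}
loss = nn.SoftMarginLoss()(x, y)
```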
15. torch.nn.MultiLabelSoftMarginLoss
Multi-label version of SoftMarginLoss: a one-vs-all binary loss over multi-hot targets.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.MultiLabelSoftMarginLoss.html#torch.nn.MultiLabelSoftMarginLoss
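A minimal sketch (values made up); each class is treated as an independent sigmoid binary-cross-entropy problem, averaged over classes:

```python
import torch
import torch.nn as nn

logits = torch.tensor([[1.0, -0.5, 2.0]])
targets = torch.tensor([[1.0, 0.0, 1.0]])  # multi-hot: classes 0 and 2 are active
loss = nn.MultiLabelSoftMarginLoss()(logits, targets)
```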
16. torch.nn.CosineEmbeddingLoss
A loss based on cosine distance; used for comparing embedding vectors (pull similar pairs together, push dissimilar pairs apart).
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.CosineEmbeddingLoss.html#torch.nn.CosineEmbeddingLoss
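A minimal sketch (values made up); for a "similar" pair (y = 1) the loss is 1 − cos(a, b), so identical vectors give zero:

```python
import torch
import torch.nn as nn

a = torch.tensor([[1.0, 0.0]])
b = torch.tensor([[1.0, 0.0]])
y = torch.tensor([1.0])  # 1: pair should be similar, -1: dissimilar
loss = nn.CosineEmbeddingLoss()(a, b, y)
# cos(a, b) = 1, so the loss is 0
```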
17. torch.nn.MultiMarginLoss
Optimizes a multi-class classification hinge loss.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.MultiMarginLoss.html#torch.nn.MultiMarginLoss
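A minimal, hand-checked sketch (values made up); for each wrong class i the loss contributes max(0, margin − x[y] + x[i]), averaged over the number of classes:

```python
import torch
import torch.nn as nn

x = torch.tensor([[0.1, 0.2, 0.7]])  # scores for 3 classes
y = torch.tensor([2])                # correct class
loss = nn.MultiMarginLoss(margin=1.0)(x, y)
# (max(0, 1 - 0.7 + 0.1) + max(0, 1 - 0.7 + 0.2)) / 3 = (0.4 + 0.5) / 3 = 0.3
```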
18. torch.nn.TripletMarginLoss
Used in image retrieval and metric learning: pushes the anchor-positive distance below the anchor-negative distance by at least a margin.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.TripletMarginLoss.html#torch.nn.TripletMarginLoss
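A minimal sketch (points made up); the loss is max(0, d(a, p) − d(a, n) + margin), so a triplet that already satisfies the margin yields zero:

```python
import torch
import torch.nn as nn

anchor = torch.tensor([[0.0, 0.0]])
positive = torch.tensor([[0.0, 1.0]])   # distance 1 from anchor
negative = torch.tensor([[0.0, 3.0]])   # distance 3 from anchor
loss = nn.TripletMarginLoss(margin=1.0)(anchor, positive, negative)
# max(0, 1 - 3 + 1) = 0: the margin is already satisfied
```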
19. torch.nn.TripletMarginWithDistanceLoss
Same idea as TripletMarginLoss, but with a user-supplied distance function instead of the built-in p-norm.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.TripletMarginWithDistanceLoss.html#torch.nn.TripletMarginWithDistanceLoss
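A minimal sketch (points made up) plugging in cosine distance as the custom distance function:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# custom distance: cosine distance = 1 - cosine similarity
cos_dist = lambda x, y: 1.0 - F.cosine_similarity(x, y)
criterion = nn.TripletMarginWithDistanceLoss(distance_function=cos_dist, margin=0.5)

anchor = torch.tensor([[1.0, 0.0]])
positive = torch.tensor([[1.0, 0.1]])   # nearly the same direction as the anchor
negative = torch.tensor([[0.0, 1.0]])   # orthogonal to the anchor
loss = criterion(anchor, positive, negative)
# d(a,p) ~ 0.005, d(a,n) = 1, so max(0, 0.005 - 1 + 0.5) = 0
```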
20. torch.nn.CosineSimilarity
Cosine similarity between vectors; a similarity measure rather than a loss by itself.
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.CosineSimilarity.html#torch.nn.CosineSimilarity
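A minimal sketch (vectors made up); orthogonal vectors have cosine similarity 0:

```python
import torch
import torch.nn as nn

cos = nn.CosineSimilarity(dim=1)
a = torch.tensor([[1.0, 0.0]])
b = torch.tensor([[0.0, 1.0]])
sim = cos(a, b)  # orthogonal vectors -> similarity 0
```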
21. torch.nn.PairwiseDistance
Batchwise p-norm distance between pairs of vectors (Euclidean by default, p=2).
Implementation and details in the official docs:
https://pytorch.org/docs/stable/generated/torch.nn.PairwiseDistance.html#torch.nn.PairwiseDistance
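A minimal sketch (points made up); with p=2 this is the Euclidean distance per batch row (up to a small eps added internally for numerical stability):

```python
import torch
import torch.nn as nn

pdist = nn.PairwiseDistance(p=2)
a = torch.tensor([[0.0, 0.0]])
b = torch.tensor([[3.0, 4.0]])
d = pdist(a, b)  # classic 3-4-5 triangle: distance ~ 5
```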