Loss function: measures the gap between the actual output and the target, and provides the signal for updating the network's weights (via backpropagation).
1. L1Loss()
Example: x = [1, 2, 3], y = [1, 2, 5]
L1Loss (mean) = (|1-1| + |2-2| + |3-5|) / 3 = 2/3 ≈ 0.6667
The reduction argument defaults to 'mean', which outputs the average of the elementwise errors; reduction='sum' outputs their sum, and reduction='none' returns the elementwise errors unreduced.
Note the shapes: the input and target may be tensors of any dimension, but the target must have the same shape as the input!
import torch
from torch.nn import L1Loss

# loss functions expect floating-point data
input = torch.tensor([1, 2, 3], dtype=torch.float32)
targets = torch.tensor([1, 2, 5], dtype=torch.float32)
input = torch.reshape(input, (1, 1, 1, 3))
targets = torch.reshape(targets, (1, 1, 1, 3))

# loss = L1Loss()               # default reduction='mean' -> tensor(0.6667)
loss = L1Loss(reduction='sum')  # -> tensor(2.)
result = loss(input, targets)
print(result)
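To see the three reduction modes side by side, here is a minimal sketch using the same data as above:

```python
import torch
from torch.nn import L1Loss

input = torch.tensor([1, 2, 3], dtype=torch.float32)
targets = torch.tensor([1, 2, 5], dtype=torch.float32)

# elementwise absolute errors, unreduced
print(L1Loss(reduction='none')(input, targets))  # tensor([0., 0., 2.])
# average of the errors (the default)
print(L1Loss(reduction='mean')(input, targets))  # tensor(0.6667)
# sum of the errors
print(L1Loss(reduction='sum')(input, targets))   # tensor(2.)
```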
2. MSELoss()
Example: x = [1, 2, 3], y = [1, 2, 5]
MSELoss (mean) = [(1-1)^2 + (2-2)^2 + (3-5)^2] / 3 = 4/3 ≈ 1.3333
import torch
from torch import nn

# loss functions expect floating-point data
input = torch.tensor([1, 2, 3], dtype=torch.float32)
targets = torch.tensor([1, 2, 5], dtype=torch.float32)
input = torch.reshape(input, (1, 1, 1, 3))
targets = torch.reshape(targets, (1, 1, 1, 3))

# MSELoss() with the default reduction='mean'
loss_mse = nn.MSELoss()
result_mse = loss_mse(input, targets)
print(result_mse)  # tensor(1.3333)
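As a sanity check, the same value can be computed directly from the formula above; this sketch is not part of the original code:

```python
import torch
from torch import nn

input = torch.tensor([1, 2, 3], dtype=torch.float32)
targets = torch.tensor([1, 2, 5], dtype=torch.float32)

# mean squared error by hand: average of the squared elementwise differences
manual = ((input - targets) ** 2).mean()
builtin = nn.MSELoss()(input, targets)
print(manual, builtin)  # both tensor(1.3333)
```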
3. CrossEntropyLoss (cross entropy)
torch.nn.CrossEntropyLoss(weight=None, size_average=None, ignore_index=- 100, reduce=None, reduction='mean', label_smoothing=0.0)
(Illustration from the Bilibili tutorial by 我是土堆.)
Suppose class person is 0 with score 0.1, class dog is 1 with score 0.2, and class cat is 2 with score 0.3. Note that the inputs are raw, unnormalized scores (logits), not probabilities; for one sample, loss(x, class) = -x[class] + log(Σⱼ exp(x[j])).
Note the input shapes: the input must be of shape (N, C) (batch size N, C classes), and the target of shape (N,) holding class indices.
import torch
from torch import nn

x = torch.tensor([0.1, 0.2, 0.3])  # raw scores for 3 classes
y = torch.tensor([1])              # target class index (dog)
x = torch.reshape(x, (1, 3))       # batch of one sample: shape (1, 3)
loss_cross = nn.CrossEntropyLoss()
result_cross = loss_cross(x, y)
print(result_cross)  # tensor(1.1019)
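The formula above can be verified by hand; a minimal sketch with illustrative variable names:

```python
import torch
from torch import nn

x = torch.tensor([[0.1, 0.2, 0.3]])  # scores for one sample, shape (1, 3)
y = torch.tensor([1])                # target class index

# manual cross entropy: -x[class] + log(sum(exp(x)))
manual = -x[0, 1] + torch.log(torch.exp(x[0]).sum())
builtin = nn.CrossEntropyLoss()(x, y)
print(manual, builtin)  # both ≈ 1.1019
```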