Visualization of the effect of label smoothing on ImageNet. Top: as ε increases, the theoretical gap between the target class and the other classes shrinks. Bottom: the empirical distribution of the gap between the maximum prediction and the average of the other classes. With label smoothing, the distribution is clearly centered on the theoretical value and has fewer extreme values.
Loss: label smoothing applied on top of cross entropy
import torch
import torch.nn as nn

class CrossEntropyLabelSmooth(nn.Module):
    """Cross-entropy loss with label smoothing: the one-hot target becomes
    (1 - epsilon) on the true class and epsilon / num_classes elsewhere."""

    def __init__(self, num_classes, epsilon=0.1, use_gpu=True):
        super(CrossEntropyLabelSmooth, self).__init__()
        self.num_classes = num_classes
        self.epsilon = epsilon
        self.use_gpu = use_gpu
        self.logsoftmax = nn.LogSoftmax(dim=1)

    def forward(self, inputs, targets):
        # inputs: logits of shape (batch_size, num_classes)
        # targets: class indices of shape (batch_size,)
        log_probs = self.logsoftmax(inputs)
        # Build one-hot targets on the same device and dtype as the logits
        targets = torch.zeros_like(log_probs).scatter_(1, targets.unsqueeze(1), 1)
        if self.use_gpu:
            targets = targets.cuda()
        # Soften the one-hot labels
        targets = (1 - self.epsilon) * targets + self.epsilon / self.num_classes
        # Average over the batch, sum over classes
        loss = (-targets * log_probs).mean(0).sum()
        return loss
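As a quick sanity check (a minimal sketch, not part of the original text), this loss should match PyTorch's built-in `nn.CrossEntropyLoss(label_smoothing=...)` (available since PyTorch 1.10), which softens the one-hot target with the same (1 − ε)·onehot + ε/K formula. The compact CPU-only copy of the class below exists only to keep the example self-contained:

```python
import torch
import torch.nn as nn

# Compact CPU-only copy of the loss above, so the example runs standalone.
class CrossEntropyLabelSmooth(nn.Module):
    def __init__(self, num_classes, epsilon=0.1):
        super().__init__()
        self.num_classes = num_classes
        self.epsilon = epsilon
        self.logsoftmax = nn.LogSoftmax(dim=1)

    def forward(self, inputs, targets):
        log_probs = self.logsoftmax(inputs)
        one_hot = torch.zeros_like(log_probs).scatter_(1, targets.unsqueeze(1), 1)
        smoothed = (1 - self.epsilon) * one_hot + self.epsilon / self.num_classes
        return (-smoothed * log_probs).mean(0).sum()

torch.manual_seed(0)
logits = torch.randn(4, 10)          # batch of 4, 10 classes
labels = torch.tensor([1, 3, 5, 7])  # ground-truth class indices

ours = CrossEntropyLabelSmooth(num_classes=10, epsilon=0.1)(logits, labels)
builtin = nn.CrossEntropyLoss(label_smoothing=0.1)(logits, labels)
print(float(ours), float(builtin))
```

The two values agree because both compute, per sample, −Σₖ qₖ · log pₖ with q = (1 − ε)·onehot + ε/K, then average over the batch.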