import torch
import torch.nn.functional as F

def balanced_softmax_loss(labels, logits, sample_per_class, reduction):
    """Compute the Balanced Softmax Loss between `logits` and the ground-truth `labels`.
    Args:
      labels: An int tensor of size [batch].
      logits: A float tensor of size [batch, no_of_classes].
      sample_per_class: An int tensor of size [no_of_classes].
      reduction: string. One of "none", "mean", "sum".
    Returns:
      loss: A float tensor. Balanced Softmax Loss.
    """
    # Shift each class logit by log(n_c), then apply standard cross-entropy.
    spc = sample_per_class.type_as(logits)
    logits = logits + spc.log().unsqueeze(0)
    loss = F.cross_entropy(input=logits, target=labels, reduction=reduction)
    return loss
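The key identity behind this loss is that the Balanced Softmax probability, phi_y = n_y * exp(z_y) / sum_j n_j * exp(z_j), equals an ordinary softmax over logits shifted by log(n_j). A minimal sketch verifying this equivalence numerically (toy logits and counts chosen for illustration):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
z = torch.randn(3)                    # logits for one sample, 3 classes
n = torch.tensor([100.0, 10.0, 1.0])  # long-tailed per-class sample counts

# Balanced Softmax computed directly from its definition.
direct = n * z.exp() / (n * z.exp()).sum()
# Ordinary softmax over logits shifted by log(n_j).
shifted = F.softmax(z + n.log(), dim=0)

assert torch.allclose(direct, shifted)
```

This is why the implementation only needs to add `sample_per_class.log()` to the logits and then call the standard cross-entropy.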
Balanced Meta-Softmax for Long-Tailed Visual Recognition