import timm
import torch
import torch.nn.functional as F
from timm.loss import LabelSmoothingCrossEntropy, SoftTargetCrossEntropy
from timm.data.mixup import mixup_target
Same as NLL loss, but with label smoothing. Relative to plain cross entropy, label smoothing increases the loss on predictions that are correct (x in the example below) and decreases it on predictions that are incorrect (x_i). Use it when you don't want to punish the model as harshly, for example when some labels in the dataset are expected to be wrong.
x = torch.eye(2)    # logits that favor the correct class for each sample
x_i = 1 - x         # logits that favor the wrong class
y = torch.arange(2) # targets: class 0 for the first sample, class 1 for the second

# smoothing=0.0 is plain cross entropy; smoothing=0.1 raises the loss on the
# correct predictions (x) and lowers it on the incorrect ones (x_i)
LabelSmoothingCrossEntropy(0.0)(x, y), LabelSmoothingCrossEntropy(0.0)(x_i, y)
LabelSmoothingCrossEntropy(0.1)(x, y), LabelSmoothingCrossEntropy(0.1)(x_i, y)
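
To make the behavior concrete, here is a minimal sketch of the computation, assuming the standard formulation (1 - smoothing) * nll_loss + smoothing * smooth_loss that timm implements; manual_label_smoothing is just an illustrative name, not part of timm.

def manual_label_smoothing(logits, target, smoothing=0.1):
    logprobs = F.log_softmax(logits, dim=-1)
    # negative log-probability of the true class for each sample
    nll_loss = -logprobs.gather(dim=-1, index=target.unsqueeze(1)).squeeze(1)
    # mean negative log-probability over all classes (the "smooth" part)
    smooth_loss = -logprobs.mean(dim=-1)
    return ((1.0 - smoothing) * nll_loss + smoothing * smooth_loss).mean()

# should match LabelSmoothingCrossEntropy(0.1)(x, y) above
manual_label_smoothing(x, y, smoothing=0.1)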
A log_softmax-family loss function intended to be used with mixup, where the targets are probability distributions rather than hard class indices. Use mixup_target to build those soft targets: its lam argument controls how much the pair of labels is mixed, and smoothing adds label smoothing on top.
# two logit examples for a 5-class problem: one peaked on classes 1 and 4, one uniform
x = torch.tensor([[[0, 1., 0, 0, 1.]], [[1., 1., 1., 1., 1.]]], device='cuda')
# soft targets for labels 1 and 4, mixed 70% / 30% with the flipped batch
y = mixup_target(torch.tensor([1, 4], device='cuda'), 5, lam=0.7)
x, y

# logits peaked on the mixed-in classes give a lower loss than uniform logits
SoftTargetCrossEntropy()(x[0], y), SoftTargetCrossEntropy()(x[1], y)
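
Under the hood the loss is just cross entropy against a probability distribution; a minimal sketch, assuming timm's formulation of summing -target * log_softmax(x) over the class dimension (manual_soft_target_ce is an illustrative name, not part of timm):

def manual_soft_target_ce(logits, target):
    # per-sample cross entropy against soft targets, averaged over the batch
    return torch.sum(-target * F.log_softmax(logits, dim=-1), dim=-1).mean()

# should match SoftTargetCrossEntropy()(x[1], y) above
manual_soft_target_ce(x[1], y)

mixup_target itself builds y by one-hot encoding target and target.flip(0) (mixup pairs each sample with the batch in reverse order) and blending them as lam * y1 + (1 - lam) * y2; passing smoothing > 0 additionally replaces the 0/1 values with smoothing / num_classes and 1 - smoothing + smoothing / num_classes.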