**How to use Soft-label for Cross-Entropy loss? - PyTorch Forums**
https://discuss.pytorch.org/t/how-to-use-soft-label-for-cross-entropy-loss/72844

Mar 11, 2020 · Soft Cross Entropy Loss (TF has it, does PyTorch have it?) TensorFlow's `tf.nn.softmax_cross_entropy_with_logits` supports cross-entropy loss with soft (non one-hot) labels, e.g. `logits = [[4.0, 2.0, 1.0], [0.0, 5.0, 1.0]]`, `labels = [[1.0, 0.0, 0.0], [0.0, 0.8, 0.2]]`, then `tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)`. Can we do the same thing in PyTorch?
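A minimal sketch of how this can be done in PyTorch (assuming PyTorch is installed; since version 1.10, `F.cross_entropy` itself accepts class probabilities as targets, and for older versions the loss can be computed manually from `log_softmax`):

```python
import torch
import torch.nn.functional as F

# The example values from the TF snippet above
logits = torch.tensor([[4.0, 2.0, 1.0], [0.0, 5.0, 1.0]])
labels = torch.tensor([[1.0, 0.0, 0.0], [0.0, 0.8, 0.2]])

# Manual soft-label cross entropy: -sum_k p_k * log_softmax(logits)_k per row.
# Works on any PyTorch version.
loss_manual = -(labels * F.log_softmax(logits, dim=1)).sum(dim=1)

# Since PyTorch 1.10, cross_entropy accepts probability targets directly.
loss_builtin = F.cross_entropy(logits, labels, reduction="none")

print(loss_manual)   # ≈ tensor([0.1698, 0.8247]), matching TF's output
print(loss_builtin)  # same values
```

Both variants reproduce `tf.nn.softmax_cross_entropy_with_logits` on this input; use `reduction="mean"` (the default) to get a scalar loss for training.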
