    Cross-entropy loss (also called log loss) is the most widely used loss function for classification tasks in machine learning. Rooted in information theory, it measures the dissimilarity between the true label distribution and the model's predicted probability distribution, providing a smooth, differentiable objective that drives probabilistic classifiers toward confident, correct predictions.
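As a minimal illustrative sketch (the function name `cross_entropy` and the example probabilities are assumptions, not from the source), the loss can be computed directly from its definition, \(H(p, q) = -\sum_i p_i \log q_i\), where \(p\) is the true label distribution and \(q\) is the model's predicted distribution:

```python
import math

def cross_entropy(true_dist, pred_dist, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i).

    true_dist: true label distribution (often one-hot for classification)
    pred_dist: predicted class probabilities (must sum to 1)
    eps: small floor to avoid log(0) on zero-probability predictions
    """
    return -sum(t * math.log(max(q, eps)) for t, q in zip(true_dist, pred_dist))

# Hypothetical example: one-hot true label for class 1 of 3 classes;
# the model assigns probability 0.7 to the correct class.
loss = cross_entropy([0.0, 1.0, 0.0], [0.2, 0.7, 0.1])
# For a one-hot target, the loss reduces to -log(q_correct) = -log(0.7)
```

With a one-hot target, only the correct class's term survives the sum, so the loss penalizes the model exactly for how little probability it placed on the true class; this is why driving the loss down drives the classifier toward confident, correct predictions.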