Naively computing $\log(\mathrm{softmax}(z_k))$ requires exponentiating the raw logits, and $e^{z_j}$ overflows for even moderately large $z_j$ (above roughly 709 in float64). The log-sum-exp trick avoids this:
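$$ \log(\mathrm{softmax}(z_k)) = z_k - \log \sum_j e^{z_j} = z_k - m - \log \sum_j e^{z_j - m}, \qquad m = \max_j z_j $$

Subtracting the maximum logit makes every exponent at most zero, so no term can overflow. A minimal NumPy sketch of this stabilization follows; the helper name `log_softmax` is illustrative, not part of the article:

```python
import numpy as np

def log_softmax(z: np.ndarray) -> np.ndarray:
    """Numerically stable log-softmax via the log-sum-exp trick.

    Subtracting max(z) before exponentiating keeps every exponent
    <= 0, so np.exp never overflows.
    """
    m = np.max(z)
    # log(sum_j exp(z_j)) = m + log(sum_j exp(z_j - m))
    log_sum_exp = m + np.log(np.sum(np.exp(z - m)))
    return z - log_sum_exp

# Example: the naive computation fails on large logits.
z = np.array([1000.0, 1001.0, 1002.0])
# np.log(np.exp(z) / np.sum(np.exp(z)))  -> nan, since exp(1000) overflows to inf
print(log_softmax(z))  # stable: [-2.4076..., -1.4076..., -0.4076...]
```

Because the stabilized form works directly in log space, it also avoids the underflow that would send $\mathrm{softmax}(z_k)$ to exactly zero for very negative logits and make the subsequent $\log$ return $-\infty$.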