Translations:Cross-Entropy Loss/27/en


    Naively computing $ \log(\mathrm{softmax}(z_k)) $ involves exponentiating potentially large logits, which can overflow. The log-sum-exp trick avoids this by subtracting the maximum logit $ m = \max_j z_j $ before exponentiating, which leaves the result unchanged but keeps every exponent at or below zero:

    $ \log(\mathrm{softmax}(z_k)) = z_k - m - \log \sum_j e^{z_j - m} $
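As a minimal sketch of this trick (the function name `log_softmax` is illustrative, not from the source), subtracting the maximum logit makes every exponent at most zero, so `exp` never overflows even for very large inputs:

```python
import math

def log_softmax(logits):
    """Numerically stable log-softmax using the log-sum-exp trick."""
    m = max(logits)  # subtracting the max keeps every exponent <= 0
    lse = m + math.log(sum(math.exp(z - m) for z in logits))
    return [z - lse for z in logits]

# Logits this large would overflow a naive exp(z):
print(log_softmax([1000.0, 1000.0, 999.0]))
```

A quick sanity check is that exponentiating the returned values gives probabilities that sum to 1, which holds here even though `math.exp(1000.0)` alone would raise an `OverflowError`.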