Translations:Softmax Function/8/en: Difference between revisions


    Latest revision as of 23:34, 27 April 2026

    Message definition (Softmax Function)
    The softmax function amplifies differences between {{Term|logits}}. A {{Term|logits|logit}} that is larger than its peers receives a disproportionately large share of the probability mass because the exponential function grows super-linearly. For example:
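The example the message promises is not included on this page. A minimal numerical sketch in Python, assuming the standard softmax definition (exponentiate each logit, then normalize), illustrates the amplification:

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability;
    # this shift does not change the resulting probabilities.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
probs = softmax(logits)
# The top logit (2.0) is only twice the second (1.0), but its
# probability share (~0.66) is almost three times the second's
# (~0.24): the exponential amplifies the gap between logits.
```

Note that the amplification depends only on the *differences* between logits, since the ratio of two probabilities is exp(x_i − x_j).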
