Translations:Softmax Function/8/en: Difference between revisions
Revision as of 19:42, 27 April 2026
The softmax function amplifies differences between logits. A logit that is larger than its peers receives a disproportionately large share of the probability mass because the exponential function grows super-linearly. For example:
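A minimal sketch in Python (not part of the original page) illustrating the amplification: a gap of 1 between two logits turns into a probability ratio of e ≈ 2.718, because softmax exponentiates each logit before normalizing.

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability; the result is unchanged
    # because the shift cancels in the normalization.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# The logits 2.0 and 1.0 differ by only 1, yet the first receives
# about e ≈ 2.718 times as much probability mass as the second.
probs = softmax([2.0, 1.0, 0.5])
ratio = probs[0] / probs[1]
```

The ratio between any two softmax probabilities depends only on the difference of their logits, exp(x_i - x_j), which is why larger logits claim a super-linear share of the mass.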