Translations:Softmax Function/1/en
Revision as of 22:02, 27 April 2026
The softmax function (also called the normalized exponential function) is a mathematical function that converts a vector of real numbers (logits) into a probability distribution. It is the standard output activation for multi-class classification in neural networks and plays a central role in models ranging from logistic regression to large language models.
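The conversion described above can be sketched in a few lines. This is a minimal illustration, not taken from the article: it exponentiates each logit (after subtracting the maximum, a standard trick that avoids floating-point overflow without changing the result) and divides by the sum so the outputs are non-negative and sum to 1.

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability;
    # softmax is invariant to adding a constant to every input.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Example: three logits mapped to a probability distribution.
probs = softmax([2.0, 1.0, 0.1])  # roughly [0.659, 0.242, 0.099]
```

Note that the largest logit receives the largest probability, and the outputs always sum to exactly 1 (up to rounding), which is what makes softmax suitable as the final layer of a multi-class classifier.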