Latest revision as of 23:34, 27 April 2026
where $ \mathbf{v}_w $ and $ \mathbf{v}'_w $ are the input and output embedding vectors. Computing the full softmax over the vocabulary is expensive, so two approximations are commonly used:
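To make the cost concrete, a minimal NumPy sketch of the full softmax is shown below: every prediction requires one dot product per vocabulary word, i.e. O(V·d) work, which is what motivates the cheaper approximations. The vocabulary size, embedding dimension, and random initialization here are illustrative assumptions, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 10_000, 100  # hypothetical vocabulary size and embedding dimension
V_in = rng.normal(scale=0.1, size=(V, d))   # input embeddings  v_w
V_out = rng.normal(scale=0.1, size=(V, d))  # output embeddings v'_w

def full_softmax(center_id):
    """P(w | center word) over the whole vocabulary.

    Costs one dot product per vocabulary entry (O(V * d)),
    which is the expense the approximations avoid.
    """
    scores = V_out @ V_in[center_id]  # shape (V,): score for every word
    scores -= scores.max()            # shift for numerical stability
    exp = np.exp(scores)
    return exp / exp.sum()

p = full_softmax(42)
```

The resulting `p` is a proper distribution over all V words; hierarchical softmax and negative sampling both replace this O(V) normalization with a much cheaper computation per training example.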