Translations:Word Embeddings/23/en
Revision as of 19:42, 27 April 2026
where $ \mathbf{v}_w $ and $ \mathbf{v}'_w $ are the input and output embedding vectors. Computing the full softmax over the vocabulary is expensive, so two approximations are commonly used:
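The cost being described can be seen directly in a minimal numerical sketch of the skip-gram softmax, <math>p(o \mid c) = \exp(\mathbf{v}'_o \cdot \mathbf{v}_c) / \sum_{w} \exp(\mathbf{v}'_w \cdot \mathbf{v}_c)</math>: every single training pair requires a dot product against all <math>|V|</math> output vectors. The vocabulary size, dimension, and variable names below are illustrative, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 10_000, 50  # illustrative vocabulary size and embedding dimension

# Input embeddings v_w and output embeddings v'_w, one row per word.
V_in = rng.normal(scale=0.1, size=(V, d))
V_out = rng.normal(scale=0.1, size=(V, d))

def softmax_over_vocab(center: int) -> np.ndarray:
    """Full softmax p(. | center): O(|V| * d) work for ONE training pair,
    because the normalizer sums over every word in the vocabulary."""
    scores = V_out @ V_in[center]      # |V| dot products
    scores -= scores.max()             # shift for numerical stability
    exp_scores = np.exp(scores)
    return exp_scores / exp_scores.sum()

probs = softmax_over_vocab(center=42)
# probs is a valid distribution over all 10,000 words; producing it touched
# every output vector, which is what the approximations avoid.
```

This is why the normalizer, not the dot product itself, is the bottleneck: the approximations mentioned next replace the sum over <math>|V|</math> terms with something much cheaper per update.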