Word embeddings are dense, low-dimensional vector representations of words in which semantically similar words are mapped to nearby points in the vector space. They are a foundational component of modern natural language processing (NLP), replacing sparse one-hot encodings with representations that capture meaning, analogy, and syntactic relationships.
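
As a minimal sketch of this geometric intuition, the Python example below uses hand-picked 4-dimensional vectors to show how cosine similarity makes related words "nearby" and how vector arithmetic can recover analogies such as king − man + woman ≈ queen. All vector values here are hypothetical, chosen purely for illustration; real embeddings are learned from large corpora and typically have hundreds of dimensions.

import numpy as np

# Hypothetical 4-dimensional embeddings. Real models learn vectors of
# 100-1000 dimensions from data; these toy values are hand-chosen
# only to illustrate the geometry of the vector space.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.78, 0.60, 0.90, 0.08]),
    "man":   np.array([0.10, 0.70, 0.05, 0.60]),
    "woman": np.array([0.08, 0.65, 0.85, 0.62]),
    "apple": np.array([0.02, 0.05, 0.10, 0.95]),
}

def cosine(u, v):
    # Cosine of the angle between two vectors: close to 1 means "nearby".
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically similar words sit near each other in the space...
print(cosine(embeddings["king"], embeddings["queen"]))  # high (~0.80)
print(cosine(embeddings["king"], embeddings["apple"]))  # low  (~0.11)

# ...and vector arithmetic captures analogies: king - man + woman ~ queen.
target = embeddings["king"] - embeddings["man"] + embeddings["woman"]
best = max((w for w in embeddings if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, embeddings[w]))
print(best)  # "queen"

In practice such vectors are not hand-crafted but learned by models such as word2vec, GloVe, or fastText, which produce the dense representations described above.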