Efficient Estimation of Word Representations in Vector Space is a 2013 paper by Mikolov et al. from Google that introduced Word2Vec, a family of computationally efficient methods for learning distributed word representations (word embeddings) from large text corpora. The paper proposed two novel architectures — Continuous Bag-of-Words (CBOW) and Skip-gram — that could be trained on billions of words in hours, producing vector representations that capture syntactic and semantic relationships between words, including the celebrated word analogy property: for example, vector("King") − vector("Man") + vector("Woman") yields a vector closest to vector("Queen").
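The two architectures differ in which direction they predict: CBOW predicts a target word from its surrounding context, while Skip-gram predicts each context word from the target. A minimal sketch of how the two models would extract training examples from a text window (the function name, window size, and toy sentence are illustrative assumptions, not from the paper):

```python
def training_pairs(tokens, window=2):
    """Illustrative helper: build (context, target) examples for CBOW
    and (target, context_word) examples for Skip-gram."""
    cbow, skipgram = [], []
    for i, target in enumerate(tokens):
        # Words within `window` positions on either side of the target.
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        cbow.append((context, target))   # CBOW: context predicts target
        for c in context:                # Skip-gram: target predicts each
            skipgram.append((target, c))  # context word separately
    return cbow, skipgram

sentence = "the quick brown fox jumps".split()
cbow_examples, skipgram_examples = training_pairs(sentence)
```

In the actual models these pairs feed a shallow neural network whose hidden-layer weights become the word vectors; Skip-gram generates more training examples per position, which is one reason it tends to do better on rare words at higher computational cost.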