word2vec transformed NLP by establishing word embeddings as the standard input representation for neural NLP systems. Before word2vec, most NLP systems relied on sparse, high-dimensional representations like one-hot vectors or tf-idf. word2vec demonstrated that dense, low-dimensional vectors could capture rich linguistic structure and transfer meaningfully across tasks.
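To make the contrast concrete, the sketch below compares a sparse one-hot representation with a dense embedding on a hypothetical four-word vocabulary (the vocabulary, the embedding dimension, and the random matrix are illustrative stand-ins, not trained word2vec vectors):

```python
import numpy as np

# Hypothetical toy vocabulary; real systems use vocabularies of 10^5-10^6 words.
vocab = ["king", "queen", "man", "woman"]
V = len(vocab)

# Sparse one-hot representation: dimension equals vocabulary size,
# and every pair of distinct words is equally (dis)similar.
one_hot = np.eye(V)

# Dense embedding: a V x d matrix with small d (word2vec commonly used 100-300).
# Random values here stand in for vectors learned from a corpus.
d = 3
rng = np.random.default_rng(0)
E = rng.normal(size=(V, d))

def cos(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# One-hot vectors of distinct words are always orthogonal (similarity 0),
# so they cannot encode that "king" is closer to "queen" than to "woman".
print(cos(one_hot[0], one_hot[1]))  # 0.0

# Dense vectors can express graded similarity between any pair of words.
print(cos(E[0], E[1]))
```

A trained embedding matrix replaces the random `E` above; the key property is that similarity becomes a learnable, continuous quantity rather than being fixed at zero.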