Word2Vec: A Word is Worth a Thousand Vectors
That word vectors represent much of the information available in a dictionary definition is a convenient and almost miraculous side effect of trying to predict the context of a word. - article
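The quote refers to word2vec's training objective: each word is trained to predict its surrounding context words (skip-gram). Below is a minimal sketch of that setup using gensim (the library choice and hyperparameters are assumptions for illustration, not prescribed by the article; assumes gensim >= 4.0).

```python
# Minimal skip-gram word2vec sketch with gensim (illustrative corpus and settings).
from gensim.models import Word2Vec

sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["a", "dog", "chased", "the", "cat"],
]

# sg=1 selects skip-gram: each word is trained to predict its context words.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

print(model.wv["cat"])               # dense vector learned for "cat"
print(model.wv.most_similar("cat"))  # nearest neighbors by cosine similarity
```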
- text vector: summing the token vectors of a text gives a rough representation of its meaning (see the sketch below)
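A minimal sketch of that idea, assuming a lookup table of trained word vectors (the toy vectors below are placeholders; averaging is used instead of a raw sum so the result is invariant to text length):

```python
import numpy as np

def text_vector(tokens, wv, dim):
    """Combine token vectors (averaged here) into a single text vector."""
    vecs = [wv[t] for t in tokens if t in wv]  # skip out-of-vocabulary tokens
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

# toy lookup table standing in for trained word2vec vectors
wv = {"cat": np.array([1.0, 0.0]), "mat": np.array([0.0, 1.0])}
print(text_vector(["the", "cat", "sat", "mat"], wv, dim=2))  # -> [0.5 0.5]
```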
There is a strong conceptual relation between the bag-of-words representations of old and word2vec: compare the evolution from sparse document and term vectors to dense vectors like word2vec's (a sketch follows below). See Neural Word Embedding as Implicit Matrix Factorization.
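A numpy sketch of the classical count-based route: build a PPMI word-context matrix and factorize it with truncated SVD to get dense vectors. The function and its details are illustrative, following the Levy & Goldberg result that skip-gram with negative sampling implicitly factorizes a (shifted) PMI matrix.

```python
import numpy as np

def ppmi_svd_embeddings(cooc, dim=50):
    """cooc: (V, V) word-context co-occurrence counts -> (V, dim) dense vectors."""
    total = cooc.sum()
    pw = cooc.sum(axis=1, keepdims=True) / total   # P(word)
    pc = cooc.sum(axis=0, keepdims=True) / total   # P(context)
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.log((cooc / total) / (pw * pc))
    ppmi = np.maximum(pmi, 0)                      # keep positive PMI only
    ppmi[np.isnan(ppmi)] = 0                       # unseen words -> 0
    U, S, _ = np.linalg.svd(ppmi, full_matrices=False)
    return U[:, :dim] * np.sqrt(S[:dim])           # dense word vectors
```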
see also
- How might LLMs store facts - DL7
- The secret ingredients of word2vec (2016) - Word2vec is a pervasive tool for learning word embeddings. Its success, however, is mostly due to particular architecture choices. Transferring these choices to traditional distributional methods makes them competitive with popular word embedding methods.
Written on September 3, 2019, Last update on January 29, 2025
word2vec
AI
LLM
text
vector