Word2Vec Word Vector Math
Google's Word2Vec is a popular machine learning model used to generate word embeddings, which are vector representations of words that capture semantic meaning based on how the words are used in text. The approach uses neural networks to learn word associations from large corpora of text. Here is a breakdown of how Word2Vec works:
1. Model Architecture
Word2Vec offers two model architectures:
- CBOW (Continuous Bag of Words): In this architecture, the model predicts the current word from its context (the surrounding words); the inputs to the model are the context words, and the output is the target word.
- Skip-gram: The skip-gram model works the opposite way; it uses the current word to predict the surrounding context words. This generally works better for less frequent words. A minimal training sketch contrasting the two architectures follows this list.
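Both architectures can be trained with the gensim library, which is not part of this article; in the sketch below the toy corpus and the hyperparameter values are purely illustrative, and the sg flag switches between CBOW (0) and skip-gram (1).

    # A minimal gensim sketch (gensim is an assumption, not used by this article);
    # the toy corpus and hyperparameters are illustrative only.
    from gensim.models import Word2Vec

    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "beside", "the", "king"],
    ]

    # sg=0 selects CBOW, sg=1 selects skip-gram.
    cbow_model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)
    skipgram_model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

    print(cbow_model.wv["king"][:5])                        # first few dimensions of the CBOW embedding
    print(skipgram_model.wv.most_similar("king", topn=2))   # nearest neighbors by cosine similarity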
2. Learning Process
Both architectures use a shallow neural network (usually just one hidden layer) to learn the weights, which eventually become the "embeddings" or vector representations. During training, Word2Vec uses:
- A sliding window that moves across the text.
- Within each window, the current (middle) word and its surrounding words are paired: the surrounding words are the inputs used to predict the middle word (CBOW), or the outputs predicted by the middle word (skip-gram), as sketched after this list.
- The objective is to adjust the vector representations to maximize the likelihood of the actual words in the context.
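A small sketch of the sliding window may help; everything here (the token list, the window size, and the training_pairs helper) is a hypothetical illustration rather than Word2Vec's actual implementation.

    # Hypothetical illustration of how a sliding window yields training pairs;
    # not Word2Vec's actual code.
    def training_pairs(tokens, window=2):
        cbow, skipgram = [], []
        for i, target in enumerate(tokens):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            context = [tokens[j] for j in range(lo, hi) if j != i]
            cbow.append((context, target))                  # CBOW: context -> middle word
            skipgram.extend((target, c) for c in context)   # skip-gram: middle word -> each context word
        return cbow, skipgram

    cbow_pairs, skipgram_pairs = training_pairs(["the", "king", "rules", "the", "kingdom"])
    print(cbow_pairs[1])       # (['the', 'rules', 'the'], 'king')
    print(skipgram_pairs[:3])  # [('the', 'king'), ('the', 'rules'), ('king', 'the')]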
3. Vector Operations
Once trained, each word in the vocabulary is associated with a vector. Word2Vec allows performing interesting algebraic operations on these vectors, such as:
- Similarity: Cosine similarity between vectors is typically used to find words with similar meanings, though this page uses squared Euclidean distance because it is less computationally intensive; the two measures are compared in the sketch below.
- Analogy: For example, "king + queen = princess" shows how relationships and analogies can be captured by vector arithmetic.
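The difference between the two similarity measures can be made concrete; the vectors below are made-up placeholders, not embeddings from an actual model.

    # Illustrative comparison of the two measures; the vectors are placeholders.
    import numpy as np

    def cosine_similarity(a, b):
        # Higher is more similar; requires a dot product plus two vector norms.
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    def squared_euclidean_distance(a, b):
        # Lower is more similar; skips the square root and the norms,
        # which makes it cheaper when ranking many candidate words.
        diff = a - b
        return np.dot(diff, diff)

    king = np.array([0.80, 0.30, 0.10])
    queen = np.array([0.70, 0.40, 0.20])
    print(cosine_similarity(king, queen))
    print(squared_euclidean_distance(king, queen))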
In short, word embeddings learned by Word2Vec can be manipulated algebraically to discover semantic relationships between words.
Continuous Bag of Words
Add and subtract words, and find the nearest word.
This feature is deprecated.
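Since the interactive widget is no longer available, here is a rough sketch of what such an add/subtract and nearest-word lookup could look like; the embeddings table, its vectors, and the word_math helper are hypothetical stand-ins, not the model behind this page.

    # Hypothetical sketch of add/subtract word math with a nearest-word lookup;
    # the tiny embeddings table is a stand-in for a trained model's vocabulary.
    import numpy as np

    embeddings = {
        "king":     np.array([0.80, 0.30, 0.10]),
        "queen":    np.array([0.70, 0.40, 0.20]),
        "princess": np.array([0.65, 0.45, 0.35]),
        "car":      np.array([0.05, 0.90, 0.70]),
    }

    def word_math(positive, negative=()):
        # Add the "positive" vectors, subtract the "negative" ones, then return the
        # vocabulary word nearest to the result by squared Euclidean distance.
        query = np.zeros(3)  # vectors here are 3-dimensional for illustration
        for w in positive:
            query = query + embeddings[w]
        for w in negative:
            query = query - embeddings[w]
        exclude = set(positive) | set(negative)
        best, best_dist = None, float("inf")
        for word, vec in embeddings.items():
            if word in exclude:
                continue
            diff = query - vec
            dist = np.dot(diff, diff)
            if dist < best_dist:
                best, best_dist = word, dist
        return best

    print(word_math(["king", "queen"]))  # may print "princess" with these toy vectors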