Word Embedding
NLP & Text
Representing words as numerical vectors
What is Word Embedding?
Converts each word into a dense numerical vector so that words with similar meanings map to nearby vectors, capturing semantic relationships in a form machine-learning models can use.
Real-World Examples
- Word2Vec
- GloVe
- BERT embeddings
- Vector arithmetic: king - man + woman ≈ queen
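The analogy in the last bullet can be sketched with toy embeddings and cosine similarity. The 4-dimensional vectors below are hypothetical values chosen for illustration; real models such as Word2Vec learn vectors of 100-300 dimensions from large text corpora.

```python
import numpy as np

# Hypothetical toy embeddings (hand-picked for illustration only).
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1, 0.7]),
    "queen": np.array([0.9, 0.1, 0.8, 0.7]),
    "man":   np.array([0.1, 0.9, 0.0, 0.2]),
    "woman": np.array([0.1, 0.1, 0.9, 0.2]),
}

def cosine_similarity(a, b):
    """Similarity of two vectors by the angle between them (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Vector arithmetic: king - man + woman should land closest to queen.
result = embeddings["king"] - embeddings["man"] + embeddings["woman"]
closest = max(embeddings, key=lambda w: cosine_similarity(result, embeddings[w]))
print(closest)  # → queen
```

With real pretrained embeddings the same arithmetic works over a vocabulary of hundreds of thousands of words, which is what makes the king/queen analogy famous.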
When to Use This
Use word embeddings whenever text must be converted into numerical input for machine-learning models, such as classification, clustering, or semantic search.