BERT (Bidirectional Encoder Representations from Transformers)
Language model that understands context from both directions
What is BERT (Bidirectional Encoder Representations from Transformers)?
Unlike GPT, which processes text left-to-right, BERT attends to context on both sides of every token at once. This bidirectional view makes it excel at understanding and classification tasks rather than text generation.
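BERT's bidirectionality comes from its pre-training objective: predicting masked-out words from the surrounding context. A minimal sketch of this "fill in the blank" behavior, assuming the Hugging Face transformers library is installed:

```python
# Masked language modeling: BERT predicts a hidden token using
# context on BOTH sides of the blank.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Words before AND after [MASK] shape the prediction.
for result in fill_mask("The bank by the [MASK] was flooded after the storm."):
    print(f"{result['token_str']:>10}  score={result['score']:.3f}")
```

Here "flooded" and "storm" (to the right of the blank) pull the predictions toward words like "river", something a purely left-to-right model cannot use at that position.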
Real-World Examples
- Search engines understanding queries
- Question answering
- Sentiment analysis (see the sketch after this list)
- Named entity recognition
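Of the tasks above, sentiment analysis is the easiest to show end-to-end. A hedged sketch using the transformers pipeline; the checkpoint name is one common public example of a BERT model fine-tuned for sentiment, not the only option:

```python
# Sentiment analysis with a fine-tuned BERT-family classifier.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="nlptown/bert-base-multilingual-uncased-sentiment",  # example checkpoint
)
print(classifier("The search results were surprisingly relevant."))
# e.g. [{'label': '4 stars', 'score': 0.42}]
```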
When to Use This
Use BERT when you need to understand, classify, or extract information from text rather than generate new text.
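In practice, "understanding" often means using BERT as an encoder: turning text into a fixed-size vector for downstream classification or search. A minimal sketch, assuming transformers and torch are installed:

```python
# Using BERT as a text encoder (understanding, not generation).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT encodes text into vectors.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The [CLS] token's hidden state is a common (if rough) sentence embedding.
cls_embedding = outputs.last_hidden_state[:, 0]
print(cls_embedding.shape)  # torch.Size([1, 768])
```

The resulting vector can feed a lightweight classifier or a vector search index; for higher-quality embeddings, models fine-tuned specifically for sentence similarity are usually preferred.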