Attention Mechanism
Neural Networks
Focusing on relevant parts of input
What is an Attention Mechanism?
Allows a model to focus on the most relevant parts of its input when making predictions, much as humans focus their attention on important details. In practice, the model computes a weight for each input element and combines the elements according to those weights, so elements with higher weights influence the output more.
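A minimal sketch of the common scaled dot-product form, written in NumPy; the function name and the shapes are illustrative, not taken from any particular library:

```python
import numpy as np

def scaled_dot_product_attention(query, key, value):
    """Return attention output and weights for one sequence.

    query: (n_queries, d_k), key: (n_keys, d_k), value: (n_keys, d_v)
    """
    d_k = query.shape[-1]
    # Similarity between each query and each key, scaled to keep the softmax stable.
    scores = query @ key.T / np.sqrt(d_k)
    # Softmax over the keys: each row sums to 1 and acts as a "focus" distribution.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # The output is a weighted average of the values.
    return weights @ value, weights
```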
Real-World Examples
- Highlighting which source words matter for each translated word (see the toy example after this list)
- Focusing on relevant regions of an image
- Attention inside transformers
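As a toy illustration of the translation case, the weights returned by the sketch above show which source token each target position attends to. This snippet continues from the previous one (the vectors are random, so the alignment is only illustrative):

```python
# Toy example: which source tokens does each target position attend to?
rng = np.random.default_rng(0)
src_tokens = ["the", "cat", "sat"]
d = 8
keys = values = rng.normal(size=(len(src_tokens), d))   # one vector per source token
queries = rng.normal(size=(2, d))                        # two target positions

_, weights = scaled_dot_product_attention(queries, keys, values)
for i, row in enumerate(weights):
    focus = src_tokens[int(row.argmax())]
    print(f"target position {i} attends most to '{focus}' (weights: {np.round(row, 2)})")
```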
When to Use This
Attention is a core component of modern transformers and of encoder-decoder (seq2seq) models such as neural machine translation systems.
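For instance, self-attention inside a transformer layer can be sketched with PyTorch's `nn.MultiheadAttention`; the dimensions below are arbitrary:

```python
import torch
from torch import nn

# Self-attention as used inside a transformer encoder layer:
# queries, keys, and values all come from the same sequence.
mha = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
x = torch.randn(2, 10, 64)        # (batch, sequence length, embedding dim)
out, attn_weights = mha(x, x, x)  # self-attention over the sequence
print(out.shape)                  # torch.Size([2, 10, 64])
print(attn_weights.shape)         # torch.Size([2, 10, 10]), averaged over heads
```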