Learning Rate
AI/ML Fundamentals
Step size for parameter updates during training
What is Learning Rate?
Controls how much the model's weights are adjusted in the direction of the gradient at each training iteration. Too high: updates overshoot and training becomes unstable or diverges. Too low: training is slow and can stall before reaching a good solution.
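A minimal sketch of the update rule in plain Python, using a simple quadratic loss as a stand-in for a real model (the specific function and values are illustrative, not from the original):

```python
# Minimal gradient descent sketch: minimize f(w) = (w - 3)^2.
# The learning rate scales the gradient before it is subtracted
# from the current weight.

def grad(w):
    # Derivative of (w - 3)^2 with respect to w.
    return 2 * (w - 3)

learning_rate = 0.1   # try 1.1 (diverges) or 0.0001 (crawls)
w = 0.0               # initial weight

for step in range(50):
    w -= learning_rate * grad(w)

print(w)  # close to 3.0 with a well-chosen learning rate
```

Rerunning with a learning rate of 1.1 makes each step overshoot further than the last, while 0.0001 barely moves the weight in 50 steps, which is the high/low trade-off described above.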
Real-World Examples
- Common values: 0.001, 0.01, 0.0001
- Adaptive learning rates (e.g., the Adam optimizer); see the sketch after this list
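A hedged sketch of how these values are typically passed to an optimizer, assuming PyTorch is installed and using a small linear layer as a hypothetical model:

```python
import torch

# Hypothetical tiny model; any torch.nn.Module works the same way.
model = torch.nn.Linear(10, 1)

# Plain SGD with a fixed learning rate of 0.01.
sgd = torch.optim.SGD(model.parameters(), lr=0.01)

# Adam adapts per-parameter step sizes, but still takes a base
# learning rate; 0.001 (1e-3) is its commonly used default.
adam = torch.optim.Adam(model.parameters(), lr=0.001)
```

Even with adaptive methods like Adam, the base learning rate still matters: it sets the overall scale that the per-parameter adaptation works within.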
When to Use This
Critical hyperparameter for all gradient-based training; it typically must be tuned for each model and dataset.
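Because a single fixed value rarely suits an entire run, the learning rate is often decayed over time. A minimal sketch, again assuming PyTorch and reusing a simple linear model as a stand-in:

```python
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Step decay: multiply the learning rate by 0.1 every 30 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(90):
    # ... run training batches and optimizer.step() here ...
    scheduler.step()  # update the learning rate for the next epoch
```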