Gradient Descent
AI/ML Fundamentals
Optimization algorithm minimizing loss
What is Gradient Descent?
Gradient descent iteratively adjusts model parameters in the direction of steepest descent (the negative gradient of the loss) to minimize a loss function. Each step applies the update rule θ ← θ − η∇L(θ), where η is the learning rate controlling the step size.
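The update rule above can be sketched in a few lines. This is a minimal illustration (the function, learning rate, and step count are chosen for the example, not taken from any library): minimizing f(θ) = (θ − 3)², whose gradient is 2(θ − 3) and whose minimum sits at θ = 3.

```python
def grad(theta):
    # Gradient of the example loss f(theta) = (theta - 3)^2.
    return 2.0 * (theta - 3.0)

theta = 0.0   # arbitrary starting point
lr = 0.1      # learning rate (eta): step size per update
for _ in range(100):
    theta -= lr * grad(theta)   # step opposite the gradient

# theta has converged very close to the minimizer, 3.0
print(theta)
```

Because the loss is convex, each step shrinks the distance to the minimum by a constant factor here; in real models the loss surface is non-convex and the learning rate must be tuned more carefully.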
Common Variants
- Batch gradient descent — computes the gradient over the entire dataset each step
- Stochastic gradient descent (SGD) — uses a single example per step
- Mini-batch gradient descent — uses a small random subset per step
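The three variants differ only in how many examples feed each gradient estimate. A hedged sketch, fitting w in y ≈ w·x by least squares (the data, helper names, and hyperparameters here are illustrative assumptions, not a standard API):

```python
import random

random.seed(0)
# Synthetic data with true weight w = 2 and no noise.
xs = [random.uniform(-1, 1) for _ in range(200)]
ys = [2.0 * x for x in xs]
data = list(zip(xs, ys))

def grad(w, batch):
    # Gradient of mean squared error over the given batch.
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def train(batch_size, steps=300, lr=0.5):
    w = 0.0
    for _ in range(steps):
        batch = random.sample(data, batch_size)  # sampled subset
        w -= lr * grad(w, batch)                 # gradient step
    return w

w_batch = train(batch_size=len(xs))  # batch GD: full dataset each step
w_sgd   = train(batch_size=1)        # SGD: one example per step
w_mini  = train(batch_size=32)       # mini-batch: small random subset
```

All three converge to w ≈ 2 on this simple problem; the trade-off in practice is that larger batches give smoother, lower-variance gradient estimates, while smaller batches are cheaper per step and add noise that can help escape poor local minima.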
When to Use This
Gradient descent is the fundamental optimization algorithm for training machine learning models: use it (or one of its variants) whenever the model's parameters and loss function are differentiable, from linear regression to deep neural networks.
Related Terms
Learn more about concepts related to Gradient Descent:
- Learning rate
- Loss function
- Backpropagation