Batch Size
AI/ML Fundamentals
The number of training samples processed before the model's weights are updated
What is Batch Size?
Batch size is the number of training samples the model processes before each weight update. Larger batches produce more stable gradient estimates but require more memory; smaller batches update the weights more frequently but with noisier gradients.
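A minimal sketch, assuming PyTorch (the toy data and linear model are hypothetical), showing where batch size enters a training loop: the DataLoader groups samples into batches, and the optimizer performs one weight update per batch.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy data: 256 samples, 10 features each.
X = torch.randn(256, 10)
y = torch.randn(256, 1)
dataset = TensorDataset(X, y)

model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# batch_size=32: each gradient is averaged over 32 samples.
loader = DataLoader(dataset, batch_size=32, shuffle=True)
for xb, yb in loader:
    optimizer.zero_grad()
    loss = loss_fn(model(xb), yb)
    loss.backward()    # gradient computed from this batch only
    optimizer.step()   # one weight update per batch
```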
Real-World Examples
- Batch size 32 is a common default for most neural networks
- Batch size 1 corresponds to pure stochastic gradient descent; the sketch below shows how batch size sets the number of updates per epoch
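A quick back-of-the-envelope sketch (the dataset size of 50,000 is hypothetical) of how batch size determines the number of weight updates per epoch:

```python
import math

N = 50_000  # hypothetical dataset size
for batch_size in (1, 32, 256):
    updates = math.ceil(N / batch_size)
    print(f"batch_size={batch_size:>4} -> {updates} updates per epoch")

# batch_size=   1 -> 50000 updates per epoch  (stochastic gradient descent)
# batch_size=  32 -> 1563 updates per epoch
# batch_size= 256 -> 196 updates per epoch
```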
When to Use This
Batch size is one of the most important hyperparameters to tune: it directly affects training speed (updates per epoch and hardware utilization), memory consumption, and convergence behavior. It is often tuned together with the learning rate, as sketched below.
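One widely used heuristic, offered here as an assumption rather than something stated above, is linear learning-rate scaling: when the batch size is multiplied by k, multiply the learning rate by k as well to keep convergence behavior roughly comparable. A sketch with hypothetical reference values:

```python
# Hypothetical reference point: lr 0.1 at batch size 256.
base_lr, base_batch = 0.1, 256

def scaled_lr(batch_size: int) -> float:
    """Scale the learning rate linearly with batch size."""
    return base_lr * batch_size / base_batch

for bs in (32, 256, 1024):
    print(f"batch_size={bs:>5} -> lr={scaled_lr(bs):.4f}")
```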