Batch Normalization

Neural Networks

Normalizing layer inputs during training

What is Batch Normalization?

Normalizes each layer's inputs using mini-batch statistics: each feature is shifted and scaled to zero mean and unit variance, then passed through a learnable scale (gamma) and shift (beta). This stabilizes and accelerates training.
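The normalization step above can be sketched in a few lines of NumPy. This is a minimal forward pass only (no running statistics for inference, no backward pass); the function name and shapes are illustrative, not from the source:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch-normalize a (batch, features) array during training."""
    # Per-feature mean and variance computed over the mini-batch (axis 0)
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Normalize to zero mean and unit variance; eps guards against
    # division by zero when a feature has near-zero variance
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable scale and shift restore representational capacity
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(32, 4))  # batch of 32, 4 features
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0))  # close to 0 for every feature
print(y.std(axis=0))   # close to 1 for every feature
```

With gamma = 1 and beta = 0 the output is purely normalized; in a real network both are trained parameters, so the layer can undo the normalization if that helps optimization.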

Real-World Examples

  • Standard in modern CNN architectures such as ResNet
  • Speeds up training convergence
  • Allows higher learning rates without divergence

When to Use This

Effectively standard in deep networks whenever faster, more stable training is the goal
