
Dropout

Neural Networks

Regularization technique that randomly ignores neurons during training

What is Dropout?

During training, dropout randomly "drops" (zeroes out) each neuron's activation with probability p, so the network cannot rely on any single neuron and co-adaptation between neurons is reduced. This forces the network to learn redundant, robust features and helps prevent overfitting. At inference time no neurons are dropped; in the common inverted-dropout formulation, surviving activations are scaled by 1/(1 - p) during training so no rescaling is needed at test time.
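
Below is a minimal sketch of this mechanism as inverted dropout in NumPy. The function name, signature, and the p = 0.5 default are illustrative choices, not taken from any particular library.

  import numpy as np

  def dropout(x, p=0.5, training=True, rng=None):
      # Inverted dropout: during training, zero each activation with
      # probability p and scale the survivors by 1/(1 - p) so the
      # expected value is unchanged; at inference, pass x through as-is.
      if not training or p == 0.0:
          return x
      rng = np.random.default_rng() if rng is None else rng
      mask = rng.random(x.shape) >= p   # keep each unit with probability 1 - p
      return x * mask / (1.0 - p)

  x = np.ones((2, 4))
  print(dropout(x, p=0.5))   # roughly half the entries are 0, survivors become 2.0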

Real-World Examples

  • Applying 50% dropout to large fully connected (dense) layers (see the sketch after this list)
  • Reducing overfitting in convolutional networks, typically at lower rates (e.g. 0.1 to 0.25)
  • Regularizing large networks trained on limited data
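
As an illustration of where such rates typically sit, here is a small PyTorch model; PyTorch is an assumed framework, and the layer sizes, the 3-channel 32x32 input they imply, the class count, and the specific rates are placeholders chosen for the example.

  import torch.nn as nn

  # Hypothetical architecture: sizes assume 3-channel 32x32 inputs and 10 classes.
  model = nn.Sequential(
      nn.Conv2d(3, 32, kernel_size=3, padding=1),
      nn.ReLU(),
      nn.Dropout2d(p=0.1),            # light dropout on conv feature maps
      nn.Flatten(),
      nn.Linear(32 * 32 * 32, 128),
      nn.ReLU(),
      nn.Dropout(p=0.5),              # 50% dropout in the dense layer
      nn.Linear(128, 10),
  )

  model.train()   # dropout layers are active
  model.eval()    # dropout layers are disabled for evaluation/inference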

When to Use This

A standard regularizer when a neural network overfits its training data, especially in large fully connected layers or when training data is limited. Typical drop probabilities range from 0.2 to 0.5, and dropout is disabled at inference.