ReLU (Rectified Linear Unit)

Neural Networks

Activation function: max(0, x)

What is ReLU (Rectified Linear Unit)?

The most widely used activation function in deep learning. It outputs the input unchanged when positive and 0 otherwise: f(x) = max(0, x). It is simple and cheap to compute, and because its gradient is exactly 1 for positive inputs, it helps mitigate the vanishing gradient problem that affects sigmoid and tanh.
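A minimal NumPy sketch of ReLU and its gradient (the function names here are illustrative, not from any particular library):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): passes positive values through, zeroes out the rest.
    return np.maximum(0, x)

def relu_grad(x):
    # Gradient is 1 where x > 0 and 0 elsewhere
    # (the value at exactly x = 0 is conventionally taken as 0).
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```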

Real-World Examples

  • Default activation in CNNs (see the convolutional block sketch after this list)
  • Hidden layers in feed-forward neural networks
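For instance, a typical convolutional block applies ReLU right after each convolution. A sketch in PyTorch, where the channel counts and kernel size are illustrative assumptions rather than canonical values:

```python
import torch.nn as nn

# A common CNN building block: convolution -> ReLU -> pooling.
# Channel counts (3 -> 16) and kernel size are placeholders.
conv_block = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2),
)
```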

When to Use This

Default choice for hidden layers in most neural networks
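As a concrete example, here is a minimal PyTorch sketch of a small network with ReLU after each hidden layer (the layer sizes 784 -> 128 -> 64 -> 10 are arbitrary placeholders):

```python
import torch.nn as nn

# Feed-forward network: ReLU follows each hidden layer.
model = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),  # output layer: no ReLU; pair with a task-specific loss
)
```

Note that the output layer is left linear: a task-specific activation or loss (e.g., softmax with cross-entropy for classification) typically takes over there.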