Rectifier (neural networks)
In the context of artificial neural networks, the rectifier or ReLU (rectified linear unit) activation function is defined as the non-negative part of its argument, i.e., the ramp function:

$\operatorname{ReLU}(x) = x^+ = \max(0, x) = \frac{x + |x|}{2} = \begin{cases} x & \text{if } x > 0, \\ 0 & \text{otherwise,} \end{cases}$

where $x$ is the input to a neuron. This is analogous to half-wave rectification in electrical engineering.
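As a concrete illustration of the definition above, here is a minimal NumPy sketch of the function and its derivative; the names `relu` and `relu_grad` are illustrative, not from any particular library, and the derivative is conventionally taken to be 0 at $x = 0$:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: elementwise max(0, x)."""
    return np.maximum(0.0, x)

def relu_grad(x):
    """Derivative of ReLU: 1 for x > 0, 0 for x < 0.
    At x = 0 the function is not differentiable; 0 is used here by convention."""
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```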
ReLU is one of the most popular activation functions for artificial neural networks, and finds application in computer vision and speech recognition using deep neural nets, as well as in computational neuroscience.