ReLU

Entity Type: Glossary ID: relu

Definition: Rectified Linear Unit, an activation function defined as f(x) = max(0, x): it outputs the input directly if positive and zero otherwise. ReLU is widely used in deep learning due to its computational efficiency and its ability to mitigate the vanishing gradient problem.
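The definition above can be sketched in a few lines; this is a minimal NumPy illustration (not any particular framework's implementation), including the gradient that makes ReLU attractive for backpropagation:

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x): passes positive inputs through, zeroes out the rest
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is 1 for positive inputs and 0 otherwise
    # (undefined at exactly 0; 0 is used here by convention)
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
y = relu(x)        # negative entries become 0; positive entries pass through
g = relu_grad(x)   # 0 where x <= 0, 1 where x > 0
```

Because the gradient is exactly 1 for positive inputs, gradients do not shrink as they flow backward through many layers, which is why ReLU mitigates the vanishing gradient problem that saturating activations such as sigmoid suffer from.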

Related Terms:
- activation-function
- leaky-relu
- vanishing-gradient
- neural-network

Source URLs:
- https://en.wikipedia.org/wiki/Rectifier_(neural_networks)

Tags:
- neural-networks
- activation-functions
- deep-learning

Status: active

Version: 1.0.0

Created At: 2025-08-31

Last Updated: 2025-08-31