Rectified Linear Unit. The most widely used activation function in deep learning. It outputs zero for negative inputs and passes positive inputs through unchanged. Simple, fast to compute, and surprisingly effective. Variants like Leaky ReLU and GELU address its tendency to 'die': a neuron whose inputs stay negative always outputs zero and receives zero gradient, so it stops learning.
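In code, the idea is nearly a one-liner. Below is a minimal NumPy sketch of ReLU alongside a Leaky ReLU variant; the 0.01 negative slope is just a common default, not part of the definition.

```python
import numpy as np

def relu(x):
    # Zero for negative inputs, identity for positive inputs.
    return np.maximum(0.0, x)

def leaky_relu(x, negative_slope=0.01):
    # Lets a small gradient through for negative inputs,
    # which helps avoid "dead" neurons that always output zero.
    return np.where(x > 0, x, negative_slope * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.  0.  0.  1.5]
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5 ]
```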
A mathematical function applied to a neuron's output that introduces non-linearity into the network; without one, any stack of layers would collapse into a single linear transformation.
A computing system loosely inspired by biological brains, consisting of interconnected nodes (neurons) organized in layers.
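As a rough illustration of the two entries above, here is a toy two-layer forward pass in NumPy. The layer sizes, random weights, and input values are arbitrary placeholders; the ReLU between the layers is the non-linearity that keeps the stack from reducing to a single linear map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 4 inputs -> 8 hidden units -> 2 outputs.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

def forward(x):
    hidden = np.maximum(0.0, x @ W1 + b1)  # linear step, then ReLU activation
    return hidden @ W2 + b2                # output layer

print(forward(np.array([0.5, -1.0, 2.0, 0.1])))
```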
Gaussian Error Linear Unit. A smooth activation function that weights each input by the standard Gaussian cumulative distribution function: GELU(x) = x · Φ(x). It behaves like ReLU for large positive inputs but curves gently through zero instead of kinking, and it is the default activation in many Transformer models such as BERT and GPT-2.
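A small sketch of the exact form and the widely used tanh approximation, using only the Python standard library:

```python
import math

def gelu(x):
    # Exact form: x times the standard normal CDF Phi(x).
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # Tanh approximation used by some frameworks for speed.
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"{x:+.1f}  exact={gelu(x):+.4f}  approx={gelu_tanh(x):+.4f}")
```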
Adaptive Moment Estimation. An optimization algorithm that combines ideas from two earlier methods, AdaGrad and RMSProp: it keeps exponential moving averages of both the gradient and the squared gradient, corrects their early-step bias, and uses them to give each parameter its own effective step size.
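A minimal single-parameter sketch of the update rule; the hyperparameter defaults here are the commonly cited ones, and the toy quadratic objective is only there to show the moving averages and bias correction in action.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Exponential moving averages of the gradient and the squared gradient.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    # Bias correction compensates for the zero-initialized averages.
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    # Per-parameter step: large where squared gradients have been small.
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta**2, whose gradient is 2 * theta.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 501):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.1)
print(theta)  # theta has moved close to the minimum at 0
```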
Artificial General Intelligence.
The research field focused on making sure AI systems do what humans actually want them to do.