ReLU

Rectified Linear Unit.

Definition

Rectified Linear Unit. The most widely used activation function in deep learning, defined as f(x) = max(0, x): it outputs zero for negative inputs and passes positive inputs through unchanged. It is simple, fast to compute, and works surprisingly well in practice. Its main weakness is the "dying ReLU" problem: because both the output and the gradient are zero for negative inputs, a neuron whose inputs are always negative stops learning entirely. Variants like Leaky ReLU and smooth alternatives like GELU address this.
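
For concreteness, here is a minimal NumPy sketch of ReLU, the Leaky ReLU variant, and the common tanh approximation of GELU (the 0.01 slope in Leaky ReLU is a conventional default, not part of the definition):

```python
import numpy as np

def relu(x):
    # ReLU: zero for negative inputs, identity for positive inputs.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: a small slope (alpha) on the negative side keeps the
    # gradient nonzero there, mitigating "dying" neurons.
    return np.where(x > 0, x, alpha * x)

def gelu(x):
    # GELU via the widely used tanh approximation: a smooth curve that
    # blends between 0 and the identity instead of a hard cutoff at 0.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.  0.  0.  1.5]
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5  ]
print(gelu(x))        # roughly [-0.0454 -0.1543  0.      1.3996]
```

Note how Leaky ReLU and GELU both return small negative values for negative inputs instead of flat zero, which is exactly what keeps gradients flowing.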

Related Terms

Activation Function

A mathematical function applied to a neuron's output that introduces non-linearity into the network.

Neural Network

A computing system loosely inspired by biological brains, consisting of interconnected nodes (neurons) organized in layers.

GELU

Gaussian Error Linear Unit.

Adam Optimizer

An optimization algorithm that combines the best parts of two other methods — AdaGrad and RMSProp.

AGI

Artificial General Intelligence.

AI Alignment

The research field focused on making sure AI systems do what humans actually want them to do.
