A mathematical function applied to a neuron's output that introduces non-linearity into the network. Common examples include ReLU, sigmoid, and tanh. Without activation functions, neural networks would only be able to learn linear relationships, no matter how many layers they have.
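To make this concrete, here is a minimal NumPy sketch (our own illustration, with sample values chosen arbitrarily) of the three activations named above, applied element-wise to a neuron's raw pre-activation outputs:

```python
import numpy as np

def relu(x):
    # Passes positive values through unchanged, zeroes out negatives.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any real number into the range (-1, 1).
    return np.tanh(x)

# Some raw (pre-activation) neuron outputs:
z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(z))     # approx. [0.    0.    0.    0.5   2.   ]
print(sigmoid(z))  # approx. [0.119 0.378 0.5   0.622 0.881]
print(tanh(z))     # approx. [-0.964 -0.462 0.    0.462 0.964]
```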
Rectified Linear Unit: an activation function that outputs its input when positive and zero otherwise, i.e. f(x) = max(0, x). Its simplicity and resistance to vanishing gradients have made it a default choice in many deep networks.
A computing system loosely inspired by biological brains, consisting of interconnected nodes (neurons) organized in layers.
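As a rough illustration of "interconnected nodes organized in layers," the sketch below runs a forward pass through a tiny two-layer network in NumPy; the layer sizes, random weights, and function names are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny network: 3 inputs -> 4 hidden neurons -> 1 output.
# Each connection between layers carries a learnable weight.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def forward(x):
    # Each layer computes a weighted sum of the previous layer's
    # outputs, then applies a non-linear activation.
    h = np.maximum(0.0, W1 @ x + b1)  # hidden layer with ReLU
    return W2 @ h + b2                # linear output layer

print(forward(np.array([1.0, -0.5, 2.0])))
```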
A subset of machine learning that uses neural networks with many layers (hence 'deep') to learn complex patterns from large amounts of data.
An optimization algorithm (short for Adaptive Moment Estimation) that combines the strengths of two earlier methods, AdaGrad and RMSProp, by keeping running estimates of both the mean and the variance of the gradients.
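The update rule itself is compact. Below is an illustrative NumPy sketch of a single Adam step using the commonly cited default hyperparameters; the function name and variable names are ours, not from any particular library:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. theta: parameters, grad: gradient at theta,
    m/v: running first/second moment estimates, t: step count (1-based)."""
    m = beta1 * m + (1 - beta1) * grad      # moving average of gradients
    v = beta2 * v + (1 - beta2) * grad**2   # moving average of squared gradients
    m_hat = m / (1 - beta1**t)              # bias correction: the moments start
    v_hat = v / (1 - beta2**t)              # at zero, so early steps are rescaled
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Example usage: one step on a single parameter with gradient 0.5.
theta, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
theta, m, v = adam_step(theta, grad=np.array([0.5]), m=m, v=v, t=1)
print(theta)
```

Dividing by the square root of the second-moment estimate gives each parameter its own effective learning rate, which is the RMSProp-style idea the definition refers to.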
Artificial General Intelligence: a hypothetical AI system that could match or exceed human performance across a wide range of cognitive tasks, rather than excelling only at narrow, specialized ones.
The research field focused on making sure AI systems do what humans actually want them to do.