Adam (Adaptive Moment Estimation): An optimization algorithm that combines the strengths of two earlier methods, AdaGrad and RMSProp. It adapts the learning rate for each parameter individually, making it one of the most popular choices for training deep neural networks, and it works well out of the box with minimal tuning.
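To make the update concrete, here is a minimal sketch of a single Adam step in Python with NumPy. The function name adam_step is illustrative rather than a library API, and the hyperparameter defaults follow the values suggested in the original Adam paper.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a parameter array w, given its gradient."""
    m = beta1 * m + (1 - beta1) * grad        # running mean of gradients (1st moment)
    v = beta2 * v + (1 - beta2) * grad ** 2   # running mean of squared gradients (2nd moment)
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter adaptive step
    return w, m, v
```

The division by sqrt(v_hat) is what makes the step adaptive: parameters with consistently large gradients take smaller steps, and vice versa.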
Gradient descent: The fundamental optimization algorithm used to train neural networks. It repeatedly nudges the model's weights in the direction that most decreases the loss, that is, opposite to the gradient.
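As a rough illustration, the loop below fits a one-parameter linear model with plain gradient descent; the toy data and variable names are invented for the example.

```python
import numpy as np

# Fit y = w * x by gradient descent on mean squared error.
x = np.array([1.0, 2.0, 3.0])
y = 2.0 * x          # so the true weight is 2.0

w = 0.0
lr = 0.05            # learning rate (see the next entry)
for step in range(100):
    grad = 2 * np.mean((w * x - y) * x)  # d/dw of mean((w*x - y)^2)
    w -= lr * grad                       # step against the gradient
print(w)  # converges to ~2.0
```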
Learning rate: A hyperparameter that controls how much the model's weights change in response to each update. Set it too low and training crawls; set it too high and the updates overshoot, and training can diverge.
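Rerunning the toy problem above with three arbitrary learning rates, picked to illustrate too small, about right, and too large, shows how sensitive training is to this single value.

```python
import numpy as np

# Same toy problem: fit y = w * x, where the true weight is 2.0.
x = np.array([1.0, 2.0, 3.0])
y = 2.0 * x

for lr in (0.001, 0.05, 0.5):
    w = 0.0
    for _ in range(50):
        grad = 2 * np.mean((w * x - y) * x)
        w -= lr * grad
    print(f"lr={lr}: w={w:.3g}")
# lr=0.001 -> w ~ 0.75     (too small: barely moved in 50 steps)
# lr=0.05  -> w ~ 2.0      (about right: converged)
# lr=0.5   -> |w| enormous (too large: every step overshoots, so w diverges)
```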
Backpropagation: The algorithm that makes neural network training possible. It applies the chain rule to compute the gradient of the loss with respect to every weight, working backward from the output layer toward the input; gradient descent then uses those gradients to update the weights.
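Here is a hand-worked sketch of the idea on a tiny two-layer network; the numbers and variable names are invented for illustration, and deep learning frameworks automate exactly these chain-rule steps.

```python
# Tiny network: y_hat = w2 * relu(w1 * x), squared-error loss.
x, y = 1.5, 3.0
w1, w2 = 0.8, 1.2

# Forward pass, keeping intermediates for the backward pass.
h = w1 * x                 # pre-activation
a = max(h, 0.0)            # ReLU
y_hat = w2 * a
loss = (y_hat - y) ** 2

# Backward pass: chain rule applied layer by layer, output to input.
d_yhat = 2 * (y_hat - y)               # dloss/dy_hat
d_w2 = d_yhat * a                      # because y_hat = w2 * a
d_a = d_yhat * w2
d_h = d_a * (1.0 if h > 0 else 0.0)    # ReLU passes gradient only where h > 0
d_w1 = d_h * x                         # because h = w1 * x

print(d_w1, d_w2)  # the gradients a gradient-descent step would use
```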
Activation function: A mathematical function applied to a neuron's output that introduces non-linearity into the network. Without one, any stack of layers would collapse into a single linear transformation.
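Two of the most common choices, sketched in NumPy; the sample inputs are arbitrary.

```python
import numpy as np

def relu(z):
    """ReLU: passes positive values through, zeroes out negatives."""
    return np.maximum(0.0, z)

def sigmoid(z):
    """Sigmoid: squashes any real input into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu(z))     # [0.  0.  0.  1.  3.]
print(sigmoid(z))  # values strictly between 0 and 1
```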
AGI (Artificial General Intelligence): A hypothetical AI system able to learn and perform well across a broad range of tasks at or beyond human level, rather than excelling only in one narrow domain.
AI alignment: The research field focused on making sure AI systems do what humans actually want them to do, pursuing their designers' intended goals rather than unintended proxies.