Inference: Running a trained model to make predictions on new data. Distinct from training, the phase in which the model learns. Inference must be fast and cheap for real-world deployment; over a model's lifetime, companies often spend more on inference infrastructure than on training itself.
Training: The process of teaching an AI model by exposing it to data and adjusting its parameters to minimize error.
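A minimal sketch of that loop, using a hypothetical one-parameter toy model fit by gradient descent (the data, learning rate, and epoch count are illustrative assumptions, not from any real system):

```python
# Toy training loop: repeatedly adjust a parameter to shrink the error.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, target) pairs, target = 2 * input
w = 0.0    # the model's single parameter
lr = 0.05  # learning rate

for epoch in range(200):
    for x, y in data:
        pred = w * x                # model's prediction
        grad = 2 * (pred - y) * x   # gradient of squared error w.r.t. w
        w -= lr * grad              # nudge the parameter to reduce error

print(round(w, 3))  # converges near 2.0, the slope hidden in the data
```

Real training does the same thing with millions or billions of parameters and gradients computed by backpropagation.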
Activation function: A mathematical function applied to a neuron's output that introduces non-linearity into the network, letting it learn patterns a purely linear model cannot. Common examples include ReLU, sigmoid, and tanh.
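For illustration, two widely used activation functions written out directly:

```python
import math

def relu(x):
    # ReLU: zero for negative inputs, identity for positive ones
    return max(0.0, x)

def sigmoid(x):
    # Sigmoid: squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

print(relu(-1.5), relu(2.0))   # 0.0 2.0
print(sigmoid(0.0))            # 0.5
```

Without a non-linearity like these between layers, a stack of layers collapses into a single linear transformation.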
Adam: An optimization algorithm that combines ideas from two earlier methods, AdaGrad and RMSProp, keeping per-parameter running estimates of the gradient's first and second moments to adapt each parameter's step size.
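The update rule can be sketched for a single parameter minimizing the toy loss (w - 3)^2; the hyperparameter values below are the commonly cited defaults, and the loss is an illustrative assumption:

```python
import math

w = 0.0
lr, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8
m = v = 0.0  # running estimates of the gradient's first and second moments

for t in range(1, 1001):
    grad = 2 * (w - 3)                        # gradient of the toy loss
    m = beta1 * m + (1 - beta1) * grad        # first moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w -= lr * m_hat / (math.sqrt(v_hat) + eps)

print(round(w, 2))  # settles near 3.0, the loss minimum
```

Dividing by the second-moment estimate gives each parameter its own effective learning rate, which is why Adam is robust across a wide range of problems.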
AGI: Artificial General Intelligence, a hypothetical AI system that matches or exceeds human capability across a wide range of tasks rather than excelling in a single domain.
Alignment: The research field focused on ensuring that AI systems do what humans actually intend, rather than pursuing unintended proxy goals.
AI safety: The broad field studying how to build AI systems that are safe, reliable, and beneficial.