Data Augmentation: Techniques for artificially expanding a training dataset by creating modified versions of existing data. For images, this might mean flipping, rotating, or shifting colors; for text, it could mean paraphrasing or back-translation. Augmentation helps prevent overfitting and improves a model's generalization to unseen data.
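For a concrete picture, here is a minimal NumPy sketch of image augmentation; the `augment` helper and its flip, rotation, and brightness choices are illustrative, not a standard recipe:

```python
import numpy as np

def augment(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Return a randomly modified copy of an H x W x C image array (illustrative)."""
    out = image.copy()
    if rng.random() < 0.5:                     # random horizontal flip
        out = out[:, ::-1, :]
    out = np.rot90(out, k=rng.integers(0, 4), axes=(0, 1))  # rotate 0/90/180/270 degrees
    out = np.clip(out * rng.uniform(0.8, 1.2), 0.0, 255.0)  # brightness jitter
    return out

rng = np.random.default_rng(0)
fake_image = rng.integers(0, 256, size=(32, 32, 3)).astype(np.float32)  # stand-in image
augmented = augment(fake_image, rng)
print(augmented.shape)  # (32, 32, 3)
```

In practice each training example is re-augmented on the fly every epoch, so the model rarely sees the exact same input twice.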
Overfitting: When a model memorizes the training data so closely that it performs poorly on new, unseen data.
Regularization: Techniques that prevent a model from overfitting by adding constraints or penalties during training.
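A minimal sketch of one common form, L2 regularization, which adds a penalty on large weights to the training loss (the `ridge_loss` helper and the `lam` strength are illustrative):

```python
import numpy as np

def ridge_loss(w: np.ndarray, X: np.ndarray, y: np.ndarray, lam: float = 0.1) -> float:
    """Mean squared error plus an L2 penalty that discourages large weights."""
    data_loss = np.mean((X @ w - y) ** 2)   # how well the model fits the training data
    penalty = lam * np.sum(w ** 2)          # the added constraint: keep weights small
    return float(data_loss + penalty)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.normal(size=100)
w = rng.normal(size=5)
print(ridge_loss(w, X, y))           # penalized loss
print(ridge_loss(w, X, y, lam=0.0))  # plain, unregularized loss for comparison
```

Other common regularizers include L1 penalties, dropout, and early stopping; all trade a little training accuracy for better behavior on unseen data.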
Activation Function: A mathematical function applied to a neuron's output that introduces non-linearity into the network; without it, stacked layers would collapse into a single linear transformation.
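Two widely used examples, ReLU and the sigmoid, sketched in NumPy (the sample inputs are arbitrary):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """ReLU: passes positive values through unchanged and zeroes out negatives."""
    return np.maximum(x, 0.0)

def sigmoid(x: np.ndarray) -> np.ndarray:
    """Sigmoid: squashes any real value into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

z = np.array([-2.0, -0.5, 0.0, 1.5])  # example pre-activation values
print(relu(z))     # [0.  0.  0.  1.5]
print(sigmoid(z))  # approximately [0.12 0.38 0.5  0.82]
```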
Adam (Adaptive Moment Estimation): An optimization algorithm that combines ideas from AdaGrad and RMSProp, keeping running averages of both the gradients and their squares so that each parameter gets its own adaptive step size.
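The update rule itself is short; below is a sketch of a single Adam step following the published algorithm, using the paper's default hyperparameters (this standalone helper is for illustration, not a replacement for a framework's built-in optimizer):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum on the gradients (m) plus a per-parameter
    adaptive step size from a running average of squared gradients (v)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (like momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (adaptive scaling)
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5.
theta = np.array([5.0])
m = v = np.zeros_like(theta)
for t in range(1, 1001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.1)
print(theta)  # close to [0.]
```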
AGI (Artificial General Intelligence): A hypothetical AI system able to learn and perform any intellectual task a human can, rather than being specialized for a narrow set of tasks.
AI Alignment: The research field focused on making sure AI systems pursue the goals their human designers and users actually intend.