Next-Token Prediction

The fundamental task that language models are trained on: given a sequence of tokens, predict what comes next.

Definition

The fundamental task that language models are trained on: given a sequence of tokens, predict what comes next. Despite the apparent simplicity of this objective, training on it at massive scale produces models that can reason, code, translate, and write creatively. It's the core insight behind GPT and similar models.
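
As a rough illustration, here is a minimal sketch of that objective, assuming PyTorch is available; the embedding-plus-linear model is only a stand-in for a real transformer. The target at each position is simply the token one step ahead, and the loss is cross-entropy over the vocabulary.

import torch
import torch.nn.functional as F

vocab_size = 1000
tokens = torch.randint(0, vocab_size, (1, 16))      # a stand-in token sequence

inputs = tokens[:, :-1]                             # positions 0 .. n-2
targets = tokens[:, 1:]                             # the same sequence shifted one step ahead

embed = torch.nn.Embedding(vocab_size, 64)          # toy model: embedding + linear head
head = torch.nn.Linear(64, vocab_size)

logits = head(embed(inputs))                        # (batch, seq, vocab) scores for the next token
loss = F.cross_entropy(logits.reshape(-1, vocab_size), targets.reshape(-1))
print(loss.item())                                  # minimizing this is "learning to predict the next token"

At inference time the same model is used autoregressively: each predicted token is appended to the input and fed back in (see Autoregressive Model below).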

Related Terms

Autoregressive Model

A model that generates output one piece at a time, with each new piece depending on all the previous ones.
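
As a minimal sketch of that loop, the next_token_probs function below is a hypothetical placeholder for a trained model, not a real API; the point is that each sampled token is appended to the context and conditions every later prediction.

import random

def next_token_probs(context):
    # Placeholder model: a real system would return a probability
    # for every token in its vocabulary, conditioned on the context.
    vocab = ["the", "cat", "sat", "on", "mat", "."]
    return {tok: 1.0 / len(vocab) for tok in vocab}

context = ["the"]
for _ in range(5):
    probs = next_token_probs(context)                             # depends on the whole prefix so far
    tokens, weights = zip(*probs.items())
    context.append(random.choices(tokens, weights=weights)[0])    # feed the choice back in
print(" ".join(context))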

Language Model

An AI model that understands and generates human language.

Pre-Training

The initial, expensive phase of training where a model learns general patterns from a massive dataset.

Activation Function

A mathematical function applied to a neuron's output that introduces non-linearity into the network.
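
For a concrete picture, here is a minimal sketch of two common activation functions; without a non-linearity like these, stacked linear layers would collapse into a single linear transformation.

import math

def relu(x):
    return max(0.0, x)                      # passes positives through, zeroes out negatives

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))       # squashes any input into the range (0, 1)

print(relu(-2.0), relu(3.0))                # 0.0 3.0
print(round(sigmoid(0.0), 2))               # 0.5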

Adam Optimizer

An optimization algorithm that combines the strengths of two earlier methods, AdaGrad and RMSProp, by keeping running averages of both the gradients and their squares.
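
A minimal sketch of a single Adam update for one scalar parameter; the hyperparameter values are the commonly used defaults and appear here only for illustration.

import math

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad               # running mean of gradients (momentum-like)
    v = beta2 * v + (1 - beta2) * grad * grad        # running mean of squared gradients (RMSProp-like)
    m_hat = m / (1 - beta1 ** t)                     # bias correction for the early steps
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
for t in range(1, 4):                                # three toy steps on the gradient of p**2
    p, m, v = adam_step(p, 2 * p, m, v, t)
print(p)

The first running average (m) plays the role of momentum, while the second (v) scales the step size per parameter, which is the part inherited from AdaGrad and RMSProp.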

AGI

Artificial General Intelligence: a hypothetical AI system able to match or exceed human performance across a wide range of cognitive tasks.
