Core concept · AI Search Infrastructure

Definition

A training cutoff is the date beyond which a language model's training data does not extend. Events, brand repositionings, product launches, or other changes that occurred after the cutoff are not represented in the model's parametric memory and must be supplied through real-time retrieval or subsequent fine-tuning.

Training cutoffs create predictable gaps and errors in AI brand representations. A brand that repositioned after the cutoff may be described using pre-repositioning language, and a brand that launched after the cutoff may have no parametric representation at all. Understanding per-platform cutoffs explains why a brand appears differently across ChatGPT, Perplexity, and Google: each draws on a different training-data vintage and a different retrieval architecture.
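The retrieval-versus-parametric-memory decision above can be sketched in code. This is a minimal illustration, not any platform's actual logic: the platform names and cutoff dates below are hypothetical placeholders, and real systems use more nuanced freshness signals than a single date comparison.

```python
from datetime import date

# Hypothetical per-platform training cutoffs (illustrative values only).
TRAINING_CUTOFFS = {
    "model_a": date(2023, 4, 30),
    "model_b": date(2024, 6, 30),
}

def needs_retrieval(platform: str, event_date: date) -> bool:
    """Return True if an event postdates the platform's training cutoff,
    meaning its facts must come from real-time retrieval rather than
    the model's parametric memory."""
    cutoff = TRAINING_CUTOFFS.get(platform)
    if cutoff is None:
        # Unknown cutoff: be conservative and retrieve.
        return True
    return event_date > cutoff

# A brand repositioning dated after model_a's cutoff but before model_b's
# is invisible to one model's parametric memory and present in the other's.
reposition = date(2023, 9, 1)
print(needs_retrieval("model_a", reposition))  # True: retrieval required
print(needs_retrieval("model_b", reposition))  # False: within training data
```

The same event can thus require retrieval on one platform while sitting comfortably inside another's training data, which is one mechanical reason brand descriptions diverge across AI search surfaces.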

Related concepts

Parametric knowledge
Parametric inertia
Knowledge conflict
Training corpus
Knowledge cutoff
