System Drift
Degradation of knowledge systems through self-contamination or neglect
System drift is the category of drift phenomena in which the knowledge system itself degrades over time, independent of semantic or environmental change.
Forms
- Model Collapse — degradation from training on AI-generated data
- Correction vs Drift — when fixing errors costs more than letting them propagate
Why It Matters
System drift is the failure mode where a knowledge system settles into an "approximately maintained" false steady state. Errors persist because correction is expensive, AI-generated content contaminates the training data, and the system gradually loses contact with ground truth.
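The contamination loop above can be made concrete with a toy simulation (my own sketch, not from this note): a "model" that is repeatedly refit on its own generated samples, with no fresh ground-truth data, loses variance and drifts away from the true distribution. This is the standard minimal illustration of model collapse.

```python
import random
import statistics

# Toy model collapse: fit a Gaussian, then retrain each generation
# only on samples drawn from the previous generation's model.
# With no fresh ground-truth data, the estimated spread decays
# toward zero and the mean random-walks away from the truth.
random.seed(0)

mu, sigma = 0.0, 1.0  # ground truth at generation 0
for generation in range(500):
    # "AI-generated data": samples from the current model, not the world
    samples = [random.gauss(mu, sigma) for _ in range(10)]
    # "retraining": refit the model on its own output
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)

# sigma has collapsed far below the true value of 1.0
print(f"after 500 self-trained generations: sigma = {sigma:.6f}")
```

The fix implied by the note is visible in the code: the loop never touches ground truth again after generation 0, so every sampling error compounds instead of being corrected.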
Encyclopedia Meltdown is the extreme form: when AI takes over the writing, the line of responsibility disappears, and the system drifts toward self-referential collapse. The defense is the epistemic protocol layer: keeping correction cheaper than drift through traceability, zero-trust ingestion, and rebuttal-first search.
Related
- Drift Phenomena — the parent category
- Semantic Drift — when meanings change
- Environmental Drift — when the world changes
- Institutional Brain Rot — organizational analog