Model Collapse
Degradation of models trained on their own generated data
Model collapse is the progressive degradation that can occur when models are trained on growing amounts of AI-generated data. Over successive generations, the output distribution narrows, rare modes disappear, and small errors compound because the training signal is increasingly contaminated by the model's own artifacts.
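The narrowing dynamic can be illustrated with a toy simulation (a minimal sketch, not a claim about any particular model): repeatedly fit a Gaussian to a small sample drawn from the previous generation's fit. Finite-sample estimation error compounds, so the fitted spread tends to shrink toward zero and tail values vanish from the synthetic data first.

```python
import numpy as np

rng = np.random.default_rng(0)

def collapse_demo(n_samples=20, generations=300):
    """Repeatedly refit a Gaussian to samples drawn from the previous fit.

    Each generation "trains" on data generated by its predecessor.
    Finite-sample error accumulates, so the estimated standard
    deviation drifts toward zero: rare (tail) values disappear from
    the synthetic data first, narrowing what later models ever see.
    """
    mu, sigma = 0.0, 1.0              # generation 0: the "real data" distribution
    stds = [sigma]
    for _ in range(generations):
        synthetic = rng.normal(mu, sigma, size=n_samples)  # model output
        mu, sigma = synthetic.mean(), synthetic.std()      # refit on synthetic data
        stds.append(sigma)
    return stds

stds = collapse_demo()
print(f"std at generation   0: {stds[0]:.4f}")
print(f"std at generation 300: {stds[-1]:.4g}")
```

The shrinkage is not an accident of this toy: with `n` samples, the maximum-likelihood variance estimate is biased low by a factor of `(n-1)/n`, and that bias (plus sampling noise) multiplies across generations. The same qualitative narrowing is what "rare modes disappear" refers to in large generative models.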
It is related to the broader epistemic pressure of AI slop and to system-level failure modes like Encyclopedia Meltdown, where self-referential text is treated as ground truth.