Deskilling Through AI Delegation
Risk of losing learning capacity when AI does the cognitive work
Deskilling through AI delegation is the risk that offloading cognitive work to AI erodes the capacity to do that work unaided. If verification is always automated, the Operator never practices verification. If synthesis is always generated, the Operator never learns to synthesize.
This is analogous to model collapse, but the degradation happens in the human rather than the model: skills atrophy when they are not exercised. Meta-learning requires doing the learning yourself, not just receiving the output.
Coscientist addresses this by treating AI as a partner for contemplation labor rather than a replacement for judgment. The Operator still performs active recall, traces evidence spans, and engages with contention. The system handles search and structuring; the human handles verification and decision.
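The division of labor above can be sketched as a review gate: the system may propose claims with evidence spans, but only the Operator can mark a claim verified, and only after tracing every span. This is a minimal illustrative sketch, not Coscientist's actual API; all names (`Claim`, `ReviewQueue`, `operator_verify`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    text: str
    evidence_spans: list   # source excerpts the system retrieved
    verified: bool = False # only the Operator may flip this

class ReviewQueue:
    """Hypothetical gate: AI proposes, human verifies."""

    def __init__(self):
        self._pending = []
        self._accepted = []

    def propose(self, claim: Claim) -> None:
        # System side: search and structuring produce candidate claims.
        self._pending.append(claim)

    def operator_verify(self, claim: Claim, traced_spans: list) -> bool:
        # Human side: the Operator must trace each evidence span unaided.
        # Verification cannot be delegated back to the system.
        if set(traced_spans) != set(claim.evidence_spans):
            return False  # incomplete tracing: claim stays pending
        claim.verified = True
        self._pending.remove(claim)
        self._accepted.append(claim)
        return True

queue = ReviewQueue()
c = Claim("X inhibits Y", evidence_spans=["doc1:12-18", "doc3:4-9"])
queue.propose(c)
queue.operator_verify(c, ["doc1:12-18"])               # partial trace: rejected
queue.operator_verify(c, ["doc1:12-18", "doc3:4-9"])   # full trace: accepted
```

The point of the design is that the verification step has no automated fallback; the contemplation labor (search, structuring) is delegable, the judgment is not.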