COGNITIVE DEBT
The Cost of Short-Term Thinking in the Age of AI
Happy Wednesday!
I’ve been working on this a while and the full Cognitive Debt white paper is below:
Download a PDF copy you can drop into your LLM of choice to ask questions or get a TL;DR
Get Cognitive Debt swag. Shout out to my friend and reader Katelyn, who inspired me to put Nerd Out on a shirt so I can wear it as the “Guy From Nerd Out” :-D. I made myself a couple of Nerd Out inside-joke products, and you’re welcome to get one too.
The $20 Million Hallucination
In March 2020, our world shrank to the size of a 13-inch MacBook. I was leading the design team for the School for Googlers, and we had a $20 million problem. “Noogler” orientation, the high-touch, in-person program that defined Google’s culture, was suddenly illegal to run.
AI-edited but real. Yes, we had cake pops shaped like hats and a donut wall.
We presented leadership with a data-driven choice: ship a fast e-learning experience, try a virtual hybrid, or attempt to replicate the entire classroom experience on a screen. We chose speed. We raced toward what looked like a flawless deployment. On paper, we saved $20 million in travel and logistics. Learner satisfaction scores, already good, hit an unbelievable 97%. Attendance improved dramatically, and people were thrilled that they could complete onboarding without travel. Everyone wins.
We felt like heroes. We all got swag; I got promoted! The high-touch magic of Mountain View and Google’s culture was replaced with 4K video. But we were taking out a high-interest loan against our employees’ ability to navigate the company. We were taking out cognitive debt.
By late 2020, we could smell the smoke even if the data didn’t show the fire. One Director told me they’d been a ‘Googler’ for months from their kitchen table, but they didn’t feel like one. The invisible connecting experiences of live orientation vanished, replaced by a view of the salt and pepper shakers on your own kitchen table. Leaders were operating in a vacuum. A few years later, when the office doors reopened, the Cognitive Debt bomb finally detonated. Senior leaders found themselves unable to lead; they were experiencing ‘organ rejection’ from their own teams. The pandemic-era goodwill had evaporated, leaving them to learn the texture of Google’s culture while their organizations were already in revolt. Subtle cultural norms were missing. Google’s culture has been described in a lot of ways, but in practice Googlers were kind and helpful because they had a thirst for problem solving. It’s one thing to hear that; it’s another to have volunteers from across the company show you how high the bar is.
What is Cognitive Debt?
Cognitive debt is more insidious than simple forgetting. It’s the deficit that builds when we outsource our brains to a vendor, a teammate, or a prompt. It’s the point where your people can no longer explain the logic of the systems they manage, and relearning how your own company works costs more than burning it down and starting over. AI makes the debt invisible. It’s so smooth that we mistake “AI aid” for “knowing how.” It’s a trade of fluency for speed, and you don’t know you’re bankrupt until you have to make a high-stakes judgment call in a crisis. This debt gets you in two primary ways:
Operational Weight: This is the invisible friction that makes an organization feel heavy, slow to pivot, and allergic to change. It causes burnout when you force employees to drag along outdated mental models. It’s a reason your best people leave: they’re tired of a system that fights against them.
The Timed Detonation: There are plenty of examples, but an easy one is the compliance bomb. It’s lit when a sales rep hits their quota by ignoring the ‘boring’ training. Often this is an open secret! Then the whole thing blows up in a flurry of litigation. Sometimes it’s a second- or third-order consequence of a series of shortcuts, but it can knock a business off its axis.
In an AI-enabled world, content has a marginal cost of zero, but human attention remains a finite, biological resource. If we ignore the hard limits of the human brain, we are building debt.
The danger isn’t the offloading itself. You don’t need a retail associate to memorize a thousand fluctuating SKUs. That’s a debt we can afford. The danger is that AI makes offloading so smooth we mistake output for expertise.
As the 2025 MIT research paper, ‘Your Brain on ChatGPT’ warned, we are entering an era of ‘Metacognitive Laziness.’ When we use AI as a cognitive crutch, we get a short-term productivity spike at the cost of long-term atrophy.
When you outsource the mental work that leads to judgment, fluency, and recall, you create a crisis. As I shared recently, “You could be in a board meeting, landing a huge deal with a team of lawyers. If you don’t know your stuff, if you’ve been using AI to outsource your thinking, you aren’t going to be able to call on that knowledge in your moment of need.” If something truly matters, be intentional and design to overcome laziness by making learners demonstrate what they learn, rather than just absorbing content. Good training requires friction.
Cognitive debt is manufactured at scale. Depending on which shortcuts are taken, you might see it in a few forms:
Slop content: This happens when we outsource creativity to the machine. We use AI to build low-quality, generic training faster than ever, leading to a flood of “AI slop” that fills inboxes but never reaches a brain. If an LLM can generate a passable version of your course in ten seconds, your training is a liability, not an asset. This content fails a brutally useful test for L&D: Would my learners pay money to take this training?
Ego content: Ego content happens when we do things because we can, not because they’re needed. These are the high-gloss “awareness” campaigns. They look stunning in a board deck but they solve exactly zero performance gaps. The cost and time suck of the training itself becomes a business problem. It may feel like building political capital but the opportunity cost is too high. We could be fixing real problems.
Heroism: This is the debt of inhuman expectations. We reward the ‘hero’ who stays up all night manually fixing a broken report instead of building the AI-integrated system that prevents the break. Excessive task switching adds to this debt: workers who are asked to multitask or take on too much work, even when it is AI-aided, perform worse. While they and their managers feel like martyrs for shouldering the burden, they lose the ability to see the bigger picture.
Missing Rung: We’re sawing off the bottom rung on the career ladder when we outsource experience to the machine. Junior roles that once provided the “tacit knowledge acquisition” needed for senior-level expertise are being automated. Entry level workers may never develop the “professional intuition” required to judge the quality of AI-generated work, creating a structural debt. By the time your senior leaders retire, you’ll realize the new guard has ‘institutional amnesia.’ They know how to hit ‘generate,’ but they don’t know why the company does what it does. That’s a detonation waiting to happen.
The Cognitive Debt Audit
Don’t wonder if your organization has this problem. Prove it. Answer these questions honestly to determine your learners’ cognitive debt. If the answer is “I don’t know,” assume the worst, and follow up in your future analysis to answer the question for your organization.
See how you stack up with the research
Category 1: The Productivity Hallucination (average score: 10-13)
We are living in a productivity hallucination. Organizations are confusing volume with value. While experts see a technical speedup, the 2026 Stanford Forecast shows a 12% decline in real outcomes for high-context roles. We produce more, but achieve less.
Even when AI generates work in seconds, verifying that work consumes up to 65% of a project’s lifecycle. If your team can’t explain the logic behind the output, they haven’t saved time; the rework is simply deferred to the fix-it phase.
(Sources: Stanford AI 2026 Forecast; METR 2026)
Category 2: Skill Loss and the Future Workforce (average score: 9-12)
There is a direct negative correlation between constant AI reliance and critical thinking. We are sawing off the bottom rungs of the career ladder, allowing younger workers to bypass the “desirable difficulties” required to build professional intuition. By the time they reach leadership, the know-how will have atrophied.
This isn’t just a “junior” problem; it’s total dependency. 2026 data shows that even experienced engineers are now so reliant on these tools that they often refuse to work without them. If your team cannot explain the “why” behind the code, you’ve lost your autonomy. You are managing a black box of interdependencies you cannot diagnose or plan against. (Sources: Gerlich, 2025; METR, 2026)
Category 3: “Fading” Company Knowledge (average score: 9-12)
When a veteran leaves, 42% of the context required to do their job disappears with them. Without a strategy for “interaction logic,” it takes 200 hours to get a new hire back to baseline. Most organizations are currently flying blind, managing a “black box” of broken context. (Source: Panopto/YouGov via Stravito)
Category 4: Focus, Energy, and Overwhelm (average score: 9-12)
Modern knowledge workers were stretched thin even before AI, and AI adoption is making it worse. AI tools are marketed as time-savers, but they’ve created a ‘Digital Debt’ in which the volume of data has outpaced human processing. We are drowning in professional-grade text that nobody has the ‘bandwidth’ to actually read. The noise has been automated.
68% of employees do not have uninterrupted focus time during the workday. 64% of employees struggle with having the daily time and energy to do their jobs at all. Employees who are overwhelmed are 3.5 times more likely to struggle with innovation and strategic thinking. (Source: Microsoft Global Study on Work Trend Index)
Category 5: Building Lasting Mental Habits (average score: 9-12)
The human brain is wired to conserve energy, so individuals naturally opt for the external answer rather than the route of deep thinking or manual problem-solving.
Because most mainstream digital interfaces are intentionally designed for seamlessness rather than sustained human engagement (a lack of constructive friction), organizations tend to score high here simply because they lack active guardrails and training to protect independent reasoning. (Source: Research on Cognitive Conservation & Interface Design)
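The audit’s tallying rule can be sketched in a few lines. This is purely illustrative: the five category names mirror the sections above, but the 0-15 scale, the `audit_total` helper, and the example scores are my assumptions, not part of the audit itself.

```python
# Hypothetical tally for the Cognitive Debt Audit described above.
# The "assume the worst when you don't know" rule comes from the paper;
# the per-category scale and scoring mechanics are illustrative only.

CATEGORY_MAX = 15  # assumed worst-case score per category

def audit_total(scores: dict) -> int:
    """Sum category scores, counting any unknown (None) at the maximum."""
    return sum(CATEGORY_MAX if s is None else s for s in scores.values())

example = {
    "Productivity hallucination": 11,
    "Skill loss and the future workforce": 10,
    "Fading company knowledge": None,  # "I don't know" -> assume the worst
    "Focus, energy, and overwhelm": 12,
    "Building lasting mental habits": 9,
}

print(audit_total(example))  # unknowns are scored at CATEGORY_MAX
```

The point of counting unknowns at the maximum is the same as in the prose: a gap in your analysis is itself a liability until you close it.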
Performance Economist vs. Order Taker
The L&D profession is currently bifurcating.
The Order Taker is the primary agent of debt. They are the human vending machine who says “Yes” to every request for a 15-minute module or a quick video. They focus on delivery rather than discovery. In the age of agentic AI, the machine will eventually replace the Order Taker because the machine is faster and cheaper at taking orders.
The Performance Economist balances human capability against business demand. (I’m not trying to coin yet another term for someone who works in corporate education; this is a metaphor.) They have the courage to say, “A video won’t solve this; we need to fix the cognitive load.” They recognize that L&D must stop behaving like a content factory and start acting as a problem-solving discipline. They act like product managers, discovering problems worth solving rather than just fulfilling content requests. They understand that while we can automate the production of a course, we cannot automate the connection a learner makes with the material.
They understand that their job is to show the value created by solving business problems, remembering, as one CLO put it, that “as soon as you get into an ROI conversation, you’ve already lost.” This isn’t about finding just the right way to report on your value. It’s about spending time with learners, stakeholders, and data to find the right problems and deliver undeniable solutions.
The Human Dividend
This is a manifesto for depth in a world that values speed over it.
AI’s best use case isn’t generating more content; it’s handling the invisible labor of research and planning so humans can return to high-fidelity learning. This frees L&D to be strategic and do the deep work: high-stakes coaching, in-person immersion, and the ruthless auditing of the data feeding our AI systems.
When you design ‘productive struggle’ back into your organization, you unlock the Human Dividend. This is a workforce that is agile, resilient, and capable of doing the things the machine cannot.
We need strategies that encourage independent thinking and innovation. We must design environments that prioritize independent reasoning over automated defaults. Use AI as a ‘sparring partner’ rather than a ghostwriter, and ensure that learners demonstrate mastery before we check the box. It’s time to stop building more content and start building more capability.
I’ve outlined these liabilities because we cannot solve a problem we are unwilling to audit. But this is just the beginning. We’re only just starting to imagine a better AI enabled future. Subsequent papers will focus on the approaches we can take to flourish with AI, not by outpacing the machine but by being excellent humans.
Citations
Becker, J., Rush, N., & Rein, D. (2025). Measuring the impact of early-2025 AI on experienced open-source developer productivity (arXiv:2507.09089). arXiv. https://doi.org/10.48550/arXiv.2507.09089
Gerlich, M. (2025). AI tools in society: Impacts on cognitive offloading and the future of critical thinking. Societies, 15(1), 6. https://doi.org/10.3390/soc15010006
Kabashkin, I. (2025). Cognitive atrophy paradox of AI–human interaction: From cognitive growth and atrophy to balance. Information, 16(11), 1009. https://doi.org/10.3390/info16111009
Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X. H., Beresnitzky, A. V., Braunstein, I., & Maes, P. (2025). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task. arXiv. https://arxiv.org/abs/2506.08872
METR. (2026, February 24). We are changing our developer productivity experiment design. METR Blog. https://metr.org/blog/2026-02-24-uplift-update/
Microsoft. (2023). Will AI fix work? Annual Work Trend Index report. https://www.microsoft.com/en-us/worklab/work-trend-index/will-ai-fix-work
Stack Overflow. (2025). Stack Overflow 2025 Developer Survey on AI. https://survey.stackoverflow.co/2025/
Stanford HAI. (2026). Stanford AI experts predict what will happen in 2026: The era of evaluation. Stanford University. https://hai.stanford.edu/news/stanford-ai-experts-predict-what-will-happen-in-2026
Stravito. (2025). Organizational memory loss: Why it matters and how to prevent it. https://www.stravito.com/resources/organizational-memory-loss-why-it-matters-and-how-to-prevent-it



