ChatGPT making us dumber: Study finds writers unable to quote their own work after using AI tools

Researchers discovered that participants who used AI tools showed dramatically different neural patterns compared to those who wrote essays using only their minds.

Published Jun 19, 2025 | 11:21 AM | Updated Jun 19, 2025 | 11:21 AM


Synopsis: An MIT study found that AI writing assistants may be fundamentally changing how our brains work, and not necessarily for the better. The research also revealed a troubling trend toward intellectual homogenisation.

In an era where artificial intelligence (AI) tools like ChatGPT have become as commonplace as smartphones, a study by researchers at the Massachusetts Institute of Technology (MIT) in the US has uncovered a troubling reality: These powerful writing assistants may be fundamentally changing how our brains work, and not necessarily for the better.

The four-month study, which tracked 54 participants through multiple essay-writing sessions while monitoring their brain activity, found that heavy reliance on AI tools creates what researchers call “cognitive debt”: a condition where our mental muscles atrophy from lack of use, much like how a broken arm comes out weaker after months in a cast.


The brain under the microscope

Using electroencephalography (EEG) technology — essentially a sophisticated cap that reads brain waves — researchers discovered that participants who used AI tools showed dramatically different neural patterns compared to those who wrote essays using only their minds.

Think of neural connectivity as the brain’s highway system: The more traffic between different regions, the more engaged and active the brain becomes.

The results were striking. Participants who wrote without any tools showed the strongest, most widespread neural networks, like a national highway with traffic flowing freely. Those who used traditional search engines showed moderate connectivity, similar to city traffic in Hyderabad.

However, participants who relied on AI tools like ChatGPT? Their brain activity resembled congested traffic in Bengaluru.

“Brain connectivity systematically scaled down with the amount of external support,” the researchers noted. The AI group showed up to 55% less neural connectivity compared to the brain-only group, particularly in areas responsible for deep thinking and memory formation.

The memory problem: When words don’t stick

Perhaps most concerning was what happened when participants were asked to quote from essays they had just written minutes earlier. It’s like asking someone to recall a conversation they just had — normally, this should be easy. However, for AI users, it wasn’t.

In the first session, a staggering 83 percent of AI users couldn’t quote from their own work, and none provided accurate quotes. Even after multiple sessions, many still struggled with this basic task.

Meanwhile, participants who wrote without AI assistance quickly mastered the ability to recall their own words with near-perfect accuracy.

This isn’t just about memory – it’s about ownership. When you write something yourself, wrestling with ideas and crafting sentences, your brain creates strong neural pathways that encode both the content and the experience. Using AI is like having someone else do pushups for you; you get the result, but miss the mental workout that builds cognitive strength.


The homogenisation effect: When everyone sounds the same

The study also revealed a troubling trend toward intellectual homogenisation. Essays written with AI assistance were remarkably similar to each other, showing what researchers called “statistically homogeneous” patterns. It’s as if AI tools were creating a kind of intellectual monoculture, where diverse human perspectives get filtered through the same algorithmic lens.

Human teachers, who weren’t told which essays were AI-assisted, could nonetheless identify them based on their conventional structure and similar content approaches.

This suggests that AI tools may be nudging users toward predictable patterns of thinking and expression, potentially stifling the kind of creative diversity that drives innovation and progress.

The search engine middle ground

Interestingly, participants who used traditional search engines occupied a middle ground between AI users and those who relied solely on their brains. Their neural activity showed that they were actively engaging with information — scanning, evaluating, and integrating what they found. The visual cortex lit up as they processed search results, indicating active cognitive work.

This finding suggests that not all digital tools are created equal. Search engines require users to evaluate sources, synthesise information, and make judgements about relevance and credibility. AI tools, by contrast, present pre-processed answers that require less critical thinking — like the difference between shopping for ingredients and cooking versus ordering takeout.

Can the damage be reversed?

In a clever twist, researchers conducted a fourth session where they swapped the tools. Participants who had been using their brains alone were given AI assistance, while those who had been using AI were asked to write without it. The results were illuminating and somewhat hopeful.

When brain-only participants were introduced to AI tools, their neural activity actually increased rather than decreased. Having developed strong cognitive foundations, they seemed to use AI as a genuine assistant rather than a crutch — like a skilled chef using a food processor to speed up prep work rather than replace cooking knowledge.

However, participants who had grown accustomed to AI assistance struggled significantly when the tools were removed. Their neural connectivity didn’t bounce back to normal levels, remaining in what researchers described as an “intermediate state” — neither fully dependent nor fully independent.

The cognitive debt crisis

The researchers introduced a crucial concept: Cognitive debt. Just as financial debt allows you to enjoy benefits now while paying costs later, cognitive debt occurs when we offload mental effort to external systems, creating immediate convenience but long-term intellectual costs.

The study found evidence that participants who relied heavily on AI were accumulating this debt. They showed decreased creativity, increased vulnerability to bias, and diminished critical thinking skills. When they stopped using AI, they didn’t simply return to baseline – they remained cognitively diminished.

“This pattern reflects the accumulation of cognitive debt,” the researchers explained, “a condition in which repeated reliance on external systems like LLMs [Large Language Models] replaces the effortful cognitive processes required for independent thinking.”


What this means for students and professionals

The implications extend far beyond academic essay writing. In classrooms, offices, and creative industries where AI tools are becoming standard, this research raises urgent questions about the long-term cognitive consequences of our growing AI dependence.

For students, the findings suggest that using AI for homework and assignments may undermine the very learning these tasks are designed to promote. It’s like using a calculator for simple arithmetic — convenient in the moment but potentially harmful to developing mathematical intuition.

For professionals, the research indicates that while AI tools can boost immediate productivity, they may be eroding the deep thinking skills that drive innovation and strategic decision-making. The brain’s “use it or lose it” principle applies to cognitive abilities just as much as physical fitness.

The study doesn’t advocate for abandoning AI tools entirely. Instead, it suggests a more strategic approach. The most promising finding was that participants who developed strong cognitive foundations before using AI tools could leverage them more effectively and safely.

(Edited by Muhammed Fazil.)
