New Technology and the Impact of Using AI Tools on Cognitive Offloading and Critical Thinking

Everyone reading this is either familiar with the popularity and increasing omnipresence of AI tools such as ChatGPT, or has been living under a very cozy rock for the last two-plus years. These tools are excellent sidekicks, providing assistance with some of the more routine and mundane tasks in nearly every field. Proofreading emails, writing and debugging code, and generating content are just a few of the common applications where AI tools assist users. But what happens when we start relying on AI to help us with more complex tasks, or even to complete entire tasks for us?

New technology enables us to identify and solve more complex problems by reducing the need for us to do the mundane, an idea that goes back at least to Gottfried Leibniz, but this does not come without drawbacks. The phenomenon is a tale as old as time: a new technology does something for us, which begets concerns that it makes that something “too easy” and erodes our individual capacity to do it ourselves. People said (and in some cases, continue to say) this about calculators and computers. These concerns do not generally outweigh the benefits of new technology, but that is not to say they are negligible.

In the past, new technology has been shown to reduce our individual aptitude for the tasks that we cognitively outsource to it, that is, the tasks we let the technology do the thinking for. Previous studies have shown that having information readily available via a simple search negatively impacts both our retention of that information and our inclination to process it deeply. Using GPS has been shown to reduce the capacity of our spatial memory. These are contemporary examples, but you can probably find something that each technological advancement caused us to “forget how to do.” Just think about doing long division without a calculator (if you’re brave enough)!

[Image: a satirical headline from The Register]

Now that AI is so heavily integrated into everyday life, researchers are evaluating its use cases to see how using it interacts with the ways we learn and think. A recent study focused on the relationship between using AI tools in educational settings and critical thinking via cognitive offloading, the process by which we use external resources to reduce the mental effort required to complete a task (think of the difficulty of a long division problem with vs. without a calculator). Current thinking posits that cognitive offloading through over-reliance on external tools “may reduce the need for deep cognitive involvement, potentially affecting critical thinking.” The authors of this study argue that this principle is the mechanism by which higher AI tool usage reduces one’s critical thinking capacity.

The authors of this study hypothesized that using AI tools more frequently would correlate with reduced critical thinking skills. They studied a demographically diverse sample of 666 participants from the UK, collecting survey data about AI usage and critical thinking skills, and conducting interviews with 50 of the participants for a deeper dive. Their study produced many interesting results, but the principal quantitative results aligned with their hypotheses – there was a strong positive correlation between AI tool use and cognitive offloading, and a strong negative correlation between AI tool use and critical thinking.
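For a concrete sense of what those correlation figures mean, here is a minimal sketch of the kind of Pearson correlation check the study describes, run on made-up survey scores. The variable names and data below are hypothetical illustrations, not the authors’ actual dataset or analysis code.

```python
# Minimal sketch only: fabricated scores that mimic the study's reported pattern,
# not the real survey data or the authors' analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 666  # sample size reported in the study

# Hypothetical 1-5 Likert-style averages per participant
ai_tool_use = rng.uniform(1, 5, size=n)
cognitive_offloading = np.clip(ai_tool_use + rng.normal(0, 0.5, size=n), 1, 5)
critical_thinking = np.clip(6 - ai_tool_use + rng.normal(0, 0.5, size=n), 1, 5)

# Pearson r quantifies the strength and direction of each linear relationship
r_off, p_off = stats.pearsonr(ai_tool_use, cognitive_offloading)
r_crit, p_crit = stats.pearsonr(ai_tool_use, critical_thinking)

print(f"AI use vs. cognitive offloading: r = {r_off:+.2f} (p = {p_off:.2g})")
print(f"AI use vs. critical thinking:    r = {r_crit:+.2f} (p = {p_crit:.2g})")
```

With the fabricated data above, the first coefficient comes out strongly positive and the second strongly negative, mirroring the direction (though not the exact values) of the relationships the authors report.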

The participant interviews surfaced both benefits and drawbacks of using AI tools, though on balance they agreed with the quantitative results. One quote from the interviews that stuck out to the authors, and to me, was “It’s great to have all this information at my fingertips, but I sometimes worry that I’m not really learning or retaining anything. I rely so much on AI that I don’t think I’d know how to solve certain problems without it.” Of course this is anecdotal, but it makes you think about long-term use of AI tools. What is the “responsible” amount of AI tool usage? Where is the cutoff? How do we track the impacts of using AI tools? And who is to decide all of this, if anyone?

Clearly, AI holds vast potential as a tool to improve nearly any facet of society. It already has so many tangible uses, and the full scope of its benefits may very well be limitless. However, this research shows that it’s probably not something we can wholly give ourselves up to without consequence.

I leave you with a recently viral anecdote that hit home for me, titled “my little sister’s use of chatgpt for homework is heartbreaking” (it’s very informal, and contains mild profanity). It is a thread describing how someone’s eleven-year-old sister asks ChatGPT to answer every single question she is assigned for homework. It truly is a heartbreaking story, and I hope it remains an anecdote rather than becoming the rule for future generations.

Cooper Molloy

Lead Quality/Regulatory Engineer
