Research Update: Students’ Reliance on AI May Lead to Mental Laziness

Students’ overdependence on AI chatbots is already a problem. Experts call the phenomenon metacognitive laziness: handing assignments off to chatbots instead of tackling the hard work of learning and thinking. This shortcut-taking undermines students’ later ability to synthesize, analyze, and explain information, that is, to think critically, one of the prized attributes employers seek in human applicants.

Research discussed in the Hechinger Report suggests that students are offloading the painstaking work of thinking to AI, and analysts predict that this behavior signals trouble for young workers down the line.

In one laboratory study, Chinese students were tasked with reading several texts, writing an essay, and then revising it, all in English, which was not their native language. Participants were divided into four groups: one was allowed to use ChatGPT; the second had access to a human writing tutor; the third received a checklist of criteria to guide revision; and the fourth completed the assignment without any help at all. All groups were then tested to determine how much they had learned and how they felt about the exercise.

The results were telling. The English learners in the ChatGPT group improved their essays the most, even more than the group with access to human tutors. Yet although their essays were the most improved, these students did not actually learn about the topic or feel motivated to learn. Nor did they refer to the readings or grasp the purpose of the assignment. The researchers concluded that the ChatGPT group took shortcuts, such as cutting and pasting bot-generated text, instead of engaging the cognitive processes that writing requires. In other words, mental laziness won and critical thinking lost.

The second study, by Anthropic’s own researchers, analyzed more than 500,000 conversations between college students and Anthropic’s Claude to learn how students interacted with the tool. The results showed that students primarily delegated higher-level thinking—creating, evaluating, and analyzing—to the AI. The researchers wrote, “[t]here are legitimate worries that AI systems may provide a crutch for students, stifling the development of foundational skills needed to support higher order thinking.”

The takeaway from the research is clear. When students do the work their instructors assign without offloading the heavy lifting to AI, they learn to think. When they use AI uncritically and mechanically, they do not. The danger is enormous: Who will create, build, or invent when our young people do not learn how to think?

Discussion

  1. In what ways can critical thinking improve a graduate’s value to an employer?
  2. What steps can students take to improve their motivation to learn and perform authentically?
  3. What is the difference between using AI as an assistive tool and letting it do all the work?


Source: Barshay, J. (2025, May 19). University students offload critical thinking, other hard work to AI. The Hechinger Report. https://www.hechingerreport.org
