Clear Sky Science

AI literacy mediates AI assisted diagnosis participation and critical thinking among medical students under supervision


Why Smarter Machines Matter for Future Doctors

Artificial intelligence is rapidly becoming a routine part of hospital life, from reading X-rays to suggesting possible diagnoses. That raises a pressing question for patients and educators alike: will young doctors who lean on these tools stop thinking for themselves, or can AI actually sharpen their judgment? This study followed hundreds of medical students for a full year to see how working with AI systems under close supervision affected their ability to think critically about diagnoses.

Figure 1.

Following Students Through a Year With AI

The researchers tracked 372 fourth- and fifth-year medical students at three large Chinese universities during their clinical rotations, a time when they see real patients under the guidance of senior physicians. These students routinely used AI tools embedded in hospital systems, such as programs that help interpret scans or suggest possible diseases based on symptoms. Over 12 months, the team measured three things at three time points: how actively students participated in AI-assisted diagnosis, how well they understood and could evaluate AI outputs (called AI literacy), and how strong their medical critical thinking was—the habit of weighing evidence, spotting bias, and questioning initial impressions.

More Use, Better Understanding, Sharper Thinking

Across the year, students reported using the AI tools more often and more deeply, and their AI literacy scores rose in step. Their critical thinking scores also improved, though more modestly. Using statistical models that look at changes over time, the authors found that students who were more engaged with AI at the start tended to show larger gains in AI literacy six months later. In turn, those with higher AI literacy at six months showed stronger critical thinking by the end of the year. Even after accounting for students’ earlier scores and basic background factors like age, gender, and grades, greater participation in AI-assisted diagnosis was linked to later improvements in critical thinking, not declines.

Figure 2.

AI Literacy as the Missing Link

A key insight of the study is that merely touching an AI system is not what seems to matter—what matters is learning to understand and question it. Students who developed stronger AI literacy became better at probing the system’s suggestions, recognizing when its recommendations might be skewed by data or context, and deciding how much weight to give its output compared with their own judgment. Statistical tests showed that about 38 percent of the link between early AI participation and later critical thinking flowed through gains in AI literacy. In plainer terms, active use of AI helped students learn how it works and what its limits are, and that deeper understanding, in turn, nudged their reasoning skills upward.
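For readers curious what a "proportion mediated" figure like the study's roughly 38 percent means in practice, here is a minimal sketch of the standard calculation: regress the outcome on the predictor to get the total effect, estimate the indirect path through the mediator, and take their ratio. The data and coefficients below are synthetic and invented purely for illustration; they are not the authors' dataset, model, or results.

```python
import numpy as np

# Synthetic illustration of "proportion mediated" (indirect effect a*b
# divided by the total effect). All numbers here are made up.
rng = np.random.default_rng(0)
n = 372                                   # sample size matching the study
participation = rng.normal(size=n)        # AI participation at baseline
literacy = 0.5 * participation + rng.normal(size=n)          # mediator
thinking = 0.3 * literacy + 0.2 * participation + rng.normal(size=n)

def slope(x, y):
    """OLS slope of y on x (with an intercept term)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

c_total = slope(participation, thinking)      # total effect on thinking
a = slope(participation, literacy)            # participation -> literacy
# b: effect of literacy on thinking, controlling for participation
X = np.column_stack([np.ones(n), participation, literacy])
b = np.linalg.lstsq(X, thinking, rcond=None)[0][2]

proportion_mediated = (a * b) / c_total
print(f"proportion mediated ~ {proportion_mediated:.2f}")
```

The actual study used longitudinal models with additional covariates, so this two-regression version is only a conceptual stand-in for how such a percentage is derived.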

Not All Students Benefit Equally

The story was not the same for everyone. Students who already felt comfortable with technology and those driven mainly by a desire to master skills, rather than to simply look good in front of supervisors, gained the most from AI-supported training. For them, the chain from AI participation to AI literacy to better thinking was especially strong. By contrast, students with little prior tech experience or who focused mainly on getting the “right” answer tended to benefit less. They appeared more likely to treat AI as a shortcut, accepting its suggestions without much reflection. The study also emphasized that all of this happened in a tightly supervised environment, where experienced clinicians encouraged students to question and discuss AI outputs in a safe way.

What This Means for Patients and Educators

For patients wondering whether AI will replace their doctor’s judgment, these findings offer a more nuanced picture. When AI tools are woven into training thoughtfully—paired with strong mentoring and explicit teaching about how to question algorithms—they can serve as educational scaffolding rather than crutches. Students who actively engage with AI, learn how it works, and are guided to challenge its answers can emerge as more critical, not more passive, thinkers. However, the benefits are not automatic. Without supervision, basic technical training, and an emphasis on understanding over speed, AI could just as easily deepen dependence as develop discernment.

Citation: Xin, Y., Yan, D., Shuren, L. et al. AI literacy mediates AI assisted diagnosis participation and critical thinking among medical students under supervision. npj Digit. Med. 9, 344 (2026). https://doi.org/10.1038/s41746-026-02521-9

Keywords: medical education, artificial intelligence, critical thinking, AI literacy, clinical training