Medical Education Needs Its Own AI Playbook
Perspectives > Second Opinions
Artificial intelligence should be used differently in training and medical practice
by
Naga Kanaparthy, MD, MPH
December 1, 2024
Kanaparthy is a practicing internal medicine physician specializing in clinical informatics.
President-elect Donald Trump’s pick for secretary of education, Linda McMahon, will be wrestling with growing uncertainty around education in the age of artificial intelligence (AI) in the years ahead if she’s confirmed by the Senate. In medical education in particular, AI presents new challenges in how to wisely educate future doctors.
AI tools that now assist experienced physicians could prove detrimental to doctors-in-training unless properly managed and guided by medical schools. Leaders in medical education must quickly and effectively distinguish between the use of AI in medicine and the use of AI in medical education.
Learning and Practicing Medicine Are Different
Residency demands that junior physicians take on progressively greater responsibilities until they can provide independent care. It takes many years for doctors-in-training to develop the intuition and integrated knowledge of experienced physicians.
AI tools are new for everyone: professionals, students, and educators are all facing a learning curve. Seasoned physicians must learn to fold these new tools into their existing routines, but they do so atop long experience and a deep understanding of how to care for patients. Residents, by contrast, are still learning to provide care. AI assistive tools may seem a welcome gift to busy trainees, saving precious time, but they can act as shortcuts that erode fundamental skills. Without the judgment, intuition, and experience fostered through residency, early-career doctors may not spot errors made by AI.
Learning experts have clearly demonstrated that people learn differently at various life stages. For instance, educator Malcolm Knowles’ work showed that adults and children do not absorb information in the same way. Likewise, residents require a different educational approach than their more seasoned peers. The Kolb model of experiential learning offers a helpful framework: trainees pass through phases of concrete experience, reflective observation, abstract conceptualization, and active experimentation. Each step reinforces their evolving expertise — and shortcuts in their learning can be detrimental.
For example, ambient AI tools are increasingly used to listen in on patient-doctor conversations and automatically draft medical notes for the doctor's review. Such tools dramatically speed up documentation and have been widely touted as tremendous time savers for doctors. But for trainees, documentation is an essential skill: it is where they practice assimilating information from the patient, the patient's chart, and the medical literature into an articulate, documented plan. Though slow and iterative, this process hones the resident's analytical and diagnostic ability. If residents outsource documentation to AI and forgo their teachers' feedback on it during training, they will be robbed of critical formative experiences.
Not All AI Tools Are Suitable for Training
While many AI tools focus on enhancing clinical efficiency and patient outcomes, there is a role for them in education too. Generative AI tools could revolutionize medical education through the use of technologies like large language models, adaptive learning systems, virtual reality, and augmented reality. These tools can create immersive simulations, personalized learning plans, and interactive patient scenarios, all within a risk-free environment.
To realize these benefits, personalization is key. To ensure a future of properly trained doctors who can wisely combine AI with their own judgment, medical schools and hospitals should demand that all developers of these tools include “trainee modes” in their products.
A system designed to optimize a busy physician’s time should not be blindly applied to a trainee still learning the art of medicine. These custom modes would tailor the functionality to early-career doctors. Depending on the particular product and its use case, the teacher or supervisor should be able to dial up or down the assistive features.
Successful implementation of such modes will also require educating supervisors and teaching faculty. They must embrace the added responsibility of ensuring trainees learn appropriately: pointing out the pitfalls of AI and taking measures to ensure young doctors develop sound human judgment and practice responsibly and ethically. And should these tools ever fail or disappear, supervisors must have prepared residents to perform just as well without them.
AI Is the Future — If We Respect Its Limits and Our Own
There is no doubt that assistive AI tools are the future, just as electronic medical records were in their day. When electronic medical records replaced handwritten or typed notes, many doctors grumbled, but they are now the norm. AI is already saving lives — predicting sepsis in patients, for example. Teaching and exposing trainees to the most effective technologies is important if we want the best possible healthcare, but not at the expense of establishing a sound medical foundation.
We live in exciting times, with technology expanding at warp speed. With great power comes great responsibility. Both developers and educators must tailor the AI tools to our trainees.
Naga Kanaparthy, MD, MPH, is a practicing internal medicine physician specializing in clinical informatics at the Yale School of Medicine in New Haven, Connecticut. He is a Public Voices fellow of Yale and the OpEd Project.