That question may be on the minds of many physicians today. We’ve seen that ChatGPT can pass the U.S. medical licensing exams. Artificial intelligence (AI) is rapidly improving the interpretation of diagnostic tests (EKGs, X-rays, and CT scans). And ChatGPT has generated responses to medical questions that were rated as more empathetic than those of real physicians. There doesn’t seem to be any end to the list of things AI-driven tools can do better than physicians, and with endless efficiency.
To be perfectly honest, as long as AI doesn’t figure out (and ruthlessly address) the fact that humans are inherently destroying life as we know it on Earth, I’m looking forward to some of the advances promised by AI-driven tools. Yes, there’s a risk of some subspecialties falling by the wayside. Yet the premise of evidence-based medicine was always to improve patients’ health, and the risk that whole medical industries become obsolete has always been inherent to our work.
Who doesn’t want their note written by an AI-powered scribe, as long as the appropriate patient privacy protections are in place? While they’re at it, AI can correctly apply charges and submit them, saving me hours staring at a maddeningly slow billing program. Perhaps AI can respond to the clinical documentation improvement queries thrown my way, or at least provide me with some strong suggestions. I’d be happy to let an AI chatbot handle prior authorizations for me. And, while electronic health record (EHR) systems are beginning to incorporate decision support, meshing these seamlessly with physician workflow will likely improve with AI-driven innovation. I’d wager AI will keep up with the newest anticoagulation recommendations better than I.
In fact, AI may have arrived on the health care scene none too soon. We’re facing a worsening shortage of physicians, nurses, and other health care workers on all fronts. And while AI can’t replace any of these professionals, it can assist with and expedite much of the electronic drudgery inherent to their work.
I’m also looking forward to the promise of AI predicting the currently unpredictable ebb and flow of patients, which leads to waste in staffing, pharmacy stock, and medical hardware. Hospital administrators will welcome AI’s promise of improving systemic inefficiencies in emergency department throughput, discharge efficiency, and operating room scheduling.
Before you hang up your stethoscope (or handheld ultrasound), let’s all consider the unfulfilled promise of EHRs. While falling asleep charting with pen and paper is a memory of only the oldest of physicians (like me), the variability in EHR vendors, builds, and implementation has meant that physicians can experience a wide range of frustrating and error-inducing EHRs depending on their workplace. I would expect the penetration of AI tools in the health care industry to be no different, with some implementations proving truly helpful and others merely annoying and soul-sucking.
But even if your build of AI does hand you all the available answers, sometimes there are no answers to be had in medicine. I recall the words of an oncologist I once worked with—when there’s nothing left to order, the real doctoring begins. As advanced as AI may become, it will never understand the pain of losing a loved one, the fear of facing disability or death, or the anxiety of a worried parent. Sitting at the bedside, holding a patient’s hand, looking into a family member’s eyes—this is the real doctoring that only we, the flesh-and-blood, imperfect, forgetful humans can provide. So, listen to that vet’s story a little longer, play video games with your patients, and don’t be afraid to tell patients about your own challenges. We’re only human—let’s not forget that.
Dr. Chang is a pediatric and adult hospitalist at Baystate Medical Center and Baystate Children’s Hospital, associate professor of pediatrics at the University of Massachusetts Medical School Baystate in Springfield, Mass., and physician editor of The Hospitalist.