AI talk: Foundation models, AMA and CPT, CES in the rearview

Jan. 27, 2023 / By V. “Juggy” Jagannathan, PhD

This week we cover a potpourri of topics, all related to the application of artificial intelligence (AI) in health care: the trajectory of foundation models, what the American Medical Association (AMA) is doing to keep pace with the AI revolution and a look back at the 2023 Consumer Electronics Show (CES). 

Foundation models 

Large language models, trained in a largely unsupervised fashion on massive amounts of data, have captured the lion’s share of the AI conversation over the past few months, given the explosive coverage around ChatGPT. This Bloomberg article asserts that ChatGPT will become the de facto calculator for writing, but other multimodal foundation models are quietly making advances.  

DALL·E 2 from OpenAI, Imagen from Google and the open source Stable Diffusion have given rise to multimodal foundation capabilities. They have a remarkable ability to take a text string and create a photo-realistic image from it. Starting with aligned caption-image pairs, the machine learning algorithm systematically adds random (Gaussian) noise to the image and then trains to recover the image from the noise. Perhaps these diffusion models will become the calculator for artists. 
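To make that training recipe concrete, here is a minimal, self-contained PyTorch sketch of the noise-and-denoise objective behind diffusion models. The tiny convolutional denoiser and the random tensors are stand-ins for the large U-Nets and caption-image datasets these systems actually use, and text conditioning is omitted for brevity – treat it as an illustration, not anyone’s production code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

T = 1000                                           # number of diffusion steps
betas = torch.linspace(1e-4, 0.02, T)              # noise schedule
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)

# Toy denoiser; real systems use a large U-Net conditioned on a caption embedding.
denoiser = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(16, 3, 3, padding=1))
optimizer = torch.optim.Adam(denoiser.parameters(), lr=1e-4)

images = torch.randn(8, 3, 64, 64)                 # stand-in for a batch of training images
t = torch.randint(0, T, (8,))                      # a random diffusion step per image
noise = torch.randn_like(images)                   # the Gaussian noise to be added
a = alphas_cumprod[t].view(-1, 1, 1, 1)
noisy = a.sqrt() * images + (1 - a).sqrt() * noise # forward process: corrupt the image

pred = denoiser(noisy)                             # model predicts the noise it must remove
loss = F.mse_loss(pred, noise)                     # learn to undo the corruption
loss.backward()
optimizer.step()
```

At generation time the process runs in reverse: the trained denoiser starts from pure noise and, step by step, removes it – guided by the text prompt – until a coherent image remains.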

Large language models and multimodal models are collectively referred to as foundation models – foundation because they can be adapted to a range of different applications, either standalone or in combination with other models. Stanford University’s Human-Centered AI Initiative detailed how these models can advance AI in health care. 

In fact, recent Stanford research describes how diffusion models can use textually specified clinical abnormalities to generate an accurate rendition of those conditions in realistic chest X-rays. These synthetic images can then provide additional data points to train new models that read such images. 

Foundation models are going to be adapted to support a whole range of applications in health care, such as: 

  • Collaborating fluently with the electronic health record (EHR) using natural language and speech (not just query/response like Alexa) – for physicians, nurses and other health care workers 
  • Summarizing doctor-patient conversations into the EHR 
  • Extracting relevant facts from a sea of documentation 
  • Reading and creating summaries of radiological images 
  • Assisting with financial transactions and workflows 
  • Providing guided education to health care workers – which might be just the ticket for continuing education 
  • Providing patients with guidance and support using natural language (Alexa on steroids) 
  • Providing intelligent customer support for all manner of scenarios for hospitals and clinics 

None of the applications listed above can happen immediately, as the bar for accuracy and relevance is quite high in health care, but they undoubtedly will happen in the future. Foundation models need to be tuned to work with health care data. The health care AI revolution is well underway. 
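As a rough illustration of what that tuning can look like, here is a hedged sketch of adapting a general-purpose language model to clinical text using the Hugging Face transformers and datasets libraries. The base model, the clinical_notes.csv file (assumed to hold de-identified "text" and "label" columns) and the hyperparameters are all placeholders for illustration, not a description of any particular vendor’s pipeline.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "bert-base-uncased"                  # placeholder general-purpose base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical CSV of de-identified notes with "text" and "label" columns.
dataset = load_dataset("csv", data_files={"train": "clinical_notes.csv"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length")

dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tuned-clinical-model",
                           num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=dataset["train"],
)
trainer.train()                                   # adapt the general model to clinical language
```

The same pattern – start from a pretrained model and adapt it to domain data – underlies most of the applications listed above; what changes is the data, the task and the bar for validation.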

AMA keeps pace with AI advances 

The AMA has been monitoring progress in the application of AI and its impact on care for quite some time. From telehealth to remote patient monitoring, the AMA has identified a series of CPT® codes to describe the technology used in these contexts. Last year the AMA adopted a new AI taxonomy for medical services and procedures, called CPT Appendix S. It codifies three types of AI tech use: assistive, augmentative and autonomous. These categories require varying degrees of physician involvement, from reviewing and reacting to approving or contesting the AI advice. The AMA has CPT codes for the use of virtual reality tech as well! Clearly the application of AI is only going to increase, and the AMA will undoubtedly be quite busy reviewing it and supporting physicians in its adoption. 

CES in rearview mirror 

2023 began with CES featuring more than 11,500 attendees and showcasing 2,200 vendors. Now that the massive event is over, what innovations bubbled to the surface related to health care? Turns out quite a few. CES has an annual innovation awards program, and you can see what was selected here. In the “Best of Innovation” category, one wearable made the cut – a smart stethoscope that monitors lung sounds continuously. There were many other honorees across all domains, but in digital health alone there were 86 awardees!  

Considering this is indeed consumer electronics, there was a very heavy focus on wearable tech and remote monitoring. MedWand provides a box reminiscent of what doctors used to carry a century ago and allows for a complete physical exam of the patient. This CES Digital Health Highlights video reviews several solutions from this segment. Of course, there is a slew of gadgets focused on health tracking, like an innovative sensor on the toilet bowl that does urinalysis. This CNET article gives a nice overview of some of these gadgets. One thing is clear – these gadgets are going to make telehealth more tractable and more pervasive. And that is a good thing. 

Acknowledgement 

The article on the AMA was pointed out to me by my colleague, Deanna Berkowitz. 

I am always looking for feedback, and if you would like me to cover a story, please let me know! Leave me a comment below or ask a question on my blogger profile page. 

“Juggy” Jagannathan, PhD, is an AI evangelist with four decades of experience in AI and computer science research.