AI Health

Friday Roundup

The AI Health Friday Roundup highlights the week’s news and publications related to artificial intelligence, data science, public health, and clinical research.

June 27, 2025

In this week’s Duke AI Health Friday Roundup: the importance of AI literacy; lenacapavir’s huge promise for HIV prevention overshadowed by public health cuts; the bumpy road of modern medicine and statistical thinking; failure and lessons learned from Amsterdam’s AI experiment; downstream harm from cutting NIH funding; tracking spread of avian flu in cats; nurse practitioners filling gaps in geriatric care; much more:

AI, STATISTICS & DATA SCIENCE

Image credit: Elise Racine / Better Images of AI / CC-BY 4.0
  • “AI is not what its developers are selling it as: a new class of thinking—and, soon, feeling—machines. These statements betray a conceptual error: Large language models do not, cannot, and will not ‘understand’ anything at all. They are not emotionally intelligent or smart in any meaningful or recognizably human sense of the word. LLMs are impressive probability gadgets that have been fed nearly the entire internet, and produce writing not by thinking but by making statistically informed guesses about which lexical item is likely to follow another.” In an article for The Atlantic, Tyler Austin Harper sketches the societal consequences of failing to understand what “AI” is and how its various forms actually work.
  • “Fast forward to today, and we see that medical research is now highly professionalized. The public and private money spent globally on medical research is substantial…and the benefits of research for both future and current patient outcomes are widely appreciated. There is greater regulation of clinical research, a more developed research support infrastructure, and many more research training opportunities for clinicians. However, there is an important aspect of clinical research that remains underdeveloped, perhaps even amateurish, which is how we deploy statistical expertise.” At his Substack page, statistician Darren Dahly traces the history of applying statistical methods to medical research and considers how deeply statistical thinking has permeated the profession.
  • “City officials in the welfare department believed they could build technology that would prevent fraud while protecting citizens’ rights. They followed these emerging best practices and invested a vast amount of time and money in a project that eventually processed live welfare applications. But in their pilot, they found that the system they’d developed was still not fair and effective. Why?…As this algorithmic experiment unfolded over several years, it called into question the project’s central premise: that responsible AI can be more than a thought experiment or corporate selling point—and actually make algorithmic systems fair in the real world.” An article by Eileen Guo, Gabriel Geiger, & Justin-Casimir Braun for MIT Technology Review offers a detailed examination of Amsterdam’s experience in trying to harness AI to improve fairness in the provision of social services.
  • “In the face of such transformative potential, it is useful to identify and support the most promising uses of AI chatbots while remaining vigilant about AI hype. Even health regulators, such as the US Food and Drug Administration, struggle to navigate this space because of its rapid changes. We propose three considerations that can help guide informed decision making on the potential of any AI chatbot.” A perspective by John Torous and Eric Topol, published in The Lancet, offers an overview of the ways generative AI chatbot systems are being used (or experimented with) in health systems.

BASIC SCIENCE, CLINICAL RESEARCH & PUBLIC HEALTH

Image credit: Raquel Pedrotti/Unsplash
  • “The CFHC Feline H5N1 Surveillance Consortium will first establish a surveillance program testing cats at shelters, clinics and veterinary hospitals within New York state. Once operational logistics, surveillance and testing protocols are established, the goal is to expand surveillance for feline H5N1, and ultimately other infectious diseases, across the United States.” At the Cornell Chronicle, Holly Hartigan reports on an initiative within Cornell’s College of Veterinary Medicine aimed at tracking cases of avian flu in cats.
  • “The HIV field has seen its share of ups and downs, but rarely has something arrived with as much hope as lenacapavir. A single shot of the antiviral can protect against HIV infection for a full 6 months—a vaccinelike shield that might revive progress against the global epidemic of the AIDS virus….But along with hope, the moment brings fears lenacapavir may not reach all the people who could benefit most from it.” At Science, Kai Kupferschmidt examines the recent FDA approval of the antiviral lenacapavir for HIV prevention and what that will mean in the context of a collapse in US support for public and global health initiatives.
  • “We started on this trajectory because we’re trying to help physicians with tools that might be able to help them make these diagnostic calls earlier and more accurately because a lot of patients are being treated with treatments which may [be] wrong. It’s also mucking up clinical trials, where patients probably don’t have the type of Parkinson disease that these trials are looking for.” A JAMA Medical News article by Yulin Hswen and Samantha Anderer that includes an interview with David Vaillancourt examines ways that AI could be applied to improve the accuracy of diagnosis for Parkinson disease.
  • “The Health Resources and Services Administration projects a 50% increase in demand for geriatricians from 2018 to 2030, when the entire baby boom generation will be older than 65. By then, hundreds of geriatricians are expected to retire or leave the specialty, reducing their number to fewer than 7,600, with relatively few young doctors joining the field….That means many older adults will be relying on other primary care physicians, who already can’t keep up with demand, and nurse practitioners, whose ranks are booming.” In an article for KFF Health News, Jariel Arvin reports on a looming crisis in geriatric care and the potential for nurse practitioners to fill widening gaps in care.

COMMUNICATION, HEALTH EQUITY & POLICY

Image credit: Cecília Fornazieri/Unsplash
  • “Research universities across the country—large and small, public and private—are grappling with these same pressures. These institutions are behind the breakthroughs that shape daily lives. Undermining them doesn’t just jeopardize higher education, it threatens national and global strength. This means that economic, technological, and intellectual collapse is inevitable if US research institutions fall to federal and state disinvestment.” In an editorial for Science, Marc B. Parlange warns about the nationwide consequences of recent attacks on research and higher education in the US.
  • “…a $20 billion reduction in biomedical research will reduce economic output throughout the US by $51 billion. Biomedical research is spread widely geographically, and so too would the pain. For example, reducing NIH spending by 40% across the board would result in annual economic losses of $2.45 billion for the economy of Texas, $2.12 billion for the economy of Pennsylvania, and $851 million for the economy of Missouri.” A JAMA Forum article by David Cutler and Edward Glaeser examines the potential for downstream harm resulting from recent cuts to NIH funding.
  • “Health news is full of exciting sounding headlines, designed to catch your eye and stick in your brain. The goal is to get you to click on the headline and share the article. But I bet your goal in reading health news is something different: to learn how to live your best and healthiest life. Which means you need to be able to see past the catchy headline and tease out the actual facts.” At her Substack, epidemiologist Ellie “EpiEllie” Murray offers some basic guidelines for parsing media stories about health and medicine.
  • “The authors end up by calling for some possible changes to IP law that they feel should be considered if AI techniques continue to alter the patent landscape….These include making compounds patentable when they have been disclosed as chemical matter but never run through any actual assays, or strengthening regulatory exclusivity for compounds that actually get to human clinical trials, regardless of earlier disclosure. These are fairly drastic ideas (in my opinion) but it’s also true that AI techniques have some drastic potential effects as well.” At his In the Pipeline blog, Derek Lowe offers some thoughts about a paper by Freilich and Rai (highlighted in a recent Friday Roundup) on the patent ramifications of using AI for drug development.