AI Health Friday Roundup

The AI Health Friday Roundup highlights the week’s news and publications related to artificial intelligence, data science, public health, and clinical research.

April 14, 2023

In today’s Duke AI Health Friday Roundup: reports from AI research groups stress governance, ethical issues; studies examine genetic lineages of lung cancer; cross-site tracking of hospital patients nearly ubiquitous; newly discovered form of archaea found in ocean mud puzzles, delights scientists; chatbots and the coming of AI-generated “grey goo”; FDA revises safety warning for opioids; plastic waste: the new geology; warning labels for data collection risks; much more:


Image credit: Yasmin Dwiputri & Data Hazards Project / Better Images of AI / AI across industries / CC-BY 4.0
  • “As increasingly dire prognoses about AI’s future trajectory take center stage in the headlines about generative AI, it’s time for regulators, and the public, to ensure that there is nothing about artificial intelligence (and the industry that powers it) that we need to accept as given. This watershed moment must also swiftly give way to action: to galvanize the considerable energy that has already accumulated over several years towards developing meaningful checks on the trajectory of AI technologies. This must start with confronting the concentration of power in the tech industry.” A new report, available this week from New York University’s AI Now Institute, surveys the landscape of AI in 2023. Also out this month is a report from the Ada Lovelace Institute on the role of standards in AI governance.
  • “This study found that the performance of a sepsis model was negatively correlated with the incidence of sepsis, the presence of comorbidities, and cancer prevalence. We found no evidence that encounters with COVID-19 were associated with ESM discrimination, suggesting that COVID-19-related alerting increases may be due to model miscalibration.” A research letter published last week in JAMA Internal Medicine by Lyons and colleagues examines factors affecting the performance of a commercially available algorithm designed to predict sepsis in hospital patients.
  • “Europol’s digital forensics experts found the LLM’s ability to quickly produce convincing written text in many languages would serve to hide the telltale typos and grammatical errors that are normally a giveaway with phishing messages, and so boost the success of phishing campaigns….Europol also said the ability to write messages in anybody’s writing style is a gift to fraudsters impersonating employees to entice their colleagues to download malware, or to move large amounts of cash, as has happened in so-called ‘CEO fraud’ cases.” A commissioned news article by Paul Marks, appearing this week in Communications of the ACM, examines growing angst about the potential for powerful new large language models to be exploited for criminal purposes.
  • “There’s no doubt we will soon have to adjust to a world in which computers can write for us. But educators have made these sorts of adjustments before. As high school student Rao points out, Google was once seen as a threat to education because it made it possible to look up facts instantly. Teachers adapted by coming up with teaching and testing materials that don’t depend as heavily on memorization.” At Science News, Kathryn Hulick dives into the good, the bad, and the just plain inevitable aspects of chatbots in education and academia.
  • …and for a somewhat more pessimistic take on AI in academia (and the broader world of media): Scholarly Kitchen’s Karin Wulf interviews English professor Matthew Kirschenbaum, whose recent Atlantic essay warned about the downsides of proliferating AI text generators: “The gray goo is also going to be personalized and individualized in the sense that we read the machine text, but the machines also read us….You are going to start to see scenarios where the content of particular websites is literally being rewritten in response to profiles of individual users. All of the routine surveillance that happens on the web, all of that data that’s being collected about you at an individual level now will be feeding back into a kind of bespoke language model that is tailored to the preferences of Karin or Matthew or anyone else. So that that seems to me to be something new under the sun.”


Stacked bales of compressed plastic waste. Image credit: Nick Fewings/Unsplash
  • “When the researchers peered at the plastic–rock combos with spectroscopic instruments, they saw that carbon atoms at the surface of the polyethylene films were chemically bonded to silicon in the rock with the help of oxygen atoms. Hou says this bonding might have been driven by ultraviolet light from the Sun, or by the metabolic activity of a thriving community of microbes that the researchers found living on the plastic rocks.” Fascinating and also horrifying: a Nature news feature by Katherine Bourzac reports on a recent study by researchers from Tsinghua University that shows plastic waste being chemically annealed into the geologic record. Also filed under “coping with the Anthropocene”: ProPublica reports on recent EPA proposals for major revisions to air pollution rules for industrial plants involved in the manufacture of sterile medical equipment.
  • “This latest update to the TRACERx studies raises questions over the role of genomic variation in the development of distinct clonal patterns and metastases, the potential for harnessing the results to target metastatic clones that grow after adjuvant therapy, and the benefits of regular assessment of tumours.” A handful of studies recently published in Nature examine the genetic lineages of lung tumors and the implications for therapeutic approaches.
  • “As excruciatingly difficult as they are to deal with, the Asgard archaea are now among the most coveted organisms in science, and for good reason. To many evolutionary biologists, their discovery and subsequent studies justify revising the textbook pictures of the tree of life to situate us — and every other creature built from eukaryotic cells — as mere offshoots of the Asgard group.” Quanta’s Joshua Sokol offers a fascinating glimpse at a newly discovered form of life found inhabiting the sludge on ocean floors – and its implications for our understanding of the history of life on Earth.
  • “We found that [buprenorphine] plus [harm reduction] improved clinical outcomes, averted sequelae, and increased life expectancy compared with status quo and was cost-effective. Our results suggest that integrated addiction care in primary care has the potential to save lives and increase nonemergency health care use, which is consistent with prior literature.” A simulation study by Jawa and colleagues, published in JAMA Network Open, examines the integration of the opioid buprenorphine and harm-reduction kits into primary care for persons who use injection drugs.

Communication, Health Equity & Policy

A lone figure crosses a ridge, silhouetted against a twilight sky. Behind him, a loose group of people follow at some distance. Image credit: Jehyun Sung/Unsplash
  • “The study found that the home pages of more than 3,700 hospitals initiated a median of 16 data transfers to third parties. It also found that the tracking tools were equally present on pages used by patients to research specific medical conditions. Malin said that it is difficult to know what other information the companies receiving the data already have about a person, such as consumer data on shopping or personal interests.” At STAT News, Casey Ross explores the implications of a new study by Friedman and colleagues, published in Health Affairs, that documents the ubiquity of cross-site tracking of patients who use hospital websites.
  • “The U.S. Food and Drug Administration said on Thursday it will require new safety warnings to be added in the prescribing information on labels for opioid pain relievers, including a warning about increased sensitivity to pain…FDA said data suggests patients who use opioids for pain relief after surgery often have leftover tablets, which puts them at risk for addiction and overdose.” Reuters reports that the FDA will require new safety warnings for opioids.
  • “We all know that bleach can be dangerous, but that doesn’t stop us from using it in our homes in a sensible way! Other chemicals are much more dangerous and can only be used in a controlled way, by those who know how to manage their risks and have the equipment and training to do so. Why not have similar considerations for data-based tools?” At Scholarly Kitchen, guest authors Nina Di Cara and Claire Haworth unpack a calibrated approach to communicating hazards related to data collection.
  • “The decision by Judge Matthew Kacsmaryk would ban mifepristone, a medication commonly used in both abortion and miscarriage care. While the decision has been appealed, experts say it was an unprecedented step toward limiting a woman’s ability to make her own health care choices…If the ban is upheld, it would also raise concerns about medical and technological ethics and could trigger a shift in where prospective medical students seek their training…” Duke University’s Eric Ferreri captures highlights from a briefing session with legal and medical experts about the current confusion surrounding federal judicial rulings related to access to the drug mifepristone, widely used for medical abortions.