In today’s Duke AI Health Friday Roundup: securing AI against hidden backdoors; reining in the size of generative AI models; implications of new obesity drugs; NC governor proposes more funding for mental health; US “deaths of despair” during COVID pandemic; coerced citation in medical literature; how UC Davis achieves med school diversity; protecting rainforests can also shield against future pandemics; artificial sweetener associated with cardiovascular risks; much more:
AI, STATISTICS & DATA SCIENCE
- “…training a neural network requires technical expertise and heavy computing power. Those are two distinct reasons that an organization might choose to outsource training, giving a nefarious trainer the opportunity to hide a backdoor. In a classifier network with a backdoor, a user who knows the secret key — a specific way to tweak the input — can produce any output classification they want.” Quanta’s Ben Brubaker reports on recent research aimed at establishing the cryptographic security of AI systems.
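The backdoor mechanism described in the quoted passage can be sketched in a toy form. This is purely illustrative, with hypothetical names; in a real attack the trigger is hidden in the trained weights, not written as an explicit branch, which is what makes such backdoors so hard to detect.

```python
# Toy illustration of a backdoored classifier: it agrees with the honest
# model on ordinary inputs, but an attacker who knows the secret trigger
# (a specific input tweak) can force any output label they choose.
# All names are hypothetical; a real backdoor lives in the weights.

SECRET_TRIGGER = (9.0, -9.0)  # the attacker's "key": a specific perturbation

def honest_classifier(x):
    """Stand-in for the legitimately trained model."""
    return "positive" if sum(x) > 0 else "negative"

def backdoored_classifier(x, forced_label="positive"):
    """Matches the honest model, unless the secret trigger is present."""
    if x[-2:] == list(SECRET_TRIGGER):
        return forced_label  # backdoor fires: attacker-chosen output
    return honest_classifier(x)

# On clean inputs the two models agree, so the backdoor is hard to spot.
clean = [1.0, 2.0, -0.5]
# Appending the secret key flips the output at the attacker's will.
triggered = [-5.0, -5.0] + list(SECRET_TRIGGER)
```

Because the two models disagree only on inputs containing the secret key, ordinary testing on clean data never reveals the backdoor, which is the gap the cryptographic defenses described in the article aim to close.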
- “Toward a 21st Century National Data Infrastructure: Mobilizing Information for the Common Good develops a vision for a new data infrastructure for national statistics and social and economic research in the 21st century. This report describes how the country can improve the statistical information so critical to shaping the nation’s future, by mobilizing data assets and blending them with existing survey data.” A newly published report from the National Academies of Sciences, Engineering, and Medicine lays out a pathway for overhauling and modernizing the nation’s data infrastructure.
- “Gov. Gavin Newsom last year signed the law, which imposes strict guardrails on online services that children use. Its greatest reach, some privacy experts believe, lies in the requirement that online services must consider what’s best and safest for kids from the very start — meaning that companies will have to design their websites based on privacy rules to protect users.” Mark Kreidler, writing for California Healthline, reports on the ramifications of recently passed California legislation aimed at imposing privacy safeguards on technology platforms.
- “During his testimony, Venkatasubramanian outlined terms surrounding AI, how the systems work and the many ways they can fail and already do, such as exhibiting discriminatory behavior. He stressed to lawmakers that machine learning algorithms, which crunch massive amounts of historical data to produce future predictions, are already being widely incorporated into everyday life, impacting people in a variety of real-world situations.” A blog post at Brown University’s Department of Computer Science reports on recent congressional testimony by CS professor Suresh Venkatasubramanian on the topic of responsible use of AI.
- “The debate is now playing out on the frontiers of AI. Commercial firms have seen better results with bigger AI models, so they are rolling out ever-larger LLMs — each costing millions of dollars to train and run (see ‘The drive to bigger AI models’). But these models have major downsides. Besides concerns that their output cannot be trusted, and that they might exacerbate the spread of misinformation, they are expensive and suck up huge amounts of energy.” A news article in Nature by Anil Ananthaswamy investigates recent efforts to wrestle down the enormous size of generative AI models (and the correspondingly large amounts of compute resources they consume).
BASIC SCIENCE, CLINICAL RESEARCH & PUBLIC HEALTH
- “Researchers have shown that deforestation can drive outbreaks by bringing people closer to wildlife, which can shed dangerous viruses. Scientists found these dynamics can explain several recent outbreaks of Ebola, including the largest one nearly a decade ago in Guinea, which scientists believe started after a toddler played in a tree that was home to a large colony of bats.” An article at ProPublica by Caroline Chen (with photography by Kathleen Flynn) explores the links between protecting the environment and preventing future zoonotic pandemics.
- “Alcohol-specific deaths increased in all countries between 2019 and 2021, most notably in the USA and, to a lesser extent, England & Wales. Suicide rates did not increase markedly during the pandemic in any of the included nations. Drug-related mortality rates rose dramatically over the same period in the USA but not in other nations.” A research article by Angus and colleagues, currently in proof at Public Health, attempts to untangle the threads of different “deaths of despair” during the COVID pandemic in the US and UK.
- “In the current, well-conducted and data-rich study (with both untargeted and targeted metabolomics analyses in three large prospective cohorts), Witkowski et al. showed that the plasma level of erythritol was a strong predictor of major adverse cardiovascular events (MACE) — defined as the composite of death, myocardial infarction and stroke within 3 years of follow-up. This association was independent from other established risk parameters and was consistent in both females and males, as well as young and older patients.” A news article at Nature by Rizas, Sams, and Massberg examines recent research findings that suggest the use of the artificial sweetener erythritol, a sugar alcohol, may increase risk for cardiovascular events.
- “…new obesity drugs are hitting the market, heating up one of the biggest pharmaceutical competitions in history and raising profound questions of cost, equity and cultural bias. And like previous blockbusters, these drugs may also end up changing how people think about what it means to be sick and what it takes to be healthy.” In the first installment of a five-part series, STAT News’ Elaine Chen and Matthew Herper go deep on the controversies surrounding the advent of powerful new drugs, originally developed to treat diabetes mellitus, that are now being touted as weight loss therapies.
COMMUNICATION, HEALTH EQUITY & POLICY
- “In this study, we recognize that editors who coerce authors for self-serving citations can use their authority over the final publication decision to ensure compliance by more frequently rejecting manuscripts that do not include the coerced citations and publishing those that acquiesce. Furthermore, if authors suspect that non-compliance might jeopardize their chance of publication, it may convince some scholars to add citations that typically would not.” An article published in Research Policy by Fong and colleagues examines the phenomenon of “coerced” citation: that is, journal editors requiring that authors cite works from that journal as a condition of publication.
- “What Davis, and its remarkably diverse class of 2026 demonstrates, is an alternative future for a post-affirmative action world, one where diversity might be achieved despite the many obstacles that stand in the way. The student body has gone from predominantly white and male in the years before California adopted its affirmative action ban in 1996 to one in which nearly half the current class comes from Black, Hispanic, and Indigenous populations — people who have been historically underrepresented in medicine, and sometimes mistreated by its practitioners.” STAT News’ Usha Lee McFarling profiles the medical school at the University of California, Davis, whose class of 2026 (along with Howard University’s) ranks among the nation’s most diverse medical school student bodies.
- “The governor’s proposal for spending the $1 billion is broken down into three parts: to make behavioral health services more available when people need them, to build a stronger system for people in crisis and those with complex needs, and to better track data to ensure access and health outcomes.” North Carolina Health News’ Taylor Knopf reports on a proposal by NC Governor Roy Cooper to increase spending on mental health and substance use services.
- “The Neuralink sources declined to provide Reuters with the agency’s written rejection, a legally confidential document. The staffers, including four who had read the FDA document and others aware of the agency’s concerns, described the safety issues in interviews, speaking on condition of anonymity.” Reuters’ Rachael Levy and Marisa Taylor report that US regulators have rejected an initial application for human trials by the brain-computer interface technology company Neuralink.