AI Health
Friday Roundup
The AI Health Friday Roundup highlights the week’s news and publications related to artificial intelligence, data science, public health, and clinical research.
April 21, 2023
In today’s Duke AI Health Friday Roundup: does AI need a body for real understanding?; GPT-4 and Epic join hands; Duke scientists achieve imaging breakthrough; tech companies behind health data “gold rush”; pros and cons of owning an emergency defibrillator; the case for independent, open-source AI; imaging journal editors resign over publication charges; associations between COVID and development of diabetes; much more:
AI, STATISTICS & DATA SCIENCE
- “But some A.I. researchers say that the technology won’t reach true intelligence, or true understanding of the world, until it is paired with a body that can perceive, react to and feel around its environment. For them, talk of disembodied intelligent minds is misguided — even dangerous. A.I. that is unable to explore the world and learn its limits, in the ways that children figure out what they can and can’t do, could make life-threatening mistakes and pursue its goals at the risk of human welfare.” The New York Times’ Oliver Whang addresses a hot question in the artificial intelligence community, namely whether achieving true artificial general intelligence requires embodiment. (You can read more about this debate in this perspective by Arthur Glenberg and Cameron Robert Jones at The Conversation.)
- “In order to truly create public benefit, we need mechanisms of accountability. The world needs a generative AI global governance body to solve these social, economic, and political disruptions beyond what any individual government is capable of, what any academic or civil society group can implement, or any corporation is willing or able to do.” In an opinion article for Wired, AI expert Rumman Chowdhury makes an urgent plea for robust global oversight of generative AI tools.
- “On Monday, Microsoft and Epic Systems announced that they are bringing OpenAI’s GPT-4 AI language model into health care for use in drafting message responses from health care workers to patients and for use in analyzing medical records while looking for trends.” At Ars Technica, Benj Edwards reports on the recently announced collaboration between EHR vendor Epic Systems and OpenAI – a collaboration that will integrate OpenAI’s GPT-4 large language model into Epic’s widely used EHR system.
- “The use of proprietary LLMs in scientific studies also has troubling implications for research ethics. The texts used to train these models are unknown: they might include direct messages between users on social-media platforms or content written by children legally unable to consent to sharing their data. Although the people producing the public text might have agreed to a platform’s terms of service, this is perhaps not the standard of informed consent that researchers would like to see.” In a viewpoint article for Nature, Arthur Spirling makes the case for the importance of independent, open-source, and transparent AIs in research settings – even if the researchers need to build those models themselves.
BASIC SCIENCE, CLINICAL RESEARCH & PUBLIC HEALTH
- “Although the researchers focused their magnets on mice instead of humans, the refined MRI provides an important new way to visualize the connectivity of the entire brain at record-breaking resolution. The researchers say new insights from mouse imaging will in turn lead to a better understanding of conditions in humans, such as how the brain changes with age, diet, or even with neurodegenerative diseases like Alzheimer’s.” In a post for the Duke Institute for Brain Sciences, Dan Vahaba describes a recent breakthrough in the quality of magnetic resonance brain imaging achieved by Duke scientists and other collaborators.
- “…even though all states have laws requiring that A.E.D.s be available in public places, Ms. Benton worried that if someone had a cardiac arrest in a place where the nearest A.E.D. was miles away, the person might die — minutes count when reviving someone in cardiac arrest. For every one-minute delay in resuscitation, the likelihood of survival falls by up to 10 percent.” In an article for the New York Times, Gina Kolata explores the pros and cons of individuals owning emergency defibrillators – expensive pieces of equipment, common in workplaces and public facilities, that can save the lives of people who go into cardiac arrest.
- “In this cohort study of 629,935 individuals tested for SARS-CoV-2, the hazard of incident diabetes was significantly higher among individuals who tested positive for SARS-CoV-2 infection than those who tested negative. The fraction of incident diabetes cases attributable to SARS-CoV-2 infection was 3% to 5%.” A research article published in JAMA Network Open by Naveed and colleagues reports findings on persons with COVID infections who subsequently developed diabetes mellitus.
- “All six of those first cases reported drinking water from the same source, which many community members believed to be a natural spring….But, it wasn’t—it was a concrete box that poured out water from a creek near Paradise, Montana. The concrete box was situated on railroad property near a track. It was likely built in the early 1900s to prevent the creek from eroding the track bed, the local officials reported. Moreover, when officials investigated the box, they found an empty bird’s nest inside—which was the likely source of Campylobacter.” In an article for Ars Technica, Beth Mole reports on a Campylobacter outbreak in Montana that was traced back to people drinking from an unfiltered water source.
- “On a bright note, all 14 organizations required open access publishing, which refers to freely available, online distribution. But on other measures, many organizations fell short – only nine required trials to be registered with ClinicalTrials.gov, a federal government database, in advance. Just six required results to be posted to the site within 12 months after a study was completed.” STAT News’ Ed Silverman reports that nongovernmental/philanthropic funders of clinical and public health research are falling short on some measures of reporting requirements that were put in place to ensure transparency.
COMMUNICATION, HEALTH EQUITY & POLICY
- “To attempt to counter the misperception that Open Science only fits a few, I decided to start mapping the resources…. The map now contains more than 200 resources and supplementary data nodes across the spectrum of available tools, guidelines, events, and services by research discipline, also including general resources that are sortable by Open Science principle, language or country.” In a guest post for The Scholarly Kitchen, Johanna Havemann introduces readers to a detailed and wide-ranging map of open-access resources for the global scholarly community.
- “The harms are substantial. The direct harms to patients are well documented. A recent national survey of adults found that 26 percent had difficulty paying medical bills; 43 percent worried about paying current or future bills; and 20 percent delayed or skipped care due to cost. These harms fall most heavily on the poor and people of color. The indirect harms of rising health care costs affect everyone: reduced government spending on other needs; declines in wages; and rising deductibles and coinsurance, all of which exacerbate current inequities.” At Health Affairs Forefront, Elliott S. Fisher and George Isham take the US health system to task for a focus on the bottom line that can result in harm to patients.
- “There is a health data “gold rush” underway, Elashoff observes, and a key goal of the big tech companies is to mine these data to produce marketable AI-related technologies—not just for the healthcare technology market but also for a myriad of other massively lucrative AI markets. Some of the earliest results are already tangible within medical technology markets: in 2021, the US Food and Drug Administration (FDA) authorized a record 115 submissions for AI- and machine-learning-enabled medical devices, representing an 83% increase from 2018.” In a news feature for Nature Medicine, Paul Webster reports on the growing presence of big tech companies as major players in clinical research.
- “The entire editorial boards of two leading neuroscience journals, NeuroImage and NeuroImage: Reports, resigned en masse on Monday over what they say are exorbitant article fees from their publisher, Elsevier…The group intends to launch a new nonprofit open-access journal called Imaging Neuroscience, “to replace NeuroImage as the top journal in our field,” according to a statement posted 17 April to Twitter by an account called Imaging Neuroscience EiC.” IEEE Spectrum reports on the mass revolt and resignation of editors from a pair of imaging journals published by Elsevier, following a dispute over article processing charges.