AI Health

Friday Roundup

The AI Health Friday Roundup highlights the week’s news and publications related to artificial intelligence, data science, public health, and clinical research.

July 14, 2023

In this week’s Duke AI Health Friday Roundup: the unseen human costs underpinning popular AI chatbots; oceanic plastic pollution comes in all sizes; neighborhood redlining casts long shadow on health; project eyes AI-assisted texts for health behavior nudges; drowning remains a persistent threat to young children in US; catching up with a flurry of recent AI applications in medicine; big hospital data breach exposes patient names, emails; much more:

AI, STATISTICS & DATA SCIENCE

Plastic figure resembling a human sitting at a table in front of a laptop in a dark room; long shadows cast a gloomy mood. Image credit: Max Gruber / Better Images of AI / Clickworker 3d-printed / CC-BY 4.0
  • “Much of the public response to language models like OpenAI’s ChatGPT has focused on all the jobs they appear poised to automate. But behind even the most impressive AI system are people — huge numbers of people labeling data to train it and clarifying data when it gets confused. Only the companies that can afford to buy this data can compete, and those that get it are highly motivated to keep it secret….little is known about the information shaping these systems’ behavior, and even less is known about the people doing the shaping.” An investigation by Josh Dzieza, co-published by The Verge and New York Magazine, pulls back the veil on the hidden labor that’s holding up the current commercial AI boom.
  • “Analytic tools have been improved in parallel to match the volume, velocity, and variety of these molecular “big data.” The emergence of machine learning has proved especially valuable. In these approaches, computer systems use large amounts of data to build predictive statistical models that are iteratively improved by incorporating new data….These approaches are now being applied in medicine to yield clinically directive medical information.” A recent review article published in the New England Journal of Medicine by Bruna Gomes and Euan A. Ashley examines applications for artificial intelligence in the field of molecular medicine.
  • “LE8 Bot + Backup will use a patient-level randomized pragmatic trial to test the comparative effectiveness of 3 text messaging delivery strategies that have been shown to improve an individual’s self-management health behaviors, including physical activity and medication adherence. The study findings will provide evidence regarding the best population-based strategy for universal delivery to engage all patients with health disparities in self-management to improve the AHA’s LE8.” The NIH Pragmatic Trials Collaboratory has announced a new demonstration project that will assess the use of AI-assisted chatbots to improve health behaviors.
  • [Content warning: potentially distressing subject matter] “At first, the passages coming in from OpenAI were short, no more than two sentences. But over time they got longer, as long as five or six paragraphs. Workers might read hundreds of these passages a day. People on the team were paid from around $1.50 to $3.75 an hour….Alex told Karen that the money wasn’t nearly enough to compensate for the psychological toll that the work began to take.” Another story, captured in a podcast reported by the Wall Street Journal’s Karen Hao and Annie Minoff, covers similar territory as it dives into the human costs of training ChatGPT. And finally, a story by Davey Alba at Bloomberg uncovers discontent from workers laboring to train Google’s Bard chatbot.
  • Roundup of a roundup: at his Ground Truths Substack page, Eric Topol runs down a list of recent advances in medical AI, including a very recent paper from Google Health that reports on progress with their Med-PaLM chatbot. Other papers examine models trained to detect diabetes from chest x-ray images, another x-ray model that predicts risk of valvular disease, and a risk score for detecting MI even when troponin or EKG findings are equivocal.

BASIC SCIENCE, CLINICAL RESEARCH & PUBLIC HEALTH

Underwater photograph showing small striped fish swimming among pieces of floating plastic trash. Image credit: Naja Bertolt Jensen/Unsplash
  • “As environmental contaminants, plastics are remarkably diverse in size — spanning nanometres to kilometres — and also vary in their chemistry, shape and other physico-chemical characteristics. This diversity complicates analyses of the abundance and distribution of plastics in the environment, because each type of plastic requires a tailored approach to sampling and measurement. Scientists must first target a particular size range, and then design sample collection and analysis tools that are suitable to the task.” A Nature article by Kara Lavender Law and Chelsea M. Rochman introduces a pair of recent papers that examine the extent and variety of global oceanic plastic pollution.
  • “Evidence to support optimal care delivery models remains limited by inconsistencies in approach, variation in reimbursement, and inability of health systems to meet the needs of patients with chronic, complex cardiovascular conditions.” A new scientific statement from the American Heart Association, published in Circulation, examines the current state of the evidence for the use of “person-centered” approaches to providing cardiovascular care.
  • “….the findings suggest that [US veterans] with atherosclerotic cardiovascular disease who reside in historically redlined neighborhoods continue to have a higher prevalence of traditional cardiovascular risk factors and higher cardiovascular risk. Even close to a century after this practice was discontinued, redlining appears to still be adversely associated with adverse cardiovascular events.” A research paper published in JAMA Network Open by Deo and colleagues explores associations between long-discontinued neighborhood “redlining” practices and cardiovascular disease in US veterans.
  • “…despite calls from the United Nations, the United States is one of the only developed countries without a federal plan to address the crisis. Thirty years of progress in decreasing the number of drowning deaths in the country appears to have plateaued, and disparities in deaths among some racial groups have worsened.” At the New York Times, Emily Baumgaertner reports on the persistent toll of childhood drownings in the United States, where progress in fighting the leading killer of children between the ages of 1 and 4 seems to have stalled.

COMMUNICATION, HEALTH EQUITY & POLICY

Close-up photograph of a backlit computer keyboard, lit with blue light. Image credit: Muha Ajjan/Unsplash
  • “HCA Healthcare revealed Monday that it’s experienced what is likely the largest data breach ever reported by a health care provider, with approximately 11 million patients affected….The for-profit hospital giant said hackers stole the data from an external storage location that’s used to automate emails and then posted the data to an online forum. Nashville, Tenn.-based HCA, which operates more than 180 hospitals, said the compromised information includes patients’ names, email addresses, and service locations, but the company does not believe it includes clinical or payment information.” STAT News’ Tara Bannow reports [log-in required] on a major data breach at HCA Healthcare.
  • “There are currently few guardrails to protect the data rights of refugees and vulnerable people in need of humanitarian support. Moreover, many organisations that were established and funded to protect these communities collect their data without applying the safeguards required under international and, in some cases, national law.” A blog post by Belkis Wille and Katja Lindskov Jacobsen at the Ada Lovelace Institute scrutinizes the inequities in data protections for some of the world’s most vulnerable persons – refugees and disaster victims.
  • “Achieving equity in public health and justice requires recognizing that societal structures are failing to ensure genuine race neutrality in opportunity and outcomes. Policy and practice must center on actively and urgently addressing racism, which has led to cavernous and stubbornly persistent health inequalities.” A JAMA viewpoint by Schmidt, Gostin, and Williams examines the broader ramifications of the recent Supreme Court ruling on affirmative action in the sphere of healthcare.