Forge AI Health

Friday Roundup

The Forge AI Health Friday Roundup highlights the week’s news and publications related to artificial intelligence, data science, public health, and clinical research.

November 12, 2021

In today’s Roundup: specially engineered bacteria solve mazes; checking up on the Delphi project’s “machine ethics”; white-tailed deer may be a reservoir for COVID; the cardiovascular toll of pollution; Surgeon General releases primer on countering health misinformation; COVID upends scientific career paths; how surveillance erodes community; rethinking risk and our response to it; getting a handle on sensor-generated health data; much more:

Deep Breaths

Green parakeet on a bird feeder in Kensington Park, London.
Tony Austin via Wikipedia
  • First werewolves, now this…introducing the parakeets of London: “By the 1980s, a sizable colony existed in Kingston-on-Thames, in the southwestern part of the city. Since then, the parakeets have spread like a green wave right across the capital. Nick Hunt, author of The Parakeeting of London: An Adventure in Gonzo Ornithology, calls it ‘one of the most audacious ecological shifts in the world.’” National Geographic’s Simon Worrall has the story.

AI, STATISTICS & DATA SCIENCE

People walking in an elaborate maze, viewed from above.
Susan Q Yin via Unsplash
  • “The maze experiment is part of what some researchers consider a promising direction in the field: rather than engineering a single type of cell to do all the work, they design multiple types of cells, each with different functions, to get the job done. Working in concert, these engineered microbes might be able to ‘compute’ and solve problems more like multicellular networks in the wild.” At MIT Technology Review, Siobhan Roberts reports on a recent experiment showing that E. coli bacteria can be engineered to act as a distributed computer to solve problems.
  • “Delphi demonstrates what state-of-the-art models can accomplish today if they are explicitly trained to reason about ethical and moral questions. Conversely, our experiments show that other AI systems that are not trained to reason ethically or morally instead implicitly acquire moral principles that are much more ethically oblivious or even harmful or biased.” A Medium post by Liwei Jiang and colleagues provides an update on early experiences with Delphi, an AI project designed to probe the challenges of creating and deploying “ethical” machine intelligence.
  • “Leading theorists and practitioners will unpack and elaborate on some of the specific problems that our research has uncovered with the current digital ecosystem, including tech centralisation and concentration of power, data dominance and infrastructure dependencies, failed enforcement and fragmentation of regulation….At the same time, we’re inviting an open conversation about transformative ideas and promising interventions for achieving ambitious visions for the future of data use and regulation.” In a post at the Ada Lovelace Institute, Valentina Pavel introduces the institute’s Rethinking Data Programme, a multipart series designed to critically examine contemporary problems in the use and regulation of data resources.
  • “In health care, we are experiencing a revolution in the use of sensors to collect data on patient behaviors and experiences. Yet, the potential of this data to transform health outcomes is being held back. Deficits in standards, lexicons, data rights, permissioning, and security have been well documented, less so the cultural adoption of sensor data integration as a priority for large-scale deployment and impact on patient lives.” A paper published this week in the Journal of Medical Internet Research by Clay and colleagues surveys the challenges of integrating medical data gathered from sensors (including those in wearable and handheld devices) and using it to inform medical decision-making.
  • “For years, experts have pointed to AI as a way to make it faster and cheaper to find new medications to treat various conditions. AI could help scan through databases of potential molecules to find some that best fit a particular biological target, for example, or to fine-tune proposed compounds.” At The Verge, Nicole Wetsman reports on Google’s bid to develop its DeepMind/AlphaFold AI to screen molecular candidates for therapeutic development.

BASIC SCIENCE, CLINICAL RESEARCH & PUBLIC HEALTH

White-tailed deer standing in a meadow, looking directly at the camera.
Marko Hankkila via Unsplash
  • Deer: not just a danger to vegetable gardens: “veterinarians at The Pennsylvania State University have found active SARS-CoV-2 infections in at least 30% of deer tested across Iowa during 2020. The study, published online last week, suggests that white-tailed deer could become what’s known as a reservoir for SARS-CoV-2. That is, the animals could carry the virus indefinitely and spread it back to humans periodically.” NPR’s Michaeleen Doucleff reports.
  • “On Wednesday, a team from the Massachusetts Institute of Technology reported that it found a key driver of this resilience — a family of proteins that act as a master regulator in the brain, preventing neurons from firing out of control. By boosting activity of these proteins in mice bred to have neurodegenerative disease, the researchers were able to preserve the animals’ cognitive function.” STAT News’ Megan Molteni reports on recent mouse-model findings that may shed some light on a paradox in Alzheimer disease research: namely, why some persons who have a buildup of amyloid and tau proteins that would meet diagnostic criteria for the disease do not develop any dementia-related symptoms.
  • A review article published this week in the New England Journal of Medicine by Sanjay Rajagopalan and Philip J. Landrigan surveys the evidence for the burdens that pollution exposures place on cardiovascular health.
  • At Science News, Jake Buehler reports on recent observations, published in Current Biology by Dufour and colleagues, showing that some migratory songbirds may be shifting their travel patterns to an east-west route rather than the more usual north-south pattern, possibly in response to changes in climate.
  • “In order for a virus to adapt to a new species, it needs to evolve to a point where it can easily and readily spread within that species. This is not the work of an instant, but rather the end result of a long chain of adaptation and transmission. That’s an evolutionary process. Human-to-human transmissibility has never been produced deliberately in laboratory experiments because no one knows exactly how to make a virus more transmissible among people.” In an essay for Undark, Wendy Orent unpacks the argument for why a laboratory origin for the COVID-19 virus remains a less likely hypothesis than natural evolution and zoonotic transmission.

COMMUNICATIONS & DIGITAL SOCIETY

Two brown eggs with faces (one worried, one suspicious) drawn on them, sitting in a cardboard carton.
Hello I’m Nik via Unsplash
  • “The difficulty with technology is that it is almost impossible to opt out once it has arrived. In instituting the logic of surveillance, Big Tech offers a compelling proposition: protect yourself by gathering with the like-minded. You ride a streetcar through a city and brush up against people of all types. But, when you arrive home, you can log on to a social network and warn others like you about signs of abnormality, crime, and misdeeds.” An article at The Walrus by Navneet Alang examines how ubiquitous, ground-level surveillance – even when voluntarily engaged in by neighborhood residents – can cause the fabric of community trust to fray.
  • “Health misinformation is causing harm to individuals and to communities, but talking to one another about its impact can help slow the spread by prompting us to think twice about the information we’re reading and sharing. This toolkit will help you get started.” A toolkit developed under the leadership of US Surgeon General Vivek Murthy and released by the US Department of Health and Human Services provides practical advice for countering medical misinformation.
  • “…celebrating the publications of your lab members is a routine part of a professor’s job. But in this case, Victoria was not only a successful early-career scientist, but also a famous fiction writer, having published five novels under a pseudonym. The anonymity allowed her to avoid any potential criticism from the academic world that this side hustle was unprofessional or revealed a lack of commitment to science.” In a “Careers” column for Science, Jay J. Van Bavel advocates for the scientific community to embrace and celebrate extracurricular interests.
  • “Even though relatively few respondents reported having COVID-19 (see ‘Geography of a pandemic’), the outbreak transformed workplaces and careers. Overall, 12% of respondents said they had lost a job offer because of COVID-19, and 43% said that the pandemic had negatively impacted their career prospects. The majority of respondents (57%) said that it had impaired their ability to collect data.” In a feature article for Nature, Chris Woolston tallies the damage to scientific careers wrought by the COVID pandemic.
  • “The fluid nature of emergent science provides fuel for conspiracy theorists who offer certainty in place of the provisional, sometimes-updated statements of health experts. At the same time, conspiracy proponents question the trustworthiness and motives of those in the federal agencies, philanthropic institutions and pharmaceutical companies who fund basic research and develop, deliver and, in the case of some of the federal agencies, regulate public access to medical treatments, including vaccines.” An essay in Nature Human Behaviour by Kathleen Hall Jamieson dissects the ways that conspiracy theorists bent the science behind efforts to understand and respond to the COVID-19 pandemic to their own ends.
  • “Many journals publish special issues — collections of articles that focus on a particular topic of relevance to their readers. These issues are often overseen by guest editors who are experts in the research topic, but are not usually involved in the day-to-day editorial work of the journal….Fraudsters have been caught several times in recent years while trying to use special issues as a way to get low-quality papers published in legitimate journals — but the number of affected papers seems to be increasing.” Nature’s Holly Else describes some of the elaborate deceptions used to insinuate bogus papers into legitimate publications.

POLICY

Vikram Sundaramoorthy via Unsplash
  • “…whenever risk compensation has been subjected to empirical scrutiny, the results are usually ambiguous, or the hypothesis fails spectacularly. And when risk compensation does play a part in behavior, it tends to do so in small and specific ways—hardly cause for the alarm and fervor with which it is often applied, especially during the pandemic.” At Slate, Tim Requarth explores how Americans respond to perceived risk – and how widespread assumptions about hazards may actually be counterproductive, especially during a public health crisis.
  • “The FDA (and the relevant community of scholars) knows that most new drugs offer only modest incremental benefit over drugs already available. But although its thorough analyses of submitted clinical-trial data are disclosed, the relevant documents are impenetrable to nearly everyone. Drug labelling should clearly state what effectiveness was demonstrated and how. The metrics should have real-world relevance, such as how a patient feels, functions or survives.” An editorial in Nature by Jonathan J. Darrow advocates for a simplified approach to labeling drugs that would allow patients to more easily and clearly understand the benefits of taking a particular medication.
  • “The Filter Bubble Transparency Act would require internet platforms to let people use a version of their services where content is not selected by ‘opaque algorithms’ driven by personal data.” At Axios, Ashley Gold reports on emerging bipartisan draft legislation that puts media algorithms in the legislative crosshairs.
  • “…certain fundamental strategic priorities emerged as basic and critical to progress in the field: 1) the need to reorient research perspectives and activities to patient and family priorities and values, and in particular, those conditions that drive inequities; 2) the need to foster strategic learning partnerships across groups, organizations, and sectors; and 3) the need to build the continuous learning infrastructure to produce new insights at the pace and scale necessary for health and health care improvement.” A new report from the National Academy of Medicine lays out potential future directions for the Patient-Centered Outcomes Research Institute (PCORI).
  • “Top-performing US hospitals routinely charge commercial insurers prices far in excess of Medicare payment limits, which are based on the average acquisition costs of drugs for non-340B entities. While hospitals often charge commercial insurers more than Medicare for a range of services, they generate revenue on pharmaceuticals by buying and reselling drugs, earning sums in addition to the amounts that they separately bill insurers for the professional services required to administer these drugs.” A research letter published by Feldman and colleagues in JAMA Internal Medicine examines hospital billing practices for physician-administered drugs.