AI Health

Friday Roundup

The AI Health Friday Roundup highlights the week’s news and publications related to artificial intelligence, data science, public health, and clinical research.

February 4, 2022

In today’s Roundup: Crisis helpline passes data to for-profit spinoff; the ethics of visual representations of AI; COVID’s toll on kids in sub-Saharan Africa; Algorithmic Accountability Act introduced; untangling tau protein; “fingerprinting” for journal PDFs; new nonprofit clinical trials org launches; the long half-life of problematic datasets; cybercriminals benefit from lax attitudes toward data protection; countering buggy scientific programming; much more:

AI, STATISTICS & DATA SCIENCE

Spotted beetle sitting on the tip of a leaf. Image credit: Henry Lai/Unsplash
  • “When it comes to software, bugs are inevitable — especially in academia, where code tends to be written by graduate students and postdocs who were never trained in software development. But simple strategies can minimize the likelihood of a bug, and ease the process of recovering from them.” A feature article at Nature by Jeffrey M. Perkel walks scientists through some strategies for avoiding programming errors and fixing their buggy software code.
  • “Recently, several initiatives have been launched to advance the quality of reporting and the consistency of terminology in AI studies. It has been recognized that arriving at a consensus about a set of terms that could be used interchangeably between disciplines would reduce some of the unnecessary complexities which, for example, systematic reviewers might face when assessing different studies.” An opinion article published in Frontiers in Digital Health by Smaes and colleagues wades (once again) into that most fraught terrain of AI vs statistics.
  • “We trained a multichannel variational autoencoder and a deep regressor model to estimate left ventricular mass (4.4 (–32.30, 41.1) g) and left ventricular end-diastolic volume (3.02 (–53.45, 59.49) ml) and predict risk of myocardial infarction (AUC = 0.80 ± 0.02, sensitivity = 0.74 ± 0.02, specificity = 0.71 ± 0.03) using just the retinal images and demographic data. Our results indicate that one could identify patients at high risk of future myocardial infarction from retinal imaging available in every optician and eye clinic.” A research article published by Diaz-Pinto and colleagues in Nature Machine Intelligence describes a machine learning model that was used to predict risk of myocardial infarction by examining retinal scans.
  • “The ontological premise of this article is that one cannot develop any comprehensive understanding of AI without taking into account the imaginaries about AI. Since these imaginaries are crystallized in visual or written representations, we also contend that a comprehensive AI ethics should include considerations on the representations of and the communication about AI.” An interesting perspective from Alberto Romele appearing in Philosophy and Technology examines the ethical impact of visual rhetoric surrounding discussions of artificial intelligence (H/T @ImagesofAI).
  • “Several datasets have now been taken down or, as in the case of ImageNet, have been heavily redacted. In practice, however, they continue to be widely used and available, either in their original form, such as via online torrents, or in derivative form, as subsets or modifications of the original dataset or models pretrained on the deprecated dataset… In many cases, the deprecation has been silent, or the status of the dataset left ambiguous.” An editorial from Nature Machine Intelligence addresses the troubling issue of large datasets, often scraped from publicly available sources, that were initially taken down due to ethical and legal issues, but which are still easily available for use in training machine learning applications.
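For readers curious what the simplest of the bug-avoidance strategies discussed in the Nature feature can look like in practice, here is a small illustrative sketch (our own, not code from the article) combining two widely recommended habits: defensive assertions that catch bad inputs close to the source, and a minimal unit test with a known answer.

```python
# Illustrative sketch of two defensive-programming habits often
# recommended for scientific code: input assertions and unit tests.
import math


def mean_concentration(samples):
    """Mean of a list of concentration readings, with sanity checks."""
    # Assertions fail loudly near the real problem, rather than letting
    # a bad value propagate silently through a long analysis pipeline.
    assert len(samples) > 0, "empty sample list"
    assert all(s >= 0 for s in samples), "negative concentration reading"
    return sum(samples) / len(samples)


def test_mean_concentration():
    # A minimal unit test: known input, known expected output.
    assert math.isclose(mean_concentration([1.0, 2.0, 3.0]), 2.0)


test_mean_concentration()
```

Trivial as it looks, a test like this is rerun every time the code changes, turning a silent regression into an immediate, visible failure.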

BASIC SCIENCE, CLINICAL RESEARCH & PUBLIC HEALTH

Empty park bench with fallen leaves scattered around, trees in the background. Image credit: Will Patterson/Unsplash
  • “…in fact, we don’t have a simple way of measuring the long-term impact of Omicron — or the pandemic as a whole — on everyone’s health or what kinds of consequences it will have on our health care system….There’s virtually no aspect of our lives that the pandemic hasn’t changed. And the limited data sets available suggest that its impact on other health conditions will also be vast.” An article by Kim Tingley for the New York Times Magazine examines the long shadow that the COVID pandemic is casting over the future of health and healthcare.
  • “It’s not unusual for pharmaceutical companies to spend hundreds of millions or even billions of dollars to conduct a single clinical trial. The RECOVERY effort, Landry said, cost under $10 million….Now, he told STAT in an interview, it is time to apply the lessons from that effort to other medicines.” In an article at STAT News, Matthew Herper profiles PROTAS, a new non-profit clinical trials organization headed by Martin Landray, that is devoted to applying lessons from the RECOVERY trial of COVID therapies to the wider world of medical therapeutics.
  • “In human subjects, PtNRGrids were able to provide high-resolution recordings of large and curvilinear brain areas and to resolve spatiotemporal dynamics of motor and sensory activities. The results suggest that PtNRGrids could be used in the preclinical and clinical setting for high spatial and temporal recording of neural activity.” A research article published in Science Translational Medicine by Tchoe and colleagues reports a study that successfully demonstrated a new approach to high-resolution brain mapping.
  • “Flying is a tricky business, but when you are less than a millimetre long, things get even tougher.” Eppur si muove for tiny beetles: this video from Nature complements a recent paper that describes how the minuscule featherwing beetle keeps aloft.
  • STAT News’ Angus Chen reports on a recently published study by Uldrick and colleagues that suggests the cancer therapy pembrolizumab (Keytruda) might be able to neutralize HIV’s ability to “hide” in immune cells – an evolutionary advantage that to date has made the virus effectively impossible to eradicate, even when antiretroviral therapies keep it under control.
  • “Sucking DNA from the air could be a noninvasive way to identify where endangered species have been by picking up on their genetic footprints, Bohmann says. The method would be an upgrade from camera traps, which work only when critters wander by, she says.” Science News’ Jude Coleman reports on successful attempts to identify animals by sampling the air in a given locale and sifting the bits of DNA wafting through it.
  • “In this cohort study of 469 children and adolescents hospitalized with COVID-19 in 6 sub-Saharan African countries, morbidity and mortality were substantially higher than reported among those in non-African settings and were independently associated with age younger than 1 year and select noncommunicable disease comorbidities.” A paper by Nachega and colleagues published in JAMA Pediatrics reports on clinical outcomes for children hospitalized for COVID in a half-dozen sub-Saharan African countries.
  • “…these findings hint at two potentially important ideas for interventions. One has to do with how tau hitches a ride on neurotransmitters, allowing dysfunctional proteins to get out of their own cells and travel to other regions of the brain….The mitochondrial results are more complicated, but equally intriguing. Diseased tau seems to affect how much energy the cell produces, but it’s not clear exactly how it does that—or how a drug could be designed to restore normal function.” An article by Wired’s Sara Harrison walks readers through the implications of a research paper recently published in Cell by Tracy and colleagues, which may clarify the complex role that tau protein plays in the development and progression of Alzheimer disease and other neurological disorders.

COMMUNICATIONS & DIGITAL SOCIETY

Photo of an illuminated fingerprint on a glass slide. Image credit: George Prentzas/Unsplash
  • “One of the world’s largest publishers of academic papers said it adds a unique fingerprint to every PDF users download in an attempt to prevent ransomware, not to prevent piracy. …Elsevier defended the practice after an independent researcher discovered the existence of the unique fingerprints and shared their findings on Twitter last week.” Motherboard’s Lorenzo Franceschi-Bicchierai reports on scientific publishing giant Elsevier’s move to embed unique “fingerprint” codes in all downloaded PDFs.
  • “Crisis Text Line is one of the world’s most prominent mental health support lines, a tech-driven nonprofit that uses big data and artificial intelligence to help people cope with traumas such as self-harm, emotional abuse and thoughts of suicide….But the data the charity collects from its online text conversations with people in their darkest moments does not end there: The organization’s for-profit spinoff uses a sliced and repackaged version of that information to create and market customer service software.” Revelations that a non-profit, AI-assisted suicide crisis hotline has been quietly sharing data with a for-profit spinoff company has raised a storm of concern and controversy this week. Politico’s Alexandra S. Levine has the story.
  • “…dissemination of scientific knowledge could be done better…The trick is to develop a system that keeps the speed while reducing the risk that bogus ideas such as treating Covid-19 with ivermectin or hydroxychloroquine will slip through.” In an opinion article published in the New York Times, Peter Coy weighs in on ways to update current systems for disseminating scientific information and manage the potential challenges of the preprint era.
  • “Cybertheft conjures images of high-tech missions, with sophisticated hackers penetrating multiple layers of security systems to steal corporate data. But these breaches were far from “Ocean’s Eleven”-style operations. They were the equivalent of grabbing jewels from the seat of an unlocked car parked in a high-crime neighborhood.” ProPublica’s Cezary Podkul reports on a surge in cybercrime and identity theft amid the COVID pandemic – a surge made possible in part by a lack of vigilance by companies that had stewardship of large repositories of personal data.
  • “Nicotra says that the platforms are still missing large numbers of problematic posts, especially outside the United States and Europe, and in languages other than English. In 2020, Facebook devoted just 13% of its budget for developing misinformation-detection algorithms to regions outside the United States, according to documents released by whistle-blower Frances Haugen, a former product manager for the company.” An article in Nature by Brian Owens returns to some disturbing territory – the steep increase in abuse hurled at scientists during the COVID pandemic and the lack of effective response to it by social media platforms, as documented in a recent report by AVAAZ.
  • “So, given the choice of publishing in a high-rank journal or a less prestigious one, you might choose to ‘have your cake and eat it’ – that is, publish in a high-rank journal over a lower-ranked outlet – but what if you had something to give up as well? To tackle this issue, we designed a choice-set question that explored individuals’ preferences for publication in ‘elite’ general journals in relation to a hypothetical level of citations in subsequent years in a leading specialized journal.” A post at the London School of Economics’ Impact of Social Science blog by Rossella Salandra, Ammon Salter, and James Walker unpacks recent work exploring incentives related to academic publishing.
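Neither Elsevier nor the Motherboard report discloses how the PDF “fingerprints” are actually embedded, but the general idea is easy to illustrate. As a purely hypothetical sketch (all names and the embedding mechanism here are our own invention): many PDF readers tolerate trailing data after the file’s closing %%EOF marker, so a unique per-download identifier can be appended without breaking the document for ordinary readers.

```python
# Hypothetical sketch of per-download PDF "fingerprinting."
# This is NOT Elsevier's actual mechanism, which has not been disclosed;
# it only illustrates how a unique marker could ride along in a file.
import hashlib
import secrets


def fingerprint_id(account_id: str) -> str:
    """Derive an opaque, unique identifier for one download event."""
    nonce = secrets.token_hex(8)  # fresh randomness per download
    return hashlib.sha256(f"{account_id}:{nonce}".encode()).hexdigest()[:32]


def tag_pdf(pdf_bytes: bytes, account_id: str) -> bytes:
    """Append a comment-style marker after the PDF's %%EOF trailer."""
    marker = f"\n%DOWNLOAD-ID:{fingerprint_id(account_id)}\n".encode()
    return pdf_bytes + marker
```

Because the identifier is derived with a one-way hash, the marker alone reveals nothing about the account; only the party holding the download logs could map it back to a user, which is precisely what makes such schemes contentious.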

POLICY

Close-up photo of slightly fanned stack of 100-dollar bills. Image credit: Pepi Stojanovski/Unsplash
  • “Today, a multibillion-dollar marketplace has formed around anonymized health information bought and sold without patients’ knowledge or explicit consent….patient data have become far more valuable as they are fed into an exploding array of software and artificial intelligence tools whose financial returns enrich technology entrepreneurs and their investors, not patients whose medical problems are their secret ingredients….Surveying the industry, Ludy is disturbed by what he sees.” A special report by STAT News’ Casey Ross includes perspectives from MarketScan founder Ernie Ludy, who is witnessing the company he founded assume a significant position as an enormous and wide-ranging repository of individual data – one that encompasses the majority of the US population (H/T @MarkRDeLong).
  • “…tech types agree these issues are a high priority, and say regulation is a key part of the puzzle…It’s just an urgency Capitol Hill doesn’t share….To be fair, a few lawmakers have taken on the issues, but members of Congress are mostly a busy bunch with an extremely tenuous grasp of tech topics.” A policy bulletin at Protocol provides a quick overview of the bubbling ferment around regulation of AI – and the apparent lack of enthusiasm to tackle the issue among lawmakers. However…
  • “The bill requires companies to conduct impact assessments for bias, effectiveness and other factors, when using automated decision systems to make critical decisions. It also creates, for the first time, a public repository at the Federal Trade Commission of these systems, and adds 75 staff to the commission to enforce the law.” A press release issued by the office of Senator Ron Wyden announces the introduction of the Algorithmic Accountability Act of 2022. The Act is co-sponsored by Senator Cory Booker and Representative Yvette Clarke.
  • “That’s not how things turned out. The last time a pop-up window appeared on a website and asked whether you would allow cookies to gobble up your personal data, did you actually read the fine print or think for more than five seconds before you pressed ‘accept?’ Me neither…Actually, it is worse. In practice, the proliferation of cookie banners has both numbed people to their purpose and given companies yet another way to manipulate users.” At the New York Times’ Dealbook, Joe Nocera looks at how a well-intentioned approach to bolstering online privacy enacted in the wake of the EU’s General Data Protection Regulation (GDPR) failed to live up to expectations.