In today’s Roundup: Snakebitten? Data science can help; large (harmless) spiders on the march; adversarial attack with lasers foxes self-driving LIDAR; impact of state policy on COVID mortality; Cow Clicker as a window onto online culture; creating guardrails for health AI; growing impatience with data blocking; ARPA-H gets funded but organizational questions remain; disparities impact healthcare workers, too; possible unintended consequences of open access publishing; challenges in getting data and code from study authors; much more:
- And we, for some, welcome our new arachnid overlords… “Since the spider hitchhiked its way to the northeast of Atlanta, Georgia, inside a shipping container in 2014, its numbers and range have expanded steadily across Georgia, culminating in an astonishing population boom last year that saw millions of the arachnids drape porches, power lines, mailboxes and vegetable patches across more than 25 state counties with webs as thick as 10 feet (3 meters) deep…” Scientific American’s Ben Turner has the story on the Joro spider, a large but harmless (and ecologically benign) invader making its way through the southern US.
AI, STATISTICS & DATA SCIENCE
- “The new attack strategy works by shooting a laser gun into a car’s LIDAR sensor to add false data points to its perception. If those data points are wildly out of place with what a car’s camera is seeing, previous research has shown that the system can recognize the attack. But the new research from Pajic and his colleagues shows that 3D LIDAR data points carefully placed within a certain area of a camera’s 2D field of view can fool the system.” An article by Duke Engineering’s Ken Kingery describes work by Duke University researchers that has illuminated a novel “attack” that can flummox self-driving systems that rely on LIDAR sensors to adapt to changing traffic conditions.
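The attack described above hinges on simple projective geometry: a spoofed 3D LIDAR point is only plausible to a fused perception system if its projection lands inside the camera’s 2D field of view. As a minimal sketch of that geometric check (assuming a standard pinhole camera model; all function names and parameter values here are hypothetical, not from the Duke paper):

```python
import numpy as np

def project_to_image(points_3d, fx, fy, cx, cy):
    """Project 3D points (camera frame, z pointing forward) onto the 2D image
    plane with a pinhole model: u = fx*x/z + cx, v = fy*y/z + cy."""
    pts = np.asarray(points_3d, dtype=float)
    z = pts[:, 2]
    u = fx * pts[:, 0] / z + cx
    v = fy * pts[:, 1] / z + cy
    return np.stack([u, v], axis=1)

def inside_camera_view(points_3d, fx, fy, cx, cy, width, height):
    """True for points in front of the camera whose projection lands inside
    the image -- the region where spoofed LIDAR returns are consistent with
    what the camera sees and hence hardest to flag."""
    uv = project_to_image(points_3d, fx, fy, cx, cy)
    in_front = np.asarray(points_3d, dtype=float)[:, 2] > 0
    return ((uv[:, 0] >= 0) & (uv[:, 0] < width) &
            (uv[:, 1] >= 0) & (uv[:, 1] < height) & in_front)

# Example: one point dead ahead (projects to the image center), one far off-axis.
pts = [[0.0, 0.0, 10.0], [50.0, 0.0, 10.0]]
print(inside_camera_view(pts, fx=800, fy=800, cx=320, cy=240,
                         width=640, height=480))  # → [ True False]
```

Points that fail this check (like the second one above) contradict the camera feed and can be rejected by cross-sensor consistency checks; the insight of the new work is that points passing it can still be fabricated by an attacker.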
- “Deep-learning systems can now play games like chess and Go better than the best human. They can probably identify dog breeds from photos better than you can. They can translate text from one language to another. They can control robots and compose music and predict how proteins will fold….But they also lack much of what falls under the umbrella term of common sense.” A wide-ranging perspective by Matthew Hutson at Science News traces the broad contours of the computing revolution, from its 19th-century Difference Engine beginnings to contemporary developments in AI, and looks at what’s coming next.
- “While in the past the FTC has required companies to disgorge ill-gotten monetary gains obtained through deceptive practices, forcing them to delete algorithmic systems built with ill-gotten data could become a more routine approach, one that modernizes FTC enforcement to directly affect how companies do business.” At Protocol, Kate Kaye reports on the Federal Trade Commission’s increasing readiness to require companies that train algorithms on data obtained without appropriate (or any) permissions to destroy those algorithms.
- “Experts have been aware that data shifts — which happen when an algorithm must process data that differ from those used to create and train it — adversely affect algorithmic performance. State-of-the-art tools and best practices exist to tackle it in practical settings. But awareness and implementation of these practices vary among AI developers.” A First Opinion article published in STAT News by John D. Halamka, Suchi Saria, and Nigam H. Shah makes the case for a broadly applicable approach to providing “guardrails” for the development, evaluation and use of clinical AI.
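The “data shift” problem the authors describe — deployment data drifting away from the data an algorithm was trained on — is typically monitored with distributional screens. As an illustrative sketch (not the authors’ method), one widely used screen is the Population Stability Index, which bins a feature’s training distribution and compares bin proportions against deployment data:

```python
import numpy as np

def population_stability_index(expected, observed, bins=10):
    """Population Stability Index (PSI), a common data-shift screen.
    Bins are set from the training ('expected') values; deployment
    ('observed') proportions are compared bin by bin. Rough rule of thumb:
    PSI < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 major shift."""
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf  # cover out-of-range deployment values
    e_frac = np.histogram(expected, bins=edges)[0] / len(expected)
    o_frac = np.histogram(observed, bins=edges)[0] / len(observed)
    eps = 1e-6  # avoid log(0) for empty bins
    e_frac, o_frac = e_frac + eps, o_frac + eps
    return float(np.sum((o_frac - e_frac) * np.log(o_frac / e_frac)))

rng = np.random.default_rng(0)
train = rng.normal(0, 1, 10_000)      # feature at model-training time
same = rng.normal(0, 1, 10_000)       # deployment data, no shift
shifted = rng.normal(1.0, 1, 10_000)  # deployment data, mean has drifted
print(population_stability_index(train, same))     # small, well under 0.1
print(population_stability_index(train, shifted))  # large, well over 0.25
```

In a clinical setting, a screen like this could run routinely on incoming model inputs, with alerts triggering the kind of re-evaluation the authors’ proposed guardrails call for.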
- “…we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy—author remission of data and code postpublication upon request—an improvement over no policy, but currently insufficient for reproducibility.” A study by Stodden and colleagues, recently published in PNAS, finds that scientific journals that leave the disposition of data and computer code to the author’s discretion (instead of requiring that they be made available) may need to do more to ensure replicability of research findings.
- “With Gerardo and Brazilian colleagues providing clinical guidance, Vissoci is developing a plan to relocate at least some antivenom to local community health centers, which are the backbone of Brazil’s tiered health care system. These primary care centers cover about 90% of Brazil’s geography, making them much more accessible to people in sparsely populated areas.” An article by Duke Magnify’s Michael Penn highlights an unexpected application of data science: figuring out how to get limited supplies of antivenom to the often far-flung places it’s needed in time to prevent serious harm or death.
BASIC SCIENCE, CLINICAL RESEARCH & PUBLIC HEALTH
- “Taken in context with the existing literature, our study provides evidence both for structural drivers of health inequity for healthcare workers (e.g. Black healthcare workers tending to live and practice in areas with higher community spread of COVID-19) and for a contributor of bias and discrimination (e.g. Black healthcare workers being less likely to receive SARS-CoV-2 viral testing after adjusting for individual and community characteristics).” A great deal has been published recently about disparities in access to COVID prevention and treatment, but this recent publication in eClinicalMedicine by Lusk and colleagues turns the spotlight on differential COVID outcomes among healthcare workers according to race and ethnicity.
- “A lot needs to go right for this hopeful view to become reality. Large clinical trials will have to show that these therapies work, and amyloid-clearing drugs will have to be proven to be safe and affordable. After decades of setbacks and failed clinical trials, some dementia researchers prefer to express caution.” At Nature, Alison Abbott reports on new research avenues in drug development for Alzheimer disease prevention.
- “States are laboratories for experimentation, but fragmented health policy has consequences. While other countries mounted a national response to COVID-19, the US was hobbled by 50 response plans and, to date, has lost more than 1 million lives. Although state governments have the right to set their own path and policies, the public should decide whether life expectancy should be part of the experiment.” A viewpoint article published last week in JAMA by Steven H. Woolf examines the influence of state policies on health outcomes, with COVID serving as a key exemplar.
- “The elimination of mortality disparities will require a holistic view of the root causes of individual and structural racism…Explicitly focusing national attention on the premature excess mortality related to race and racism may create a greater sense of urgency to solve this deeply rooted problem. The way public health data are presented and contextualised may be helpful in directing this attention and promoting societal accountability.” An opinion article in BMJ by Harlan Krumholz addresses racism as a major cause of death in the US and underscores the need for statistical metrics that can more accurately capture its extent and impact.
- “Dr. Litvaitis had studied the worms for decades, traveling to the Caribbean and Indo-Pacific seas to collect hundreds of samples of their tissue and DNA, which were all stored in the minus 80 degree Celsius freezer in her lab. But labs at her school are cleared out once researchers leave, and there are often no systems in place to ensure that irreplaceable collections of scientific arcana don’t end up in a dumpster alongside old papers and broken lab equipment, which they often do.” The New York Times’ Sabrina Imbler explores the question of what happens to the (often extensive) biological collections that researchers leave behind when they retire, change institutions, or move on to different things.
COMMUNICATIONS & DIGITAL SOCIETY
- “Cow Clicker remains one of my greatest professional legacies. That fact haunts me, and I allow it to. The cutout of the cow with which I posed still looks down at me from the wall as I click emails in my office at Washington University in St. Louis. “Aren’t you the Cow Clicker guy?” people sometimes ask. My brain fills with all the other things I’ve done, but ultimately I have to admit: Yes. I am the Cow Clicker guy.” Ian Bogost’s reminiscence at The Atlantic about an abandoned Facebook game takes some unusual turns, but manages to probe deeply into the strange byways of the internet and how we relate to it.
- “While it’s early days, we’re seeing a significant difference in acceptance rates between authors who anonymise their work and those who don’t, with those who anonymise being more likely to succeed. We expected to see this for marginalised groups, but not across the board.” A blog post by Kim Eggleton at IOP Publishing describes the advantages of double-blind review in reducing bias in scientific publication.
- “Uber has quietly changed the way it pays drivers in several major cities across the U.S., using a new feature it’s calling “Upfront Fares.” Instead of paying drivers for trips based on just time and distance, it’s now using an algorithm “based on several factors” to calculate the fare. What all of those factors are is unclear.” At The Markup, Dara Kerr explores rideshare giant Uber’s new algorithm that will shape rates of pay for its drivers.
- “A particularly pressing issue is open access (OA) publication fees, in which the benefit of free readership is being offset by new barriers to authorship. To support OA publishing, journals commonly charge authors, and charges are rising as the practice expands. My group and others have found that article-processing charges are creating a two-tier system, in which richer research teams publish more OA articles in the most prestigious journals.” In an opinion article for Nature’s “world view,” Tony Ross-Hellauer warns that well-intentioned open-science initiatives have the potential, if not carefully designed and managed, to worsen global inequities.
- “Since the information blocking rule took effect last April, patients themselves have submitted 176 complaints to an online portal managed by the Office of the National Coordinator for Health IT, the agency responsible for defining which practices qualify as information blocking. But the agencies tasked with levying penalties against health providers and organizations that violate the ban have yet to announce how they’ll do that nor exactly when enforcement will begin.” STAT News’ Mohana Ravindranath reports on growing impatience with healthcare providers who have erected roadblocks to patients accessing their own data – as well as on the agencies tasked with oversight.
- “President Joe Biden last week got his wish for a new agency to fund high-risk, cutting-edge biomedical research when Congress created the Advanced Research Projects Agency for Health (ARPA-H) and gave it a $1 billion startup investment….The 2022 spending bill does not resolve, however, a debate over whether to make ARPA-H a standalone agency within the Department of Health and Human Services (HHS) or part of the National Institutes of Health (NIH).” In an article (which features an interview with White House Science Advisor Francis Collins) for Science, Jocelyn Kaiser reports that an initial $1 billion in funding for ARPA-H – a new federal biomedical research agency – has cleared Congress, but the exact organizational outlines for the new program have yet to come into focus.
- “One hope behind these studies is that insights into how people come to set aside their vaccination anxieties or concerns could lead to more effective campaigns to increase uptake of the COVID-19 vaccine and other vaccines. And the results call into question the idea that people’s vaccination-related thoughts and behavior are in perfect harmony and fall into mutually exclusive categories…” At Scientific American, Robin Lloyd delves into recent research findings that suggest vaccine-hesitant persons in the US are in fact ultimately getting vaccinated against COVID, which has implications for how doctors and public health experts approach vaccination campaigns.