Supreme Court Ruling Supports Government Collaboration with Researchers to Combat Misinformation

This week, the US Supreme Court ruled that the government can continue communicating with researchers and social-media platforms to reduce misinformation on subjects such as elections and vaccines. The decision stops short of declaring that such communications are protected free speech, but it is a welcome outcome for researchers facing lawsuits that accuse them of colluding with the government to suppress conservative viewpoints.

“This ruling is a major victory for independent research,” says Rebekah Tromble, head of the Institute for Data, Democracy and Politics at George Washington University. “In rejecting the conspiracy theories at the heart of the case, the Supreme Court demonstrated that facts do still matter.”

Over the past four years, the Stanford Internet Observatory in California and the University of Washington in Seattle jointly led two major rapid-response projects to track, report and counteract disinformation and misinformation. Researchers flagged falsehoods to social-media companies and US agencies as quickly as possible, and their reports were made public. Meanwhile, the federal government also highlighted problematic content to platforms such as Facebook.

Many conservative activists and politicians saw these efforts as politically biased against Republican voices, particularly those who falsely claimed that the 2020 presidential election was rigged. This led to congressional investigations and multiple lawsuits, including the one against the US government decided by the Supreme Court today. That case, filed in May 2022 by plaintiffs including the then-attorneys-general of Missouri and Louisiana, alleged that federal officials had unlawfully pressured platforms to suppress conservative viewpoints.

The Supreme Court rejected the plaintiffs' claims, stating that social-media platforms, as private entities, began moderating content independently before the government contacted them about misinformation. The court found no specific evidence that government pressure unduly influenced these decisions or caused harm.

The Supreme Court has yet to rule on a related case concerning US state regulations that limit social-media companies' ability to moderate platform conversations.

While the ruling's effect on lawsuits against scientists remains uncertain, legal scholars and misinformation researchers see it as a clear victory for academic freedom.

Lawsuits by conservative activists have created fear among misinformation researchers, and changes in the online environment have hindered their work. For example, after Elon Musk acquired Twitter (now called X), he implemented policies restricting academics' access to platform data. Other social-media companies have also reduced efforts to moderate content for accuracy.

This discouragement of efforts to counteract false narratives is concerning as the US prepares for its next presidential election, says Gowri Ramachandran, deputy director at the Brennan Center for Justice. Biden and Trump are expected to face off again in November.

Stanford University has ended its rapid-response misinformation projects and laid off two staff members involved, although researchers will continue working on election misinformation this year. Jeff Hancock, director of the Stanford Internet Observatory, stated that the decision to halt rapid-response work was not due to fear of litigation or investigations but rather fundraising challenges and a shift in the center's focus.