A fake image of former President Donald Trump with his arm around a Black voter. A meme depicting a Black person in front of a Hillary Clinton sign, telling people they can vote by text or hashtag. Robocalls telling voters that voting by mail is unsafe, or imitating President Joe Biden's voice and instructing voters to "save" their November votes instead of voting in the primary.
In Murthy v. Missouri, the US Supreme Court should provide clear rules that allow the federal government to communicate with social media companies and civil society organizations to stop the spread of this type of disinformation on social media platforms.
If social media companies were better at policing misinformation on their platforms, this would be less of a problem. But that's not the case. Most major online platforms prohibit misrepresentations about when, where, and how to vote, but enforcement of those policies varies widely and is often inadequate.
Governments and civil society organizations can help fill this void. Some government programs counter election disinformation with accurate information. Others communicate with social media companies about disinformation trends that harm vulnerable communities. State and local election officials, the front lines of democracy, also discover problems on the ground and sound the alarm.
The Lawyers' Committee for Civil Rights Under Law is a co-convener of Election Protection, a national nonpartisan coalition of more than 300 national, state, and local partners working to ensure that voters can exercise their right to vote. Election Protection may escalate election disinformation threats to election officials.
In Murthy, Missouri and Louisiana sued the federal government for violating their First Amendment rights. They argued that the federal government is responsible for social media companies' decisions to remove or downgrade content. To succeed, the plaintiffs must prove state action, meaning that the social media companies were not acting of their own volition but were effectively acting as agents of the federal government.
The district court agreed with the plaintiffs, issuing a sweeping preliminary injunction prohibiting government agencies from communicating with social media platforms "for the purpose of urging, encouraging, pressuring, or inducing in any manner the removal, deletion, suppression, or reduction of content containing protected free speech." It also prohibited the government from collaborating with third parties for the same purpose, which could include voter protection groups such as Election Protection.
Unsurprisingly, given how vague and overbroad the injunction was, the government immediately halted communications with social media companies and voter protection groups.
The Fifth Circuit largely agreed with the district court on the merits of the plaintiffs' First Amendment claims, but narrowed and modified the injunction to prohibit the government from "coerc[ing]" or "significantly encourag[ing]" social media companies' removal, deletion, suppression, or reduction of content. It also removed the ban on the government collaborating with third parties.
While the Fifth Circuit stated the law correctly, its application of the law to the facts is problematic and, if adopted by the Supreme Court, could prevent the government from notifying social media companies of misinformation that is harmful to the public.
For example, the Fifth Circuit found that the FBI acted coercively toward social media companies because it asked them to remove content. It reached this conclusion even though it found no evidence that the FBI's communications were "threatening" and acknowledged that the FBI "did not explicitly state that there would be adverse consequences" if the social media companies did not act. Instead, the Fifth Circuit reasoned that because the communications came from the FBI, which "exercised some degree of authority over the platforms," otherwise permissible communications became coercive.
The implications of this broad interpretation of what constitutes state action are enormous. Government agencies could be barred from meeting with platforms, sharing strategic information, or warning them about disinformation spreading on their platforms, simply because they are government agencies.
The chilling effect of the injunction would extend far beyond the federal agencies it covers. State and local election officials, who are most familiar with state and local election laws, may see the precedent set in this case and actively self-censor, because under the Fifth Circuit's injunction they cannot know what communications are acceptable.
The end result is that, if the injunction is left in place, it jeopardizes the government's ability to work with civil society and social media platforms to protect elections from disinformation, just months before a crucial election.
And recent technological advances, such as generative AI, make it easier for deepfake election-related images, audio, and video to infiltrate social media platforms.
Now is not the time for vagueness. The Supreme Court should clarify that the government is allowed to communicate with social media companies and civil society groups to stem the tide of election disinformation. Our democracy is at stake.
The case, Murthy v. Missouri, U.S., No. 23-411, is scheduled for argument on March 18, 2024.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.
Author information
Mark Epstein is senior counsel for the Digital Justice Initiative at the Lawyers' Committee for Civil Rights Under Law, where he focuses on technology and racial justice cases, including online hate and disinformation.
The Lawyers' Committee joined an amicus brief filed in support of US Surgeon General Vivek Murthy and the other petitioners.