During its 2023–2024 term, the US Supreme Court issued several speech-related rulings that will serve as precedents for future social media cases. In a series of First Amendment cases, the Court addressed the government’s role with respect to speech on social media platforms. The common thread in these cases is an important constitutional principle that limits the government’s ability to control the online speech of social media users and companies.
Internet users enjoy a First Amendment right to speak on social media, and the government may infringe that right when it interferes with content moderation. However, the apex court has maintained that the right is not infringed by the independent editorial discretion of the social media companies themselves.
NetChoice is a trade group representing some of the largest social media companies in the world, including X (formerly known as Twitter), Meta, Pinterest and TikTok. The group went to court to challenge Republican-backed legislation in Texas and Florida that sought to restrict the power of social media companies to downrank or remove objectionable content.
In July, the Supreme Court directed the lower appeals courts to reconsider two decisions regarding 2021 Texas and Florida laws regulating content moderation by social media platforms. NetChoice challenged both pieces of legislation under the First Amendment’s limits on the government’s ability to restrict speech.
Florida’s law would limit the ability of large social media companies to exclude content by prohibiting the banning or censorship of journalistic enterprises and political candidates. Texas’s law sought to forbid social media companies with at least 50 million monthly active users from censoring users based on viewpoint, and it allowed users or the Texas attorney general to sue to enforce it.
Both NetChoice v. Paxton and Moody v. NetChoice looked at the government’s role as a regulator of social media companies. The Supreme Court did not decide whether the state laws at issue were constitutional in their application. Instead, it remanded the cases to the lower courts to reexamine NetChoice’s claim that the laws have limited constitutional applications.
The court was tasked with examining whether the First Amendment protects the editorial discretion of social media companies. NetChoice argued that without such discretion, including the ability to remove or block users and content or prioritise some posts over others, their platforms would be overrun with bullying, spam, hate speech and extremism. In its ruling, the Supreme Court explained that social media platforms exercise their First Amendment rights through independent editorial decisions on how to organise and display content.
In a previous ruling in Murthy v. Missouri, the US Supreme Court allowed the government to continue requesting the removal of falsehoods and misinformation on social media platforms. The Court reversed a lower court’s ruling that had restricted the government’s communication with social media companies about COVID-19 misinformation.
In June 2024, the justices ruled 6–3 that the plaintiffs lacked standing to pursue the case against the government, with conservative Justices Neil Gorsuch, Clarence Thomas and Samuel Alito dissenting. Although arguments revolved around how social media platforms and governments interact with free speech online, the Supreme Court’s ruling focused on procedural grounds, namely the plaintiffs’ lack of standing.
The ruling was a significant blow to Republican-backed efforts to equate content moderation by social media platforms with censorship. The plaintiffs argued that the government and several federal agencies had coerced platforms into silencing conservative voices through demands to remove misinformation about COVID-19.
Initially, a Republican-appointed federal judge in Louisiana, in a lower court ruling, accused federal agencies of assuming the role of an “Orwellian Ministry of Truth”. That position was later partly affirmed by the US Court of Appeals for the Fifth Circuit, which held that the Biden administration was responsible for social media platforms taking down users’ content and issued an injunction limiting communication between social media companies and the government.
However, the Supreme Court ruled that the Fifth Circuit was wrong in its conclusions and found that the plaintiffs failed to demonstrate that they faced a substantial risk of harm from the government’s actions.
Justice Alito wrote the dissent and alleged that high-ranking government officials placed unrelenting pressure on Facebook for months to suppress Americans’ free speech. He argued that the Court’s ruling provided precedent for future officials who want to control what Americans say, hear and think.
Two other cases from the term, Lindke v. Freed and O’Connor-Ratcliff v. Garnier, examined the role of a government official as a social media user: an account holder who wished to use a platform’s full features, including deleting comments and blocking other users.
Because the two cases were decided together, the Supreme Court directed the lower courts to determine first whether a government official has the authority to speak on behalf of the broader government, and then whether the official used their social media account for governmental purposes in a manner that could trigger First Amendment protections for commenters.
A social media platform’s content moderation practices can constitute expressive activity, similar to a newspaper’s decisions about which op-eds to publish. The Supreme Court touched on how the First Amendment relates to the content moderation laws, finding that social media platforms engage in expressive activity when curating their feeds, and that government restrictions on, or interference with, that activity infringe the rights to speech and freedom of expression.
Following these recent Supreme Court rulings, it will be difficult for governments to regulate content moderation by social media companies. The Court found that the government lacks a valid or substantial interest in changing the mix of content presented on social media feeds.
Although the apex court focused largely on the issues raised by NetChoice, it spoke broadly about the First Amendment challenges triggered when the government’s objective is to correct the mix of speech presented by social media platforms. The court explained that a platform exercising editorial discretion is engaged in speech activity; when the government interferes with that discretion, it alters the content and creates a different opinion page with a different message. As such, the protections recognised in NetChoice will apply equally to social media platforms that decline to moderate content at the government’s behest.
Source: SCOTUS
References:
Electronic Frontier Foundation (EFF)