This week, the Supreme Court will hear back-to-back oral arguments in two cases that could significantly change online speech and content moderation.
The outcome of the arguments, scheduled for Tuesday and Wednesday, could determine whether tech platforms and social media companies can be held liable for recommending content to their users or for aiding acts of international terrorism by hosting terrorist content. It marks the first time the Court has considered the scope of a federal law that largely protects websites from lawsuits over user-generated content.
The closely watched cases, known as Gonzalez v. Google and Twitter v. Taamneh, carry significant stakes for the wider Internet. Expanding the legal risks apps and websites face for hosting or promoting content could bring major changes to sites including Facebook, Wikipedia, and YouTube, to name just a few.
The lawsuits have generated some of the most intense rhetoric from the technology sector in recent years about the potential impact on the future of the Internet. US lawmakers, civil society groups, and more than two dozen states have also entered the debate, filing papers with the Court.
At the heart of the cases is Section 230 of the Communications Decency Act, a nearly 30-year-old federal law that courts have repeatedly said provides broad protection for tech platforms, but that has come under scrutiny amid growing criticism of Big Tech's content moderation decisions.
The law has critics on both sides of the aisle. Many Republican officials argue that Section 230 gives social media platforms a license to censor conservative views. Prominent Democrats, including President Joe Biden, have argued that Section 230 prevents tech giants from being held accountable for spreading misinformation and inciting hatred.
In recent years, some in Congress have pushed for changes to Section 230 that could expose tech platforms to more liability, alongside proposals to update US antitrust law and other bills aimed at curbing dominant technology platforms. But those efforts have largely stalled, leaving the Supreme Court as the most likely source of change in how the United States regulates digital services in the coming months.
Decisions on cases are expected by the end of June.
The case involving Google focuses on whether the company can be sued because its YouTube subsidiary algorithmically promoted terrorist videos on its platform.
According to the plaintiffs in the case, the family of Nohemi Gonzalez, who was killed in the 2015 ISIS attacks in Paris, YouTube's targeted recommendations violated US anti-terrorism law by helping to radicalize viewers and promote the ISIS worldview.
The claim seeks to carve out targeted recommendations from Section 230's protections, potentially making tech platforms more accountable for how they run their services.
Google and other technology companies have said that this interpretation of Section 230 would increase legal risks around the ranking, sorting, and curation of online content, a core function of the modern Internet. Google argued that in such a scenario, websites would seek to play it safe by either removing far more content than necessary or abandoning content moderation altogether, allowing even more harmful content onto their platforms.
In friend-of-the-court briefs, Craigslist, Microsoft, Yelp, and others suggested that the claims are not limited to algorithms and could reach virtually anything on the web that could be construed as a recommendation. That could mean even ordinary internet users who volunteer as moderators on various sites could face legal risks, according to a filing from Reddit and several volunteer Reddit moderators. Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, the original co-authors of Section 230, argued to the Court that Congress's intent in passing the law was to give websites wide discretion to moderate content as they see fit.
The Biden administration has also weighed in on the case. In a brief filed in December, it argued that Section 230 does protect Google and YouTube from lawsuits "for failing to remove third-party content, including content it has recommended." But, the government's brief contended, those protections do not extend to Google's algorithmic recommendations, because they represent the company's own speech rather than that of others.
The second case, Twitter v. Taamneh, will decide whether social media companies can be sued for aiding and abetting a specific act of international terrorism when the platforms host user-generated content expressing general support for the group behind the violence, without any reference to the specific terrorist act in question.
The plaintiffs in the case, the family of Nawras Alassaf, who was killed in an ISIS attack in Istanbul in 2017, alleged that social media companies, including Twitter, knowingly aided ISIS in violation of US counterterrorism law by allowing some of the group's content to persist on their platforms despite policies designed to restrict that type of content.
Twitter has argued that the mere fact that ISIS used the company's platform to promote itself does not amount to Twitter "knowingly" assisting a terrorist group, and that in any event, the company cannot be held liable under anti-terrorism law because the content in question was not connected to the specific attack that killed Alassaf. The Biden administration, in its own brief, agrees with that view.
Twitter has also previously argued that it is immune from the lawsuit under Section 230.
Other technology platforms, such as Meta and Google, have argued that if the Court decides tech companies cannot be sued under US anti-terrorism law, at least in these circumstances, it could avoid the Section 230 debate altogether in both cases, because the disputed claims would be dismissed.
However, in recent years several Supreme Court justices have shown an active interest in Section 230 and seemed eager to hear cases related to the law. Last year, Justices Samuel Alito, Clarence Thomas, and Neil Gorsuch wrote that new state laws, such as one in Texas that would require social media platforms to carry content they would rather take down, raise questions of "great importance" about the "power of dominant social media corporations to shape public discussion of the important issues of the day."
A number of petitions are currently pending that ask the Court to review the Texas law and a similar law passed in Florida. Last month, the Court deferred a decision on whether to hear those cases, instead asking the Biden administration for its views.