Supreme Court to hear cases that could disrupt the immunity of social media companies

The Supreme Court announced Monday that it will hear two cases this term that could significantly change the nature of content moderation on the Internet.

The court has agreed to hear Gonzalez v. Google and Twitter v. Taamneh. In both cases, the question is whether technology companies can be held legally liable for what users post on their platforms, and for content that users see because of the platforms’ recommendation algorithms.

In general, websites cannot be held liable in either case because of Section 230 of the Communications Decency Act of 1996, which states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Nohemi Gonzalez was one of 129 people killed in coordinated attacks carried out by the self-proclaimed Islamic State in Paris in November 2015.

Gonzalez’s father, Reynaldo Gonzalez, argues in his lawsuit against Google that YouTube’s recommendation algorithm aided the terrorist group’s recruiting efforts by promoting its videos to users, in violation of the Anti-Terrorism Act.

In Twitter v. Taamneh, the family of Nawras Alassaf, a victim of a 2017 nightclub attack carried out by the self-proclaimed Islamic State, alleges that social media companies provided material support for terrorism and did not do enough to monitor the group’s presence on their platforms.

As Slate’s Mark Joseph Stern has noted, there is an “ideological consensus” among lower court judges that the time has come to rethink the limits of Section 230.

Last year, Judge Marsha Lee Siegel Berzon of the Ninth Circuit Court of Appeals, a Bill Clinton appointee, urged her colleagues to reconsider the legal precedent surrounding Section 230 “to the extent that it holds that Section 230 extends to the use of machine learning algorithms to recommend content and connections to users.”

In 2020, Supreme Court Justice Clarence Thomas indicated that he was open to hearing arguments about Section 230, writing: “In an appropriate case, we should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms.”

Section 230 has been attacked by both Democrats and Republicans, albeit for different reasons. Former President Donald Trump tweeted “REVOKE 230!” after Twitter started putting fact-checking labels on his tweets. And as a 2020 candidate, President Joe Biden told The New York Times editorial board that Meta CEO Mark Zuckerberg “should be subject to civil liability and his company to civil liability, just as you would be here at The New York Times.”

Others have warned that restricting Section 230 could curb freedom of expression on the web. Its supporters argue that it provides legal protection to small bloggers, as well as to sites such as Wikipedia and Reddit, which could otherwise be held liable for the content of their comment sections or crowd-sourced material.

The Electronic Frontier Foundation, a nonprofit organization dedicated to civil liberties online, has called Section 230 “one of the most valuable tools for protecting freedom of expression and innovation on the Internet,” saying it “creates a broad protection that has allowed innovation and freedom of expression to flourish online.”

Politicians on the right have attacked Section 230 while claiming that social media companies discriminate against conservative views (even though on Facebook, for example, conservative media dominates) and have said that these companies should therefore be subject to the same legal restrictions as traditional publishers.

Ironically, as some observers have pointed out, restricting or repealing Section 230 would likely lead to more restrictions on Internet speech, not fewer.

“It could create a mandate to pre-screen every piece of material that each person posts and lead to an extraordinary degree of moderation and takedowns,” Aaron Mackey, a staff attorney at the EFF, told NPR in 2020. “What any platform would be concerned about is, ‘Am I taking a risk by putting this content on my site?’”
