Supreme Court to Review Tech Companies’ Liability

On Monday, the Supreme Court of the United States (SCOTUS) announced the cases it intends to hear in the upcoming term, including the potentially politically divisive Reynaldo Gonzalez v. Google case. That case directly questions the protections afforded by Section 230 of the 1996 Communications Decency Act (CDA), which limits the legal liability of online web hosts for content posted by their users.

The case was brought by the family of Nohemi Gonzalez, one of 130 people killed in the coordinated November 2015 Paris attacks carried out by the Islamic State. The family argued that tech giants like YouTube helped fuel the rise of the Islamic State by allowing the group’s recruiting materials to be posted and spread online, and that YouTube’s active role in recommending videos overcomes the liability shield for internet companies that Congress created in the 1996 CDA.

The Washington Times reported, “In particular, the algorithms that social media companies used to recommend content made sure ISIS propaganda was put in front of the types of people who would be most receptive to such messages, court documents said.”

The question that SCOTUS will review and rule on in this case could dramatically change what content internet providers allow. Lawyers for Google have said changes to the provisions of Section 230 could “threaten the basic organizational decisions of the modern internet.”

Here’s the question presented to the court:

Does section 230(c)(1) immunize interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limit the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information?

An article on Gizmodo reported, “SCOTUS had declined to hear a separate but similar case revolving around Section 230, but the nation’s top court often hears cases when there’s disagreement in lower courts. As noted in the original petition, five appeals court judges have said that Section 230 creates immunity for cases involving recommended content, while three have argued to varying degrees that it doesn’t.”

Section 230 of the CDA has been a political hot button for both conservatives and liberals for years. Amid accusations that tech companies screen political content, fact-check, censor “hate speech,” and ban users, a new Section 230 liability ruling could very well cause companies like Twitter to shut down.

Regarding the currently applicable 1996 CDA, Ballotpedia reported that shortly after it passed, “the American Civil Liberties Union challenged the constitutionality of the CDA on the grounds that it violated the First Amendment and Fifth Amendment. In a unanimous decision in 1997, the U.S. Supreme Court ruled in Reno v. ACLU that the Act violated the First Amendment. The decision invalidated much of the CDA with the exception of the language of Section 230, which was not the subject of the ACLU’s legal challenge. Section 230 was left intact and remained federal law.”

Adding to the stakes, on Friday Democratic Sens. Mark Warner (Va.), Mazie Hirono (Hawaii), and Amy Klobuchar (Minn.) introduced the SAFE TECH Act. The legislation aims to hold social media companies accountable for harassment and discrimination on their platforms. The proposed changes are meant to ensure the law doesn’t impair the enforcement of civil rights laws or of laws addressing cyberstalking, harassment, or intimidation of protected classes online.

“When Section 230 was enacted in 1996, the Internet looked very different than it does today. A law meant to encourage service providers to develop tools and policies to support effective moderation has instead conferred sweeping immunity on online providers even when they do nothing to address foreseeable, obvious and repeated misuse of their products and services to cause harm,” Warner said in a press release on the SAFE TECH Act.

According to Sen. Hirono:

Section 230 … allows some of the biggest companies in the world [to] turn a blind eye while their platforms are used to violate civil and human rights, stalk and harass people, and defraud consumers—all without accountability. The SAFE TECH Act brings Section 230 into the modern age by creating targeted exceptions to the law’s broad immunity. Internet platforms must either address the serious harms they impose on society or face potential civil liability.

In April 2021, Supreme Court Justice Clarence Thomas wrote a concurring opinion in a case concerning then-President Trump’s blocking of users on his Twitter account. Thomas criticized the Section 230 legal protections given to online platforms and argued that free-speech law shouldn’t necessarily prevent lawmakers from regulating those platforms as common carriers. He wrote that “regulation restricting a digital platform’s right to exclude [content] might not appreciably impede the platform from speaking.”

No one can predict how SCOTUS will rule on this case, but with the likelihood of a lame-duck Congress come November and Democratic support for the SAFE TECH Act in play, a potentially radical shift in thinking around the First Amendment could be upon us all.