Is Section 230 at Risk? The Case of Gonzalez v. Google

Nohemi Gonzalez was a 23-year-old senior at Cal State Long Beach who was spending a semester in Paris studying industrial design. She was killed by Islamic State group gunmen in a series of attacks in November 2015 that left 130 people dead. She was the only American among the victims.

Her family described her as a bright, ambitious, and creative young woman who loved to travel and learn new things. According to the Washington Post, she had dreams of becoming a successful designer and making a positive impact on the world.

Gonzalez v. Google

Her family filed a lawsuit against Google in 2016, seeking damages for wrongful death and emotional distress. They claim that Google’s YouTube platform enabled ISIS to recruit, radicalize, and incite violence through its videos, and that Google failed to take adequate measures to prevent or remove such content.

The lawsuit is one of several that have been filed against internet companies by victims of terrorism or their families, challenging Section 230’s immunity for online platforms.

The lead lawyer for the plaintiffs (Gonzalez's family and other victims of terrorism) in Gonzalez v. Google is Eric Schnapper, a law professor at the University of Washington. He argues that Google is liable for promoting ISIS videos through its YouTube algorithm.

On the other side there is Neal Katyal, a former acting solicitor general and a partner at Hogan Lovells, who represents Google. He argues that Google is immune from liability under Section 230 for any third-party content on YouTube, regardless of how it is displayed or recommended.

The case was filed in 2016 in a federal district court in California. It was dismissed by the district court, and the Ninth Circuit Court of Appeals affirmed the dismissal on Section 230 grounds. The Supreme Court granted certiorari (agreed to hear the case) in October 2022.

The plaintiffs (Gonzalez’s family) argue that Google is not immune from liability under Section 230 because it acted as a content provider, not just a platform. They claim that Google’s algorithm actively recommended ISIS videos to users who were interested in radicalization, and that Google profited from advertising revenue generated by those videos. They also argue that Section 230 does not apply to claims based on international law or human rights violations.

The defendant (Google) argues that Section 230 protects it from any liability for third-party content on YouTube, regardless of how it is displayed or promoted. It claims that its algorithm does not editorialize or endorse any content, but simply reflects user preferences and interests. It also argues that Section 230 applies to all types of claims, including those based on international law or human rights violations.

Other major players have weighed in as well. The Wikimedia Foundation, which operates Wikipedia, recently stated that it expects Section 230 to remain as it is, arguing that otherwise free speech on the internet could be at risk. Firms such as Reddit, Twitter, and Microsoft, along with tech advocates like the Electronic Frontier Foundation, have filed briefs with the court arguing that making platforms liable for algorithmic recommendations would have grave effects on free speech and internet content.

The Supreme Court will have to decide whether Google's algorithm qualifies as an interactive computer service or an information content provider under Section 230, and whether Section 230 has any exceptions or limitations for certain types of claims. The Court held its first hearing on February 21.

Twitter v. Taamneh

Later this week there will be a second hearing in a similar case: Twitter v. Taamneh. Family members of a victim of a 2017 terrorist attack allegedly carried out by IS allege that social media firms are to blame for the rise of extremism. The case targets Google as well as Twitter and Facebook.

What is the purpose of Section 230?

Section 230, enacted in 1996, shields companies like YouTube, Twitter, and Facebook from legal liability for user-posted content. According to civil liberties organizations, the Act also provides vital protections for free expression by granting internet platforms the right to host a wide range of information without undue censorship.

In this case, the Supreme Court is being asked to rule on whether the immunity granted by Section 230 extends to platforms that make "targeted suggestions of information" in addition to hosting content. The outcome of the case will be closely monitored, according to Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights.

“What’s at stake here are the standards for online free expression,” he explained. “This case might help determine whether the big social media platforms continue to provide places for all forms of free expression, from political debates to people publishing their art and human rights activists telling the world what’s wrong in their nations.”

A ban on algorithmic suggestions would have an impact on almost every social media network. After Facebook debuted its News Feed in 2006, an algorithmically curated homepage that surfaces material based on users' online activity, most platforms moved away from simple chronological feeds.

Barrett says that limiting this technology would change the face of the internet. "That's what social media does: it suggests material."
