The 2015 murder of the 23-year-old American student Nohemi Gonzalez is about to take center stage in a case that has made its way to the US Supreme Court. She was one of 129 people killed in Paris by a group of ISIS terrorists. Her estate and family members sued Google, claiming that a series of YouTube videos posted by ISIS caused the attack (and her death), and they seek damages under the Anti-Terrorism Act.
At the heart of the Gonzalez v. Google case lies Section 230 of the Communications Decency Act of 1996. This section has been routinely vilified by various political groups, who claim that the protections under this section against civil suits should be struck down. This section contains the following sentence: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Vox writes that this sentence has had a broad impact. “It is unlikely that social media sites would be financially viable, for example, if their owners could be sued every time a user posts a defamatory claim. Nor is it likely that we would have sites like Yelp, or the user reviews section of Amazon, if a restaurant owner or product maker could sue the website itself over negative reviews they believe to be defamatory.”
We’ve written about this section and its implications before. In a post about a speech that Obama gave at Stanford earlier this year, we noted his recommendation that the section be revised and that internet platform owners take more accountability for how they moderate their content. We also covered the rise and fall of the social network Parler last year, and how the section applies (or doesn’t) to that particular site.
What makes the Gonzalez case interesting is that the suit claims that Section 230 shouldn’t shield websites that promote illegal content. The Vox coverage dives into the history of the law and how it came to be enacted. The law fundamentally shaped the internet’s development, as the EFF writes in its own summary of its provisions. If you want to delve deeper into that core sentence, you might want to read the book by Jeff Kosseff, a cybersecurity law professor at the US Naval Academy, entitled The Twenty-Six Words That Created the Internet.
One recent example of Section 230 litigation involves the accidental death of ten-year-old Nylah Anderson, who died after following instructions for the so-called “Blackout Challenge” on TikTok. Her mother filed suit in a Pennsylvania court, arguing that Section 230 should not apply, but the court ruled that TikTok wasn’t liable. Similar cases are pending in other courts.
Another issue is whether or not Section 230 should apply to how a website operator chooses which pieces of content to promote and which to remove. This content moderation is at the heart of modern social media sites. Even if the Supremes decide that website operators still can’t be treated as publishers, the way those operators use algorithms to select and recommend content could be challenged and, depending on how the court rules, could leave them open to suits.
The Gonzalez case isn’t the only one involving Section 230 on the court’s docket. Another case, Twitter v. Taamneh, was filed by the family of a Jordanian citizen killed in another ISIS attack, this time in Istanbul in 2017. Lower courts ruled in favor of the plaintiffs, holding that Google, Twitter, and Facebook could be held liable. That ruling didn’t turn on Section 230, however, and Twitter appealed the decision. It is likely that the Supreme Court will consider both cases together.
Finally, the real wild card in this discussion is Elon Musk’s recent purchase of Twitter, and how that platform will evolve now that the company is private and Musk is in control.