Supreme Court Hears Case on Section 230 Protections for Social Media Platforms


On February 21, the Supreme Court heard arguments in Gonzalez v. Google, a case involving Section 230 of the Communications Decency Act of 1996 in which the family of Nohemi Gonzalez sued Google, Inc. The family alleges that Google’s subsidiary YouTube was responsible for Nohemi’s death at a restaurant during the November 2015 terrorist attacks in Paris. At the February hearing, Eric Schnapper, the lawyer for the Gonzalez family, argued that by employing algorithms to target content created by the Islamic State to viewers susceptible to its message, YouTube had created speech separate and distinct from the original content posted by third-party users. Mr. Schnapper further argued that in doing so YouTube was in essence promoting ISIS propaganda and violating the Antiterrorism Act of 1990 by aiding and abetting an act of international terrorism.

At stake is the fundamental question of whether Section 230’s protection should extend to content that social media platforms recommend or target to their users through complex algorithms designed to identify material of interest to them.

Section 230 is widely regarded as the foundation for the structure of today’s internet. It states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” (47 U.S.C. § 230(c)(1)). Supporters of Section 230 contend that, by granting immunity to online service providers for content posted by third parties on their platforms and by protecting platforms’ decisions to take down content they find objectionable or obscene, the statute has been a major factor in the growth of smaller websites and the expansion of countless internet-based companies. Both sides agree that any ruling diminishing the scope of the statute has the potential to alter the landscape of the internet.

Meta, owner of Facebook and Instagram, is supporting Google in the Gonzalez case, arguing that the algorithms it uses to organize the vast amount of material available to its platforms’ users are necessary to enhance the user experience and that its recommendations are protected under Section 230 to the same extent as any other curated content.

For its part, the Court seemed reluctant to upend the status quo, with justices voicing skepticism over whether a neutral algorithm could constitute either “aiding and abetting” or speech distinct from what the third-party user posted. They also expressed concern about an anticipated flood of litigation and questioned whether the Court or Congress was the appropriate body to decide the issue.

In a second case argued before the Court on February 22, Twitter v. Taamneh, the family of Jordanian citizen Nawras Alassaf claimed that Twitter, Google, and Facebook were responsible in part for the 2017 attack in Istanbul that resulted in Alassaf’s death, because the terrorist content their algorithms recommended to interested viewers aided and abetted the growth of ISIS. That case focuses more on the platforms’ failure to take down the content quickly enough and does not rely on Section 230.

Section 230 may also be relevant to lawsuits against Meta and Snap Inc., parent company of Snapchat, alleging that their social media platforms push psychologically damaging content to vulnerable adolescents. Those lawsuits have been filed in a number of jurisdictions and include cases brought by the Social Media Victims Law Center and the Beasley Allen law firm; they allege that the platforms knowingly and aggressively targeted children with algorithms that directed addictive content to a susceptible audience, resulting in serious mental health conditions.

