There have been a number of recent developments in litigation and proposed legislation arising out of privacy violations and children’s addiction to social media platforms like TikTok, Facebook, Instagram and Snapchat. Concerned about the negative impact the use of these platforms may have on their youngest users, a number of state attorneys general have turned their attention to investigating how these apps are used by children to the detriment of their physical and mental health.
In particular, on March 2, 2022, attorneys general from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee and Vermont banded together in a bipartisan coalition tasked with investigating the popular app TikTok to determine whether the company is violating safety and privacy protections that have been established for younger users, including those outlined in the Children’s Online Privacy Protection Act, which took effect in 2000 and focuses on users under the age of 13.
The coalition has expressed concern that TikTok is manipulating children into spending extended periods of time on the app, resulting in physical and mental harm, in addition to promulgating misleading and unrealistic social or body images to which children feel incapable of measuring up. The group is also looking into the extent to which TikTok knew of these risks and failed to protect its younger users.
A similar probe was initiated in November 2021 by the same attorneys general, joined by Letitia James, Attorney General for New York, aimed at investigating Instagram and its parent company Meta. That inquiry cites the methods by which the platform increases the frequency and duration of users’ engagement, and expresses concern that the resulting “social media addiction” caused physical or mental health harm in violation of states’ consumer protection laws. Facebook had already been the focus of attention after a former product manager testified before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security in October 2021 that the social media giant had hidden internal research showing that extended use of its platform harmed children, created significant privacy issues and fostered misinformation, with no oversight.
In addition, on March 15, two California lawmakers from both sides of the aisle introduced a bill seeking to hold the companies behind the most popular platforms, such as Facebook, Instagram and TikTok, liable for the social media addiction that results from the targeting of children and children’s excessive use of the apps. Assemblymembers Jordan Cunningham (R-San Luis Obispo) and Buffy Wicks (D-Oakland) introduced A.B. 2408, the Social Media Platform Duty to Children Act, which is aimed at allowing parents and children harmed by the companies to sue for damages. The legislation seeks to make companies that “knew or should have known” that their platforms were addictive and potentially harmful liable for civil penalties, and would apply to companies earning more than $100 million in annual revenue.
This growing concern expressed by lawmakers can also be seen in the founding of groups such as the Social Media Victims Law Center. Created by attorney Matthew Bergman, the group is dedicated to holding social media companies accountable by engaging in civil litigation that will help bring about consumer safety and awareness of the issues and risks. Mr. Bergman represents the family of Selena Rodriguez in a lawsuit against Meta Platforms (Facebook), Snap Inc. (Snapchat), TikTok and ByteDance Inc. that alleges the 11-year-old’s suicide was the direct result of her social media addiction. A similar case has been filed by the family of a 14-year-old who developed a social media addiction that resulted in the young girl’s hospitalization.
Most recently, on March 28, U.S. District Judge John Robert Blakey of the Northern District of Illinois approved a $1.1 million settlement that will resolve claims that TikTok and its parent company collected and shared the personal information of minor users without parental consent.