A California federal court has ruled that school districts may proceed with lawsuits alleging that social media companies, including Facebook and TikTok, intentionally targeted children and disrupted school operations. The suits allege the platforms harm children’s mental health, forcing schools to allocate additional resources to address the resulting issues.
The decision is part of a larger multidistrict litigation involving hundreds of complaints from schools, local governments, and state attorneys general against Meta's Facebook and Instagram, Google’s YouTube, ByteDance’s TikTok, and Snap's Snapchat. U.S. District Judge Yvonne Gonzalez Rogers of the Northern District of California permitted claims related to harm to minors to proceed, denying the defendants' motion to dismiss. The districts argue the platforms were deliberately designed to foster compulsive use and addiction among young users, resulting in mental and physical harm.
School districts say that algorithms used by these platforms require them to spend heavily on addressing mental health and behavior challenges related to social media addiction. They further argue that platforms lack proper age verification, effective parental controls, and tools for reporting child sexual abuse material. Additionally, they claim the algorithms encouraged addictive behaviors, exposing minors to adult strangers and sharing location information.
This excessive social media use, they argue, has severely disrupted school operations, impeding efforts to educate children safely. To counteract this, many districts increased mental health staffing, expanded programs, introduced social and emotional health classes, and offered mental health training for teachers.
The platforms countered that the alleged harms were “too remote” to support legal action, but Judge Rogers largely disagreed, ruling that the claims adequately stated causes of action under negligence law. The court found the districts had plausibly alleged that the companies breached their duty of care by fostering compulsive use in minors, leading to costly interventions by school districts.
A Google spokesperson disputed the allegations, asserting that the company's policies are designed to promote age-appropriate experiences and offer robust parental controls. Meta and Snap have not commented publicly on the case.
Shortly before this decision, Rogers issued a related ruling that partially dismissed claims filed by attorneys general against social media companies but allowed certain allegations to proceed. Rogers also noted that Section 230 of the Communications Decency Act, which protects platforms from liability for third-party content, would limit some claims in this case but not all.