Section 230 won’t shield TikTok, Snapchat, Meta from lawsuit

Main: U.S. District Judge Yvonne Gonzalez Rogers denied the majority of a motion to dismiss filed by various social media platforms Tuesday. As a result, tech companies will face claims that their platforms are “defective.” (Screengrab via YouTube). Inset: This combination of photos shows logos of Snapchat and TikTok. (AP Photo, File)

Meta, Google, TikTok, and Snapchat must face a major lawsuit over their allegedly “defective” platforms that plaintiffs say cause millions of kids to become addicted. The case, which is awaiting class action certification, survived a motion to dismiss Tuesday despite the companies’ argument that they are entitled to immunity under federal law.

U.S. District Judge Yvonne Gonzalez Rogers, a Barack Obama appointee, ruled Tuesday that Section 230 of the Communications Decency Act of 1996 does not shield the social media giants from products liability claims.

The lawsuit, filed by the Social Media Victims Law Center, alleges that the social media companies target children with platforms purposely designed to prey upon kids' limited impulse control. The complaint charges that the resulting harms range from excessive screen time and the promotion of inappropriate sexual content to dangerous child-adult connections, geolocation features, and more.

The defendant companies argued, much as they have in many other cases, that Section 230 bars the plaintiffs' claims in their entirety because they cannot be treated as "publishers" of third-party content within the legal meaning of the term. Rogers rejected what she called the defendants' "all-or-nothing" position and described Section 230 as "more nuanced" than the companies contended.

Rogers said the lawsuit raised issues about "a wide array of conduct" that could constitute a failure to create safe products or to warn about defects. As examples, the judge cited the failure to provide effective parental controls or options for users to limit their own time on the platforms, the lack of robust age verification, the difficulty users face in reporting predator accounts, the use of appearance-altering filters, and the organization of notifications "in a way that promotes addiction."

Rogers said in her 52-page ruling that the defendant platforms did not sufficiently respond to the plaintiffs' allegations, offering as an example Snapchat's unconvincing argument that it is not a social media platform at all but rather just "a camera application."

Most, though not all, of the plaintiffs’ claims against the tech companies survived. Rogers granted the defense motion to dismiss as to claims about some algorithm and notification features.
