Roblox Accused of Enabling Sexual Exploitation of a Girl in US Lawsuit

Teenage Tragedy Sparks Legal Backlash: Roblox and Big‑Name Platforms on the Defense

On Tuesday, Oct 4, a lawsuit was filed in San Francisco Superior Court targeting Roblox Corp, Discord Inc, Snap Inc (Snapchat’s parent) and Meta Platforms Inc (Instagram’s parent). The claim? That these companies allowed a young California girl to fall victim to sexual and financial exploitation by adult men.

Who is the Victim?

  • S.U. – a 16‑year‑old born in 2009 who started playing Roblox at age nine or ten.

How It All Went Wrong

She began chatting with adult men on Roblox in early 2020. These men lured her into joining Discord, Snapchat, and Instagram—all without parental consent. Discord allegedly didn’t verify her age, even though it claims children under 13 are barred from the platform.

The Alleged Exploitation

  • She was encouraged to drink alcohol and abuse prescription medication.
  • She was pressured into sending sexually explicit photos.
  • One man allegedly convinced her to send him money.

Heartbreak & Health Consequences

S.U. suffered severe mental health problems that led to multiple suicide attempts and hospitalization. She and her mother, C.U., allege the tech giants ignored safety protocols, and that Snap and Instagram pushed addictive content to children.

Current Status

Meta did not comment. Roblox and Snap could not immediately be reached for comment. A Discord spokesperson confirmed a “zero‑tolerance policy” for child endangerment but declined to comment on this particular suit.

What’s Next?

The plaintiffs are seeking unspecified damages—no amounts have been disclosed yet. This lawsuit adds to a growing list of legal challenges facing Meta and Snap, many of them involving similar claims of child exploitation and platform negligence.