DeKalb family sues Roblox, Discord after child allegedly sent explicit photos to predator


A DeKalb County family is suing the gaming platform Roblox and the chat app Discord after their now 14-year-old son was allegedly groomed by a predator who convinced him to send sexually explicit photos, according to the family’s attorney.

DeKalb family sues Roblox, Discord

The lawsuit, filed in California federal court, where both companies are headquartered, was brought on behalf of the boy’s mother, identified as Jane Doe, with her son listed as John Doe.

The mother is seeking a jury trial and unspecified financial damages, claiming her son has suffered from severe mental health issues, anxiety, trust difficulties, and a loss of safety and innocence following the alleged grooming and exploitation.

Child groomed on Roblox, lawsuit alleges

The case stems from incidents that occurred two years ago when the boy was 12 years old. According to the lawsuit, he was an avid Roblox user, and his mother trusted the platform’s safety features enough to allow him to play and chat freely.

In 2023, the boy allegedly began communicating with a predator posing as another child through Roblox’s chat function. The lawsuit claims the man later moved their conversations to Discord, where he sent graphic messages and coerced the child into sending explicit photos and videos. The mother was unaware this was happening, according to court filings.

“Capitalist greed outweighs humanity,” lawyer says

“This case against Roblox is a terrifying reminder of the world we live in where capitalist greed far outweighs humanity,” said Matthew Dolman, principal of Dolman Law Group, which represents the family. “There have never been sufficient safety measures and protocols in place, putting our youngest and most vulnerable communities into unimaginable harm’s way every second of the day. Without forcing systemic change, Roblox will continue to operate anarchically.”

Discord and Roblox on safety

Discord declined to comment on pending litigation but provided a statement emphasizing its safety measures.

“Discord is deeply committed to safety and we require all users to be at least 13 to use our platform. We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies. We maintain strong systems to prevent the spread of sexual exploitation and grooming on our platform and also work with other technology companies and safety organizations to improve online safety across the internet.”

FOX 5 has reached out to Roblox for comment but has not yet received a response. 

Roblox allows parents to set limits on screen time, content maturity, spending, and privacy. The company states that content labeled as mature is flagged within the app and that users under 13 are subject to strict chat filters. However, critics say these safeguards are not sufficient to prevent predatory behavior.

History of lawsuits against the apps

The lawsuit is one of several recently filed against Roblox and Discord. 

In August, another DeKalb County family sued the platform, claiming their 9-year-old son was sexually exploited through the app.

In Iowa, a family filed suit after their daughter was allegedly groomed and kidnapped by a 37-year-old man she met on Roblox.

In California, a 10-year-old girl was found 250 miles from her home after reportedly communicating with a 27-year-old man via Roblox and Discord. Her family is now suing both companies, accusing them of recklessly operating platforms that allowed grooming and kidnapping to occur.

In 2022, a Clayton County man, Howard Graham, was arrested after allegedly convincing a 13-year-old girl from Kansas to run away and live with him. Police were tipped off by Graham’s roommate, who became suspicious when he realized the girl was not Graham’s stepdaughter.
