Lawsuit: Charleston 11-year-old groomed by predator on Roblox, Discord

CHARLESTON, S.C. (WCSC) – A lawsuit filed on Thursday accuses two major online platforms of allowing a Charleston 11-year-old to be groomed by an adult predator, resulting in life-altering trauma.
The lawsuit is filed against Roblox and Discord. Roblox is an online gaming platform predominantly used by children, with 111.8 million daily active users. Discord is an instant messaging application with a focus on gaming with friends.
The girl, now 15, met a man on Roblox in 2022 who posed as a peer but was in fact an adult predator, according to court documents. On the platform, the predator allegedly used grooming tactics to manipulate the child and move their interactions to Discord.
On Discord, the predator sent the young girl explicit messages and coerced her into sending a sexually explicit image of herself.
Documents state that after the 11-year-old was sexually exploited, she lost her self-confidence and continues to suffer profound harm from the grooming, manipulation and exploitation she experienced on both of the applications.
Dolman Law Group Partner Stan Gipe filed the lawsuit on behalf of the child and her family and said Roblox is particularly dangerous, as anyone can make a profile on the platform.
“Roblox has become a new playground; kids go out there, they interact with other children and other people on the platform,” Gipe said.
“If I’m a 50-year-old predator, I can go on Roblox and pretend to be a 12-year-old girl, something I could never do in a public place. But on Roblox, I can now pretend to be a 12-year-old girl and approach other 12-year-old girls in that role,” Gipe added. “It allows a predator to open a door, strike up a friendship, and begin the grooming process.”
The child had been an avid Roblox user for years, and documents state her grandmother believed the gaming platform and Discord were safe for children, as both applications are designed and marketed for kids.
A spokesperson for the Dolman Law Group says the alleged predator has not faced criminal prosecution.
“It’s designed as a kids game, right? It’s improperly designed,” Gipe said. “There’s been sex assaults. This stuff has been going on for a long time. Roblox fails to properly warn parents about what the problem is.”
Back in July, Gipe’s firm filed a similar lawsuit involving another underage child on Roblox in Florida. Documents state a predator eventually gained the girl’s trust, escalated his manipulations, and encouraged her to meet in person.
The predator allegedly drove to the girl’s grandfather’s house and lured her into his vehicle. The two drove to a nearby neighborhood where he violently raped and forced her to perform sex acts on him, causing inconceivable trauma, harm and devastation.
“They (Roblox) get out and make affirmative representations about how safe the platform is to leave your kids on. It’s these affirmative representations that lull parents, grandparents and others into a false sense of safety and allow their kid to get on the platform,” Gipe said.
Concerns about Roblox have spread nationwide: Thursday’s lawsuit is the 70th filed in federal court against the company since mid-July, and Discord is named in at least eight of those suits.
The lawsuit accuses Roblox and Discord of spending considerable time and money publicly touting the safety and security of their apps, creating a public perception that both had built a safe environment for kids.
“They could have protected children five years ago if they wanted. They’re starting to take these safety measures that should have been taken years and years and years ago,” Gipe said. “It is a company that values company profits over user safety.”
When contacted about the lawsuit, a spokesperson for Discord released a statement that reads:
“Discord is deeply committed to safety and we require all users to be at least 13 to use our platform. We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies. We maintain strong systems to prevent the spread of sexual exploitation and grooming on our platform and also work with other technology companies and safety organizations to improve online safety across the internet.”
A spokesperson for Roblox released the following statement:
“We are deeply troubled by any incident that endangers our users. While we cannot comment on claims raised in litigation, Roblox’s vision is to be the safest place on the internet, which is why our policies are purposely stricter than those found on many other platforms. We limit chat for younger users, don’t allow the sharing of external images and have filters designed to block the sharing of personal information. We are continually innovating and investing in safety. While no system is perfect, this year alone we have made over 100 safety enhancements to help protect our users and empower parents and caregivers with greater control and visibility.
We also understand that this is an industry-wide issue, and we are working to develop industry-wide standards and solutions focused on keeping children safe online. For instance, Roblox has taken an industry-leading stance on age-based communication and will require facial age estimation for all Roblox users who access our communications features by the end of this year. We partner with law enforcement and leading child safety and mental health organizations worldwide to combat the sexual exploitation of children and Roblox was a founding member of the Tech Coalition’s Lantern project and the Robust Open Online Safety Tools (ROOST).”