A Snohomish County family has filed a lawsuit against gaming platforms Roblox and Discord, accusing the companies of failing to prevent a child predator from targeting and grooming their teenage daughter through inadequate safety measures and monitoring systems.
Roblox is a gaming platform where users play a wide selection of games known as “experiences.” The platform has nearly 90 million daily active users, and approximately 40% of its players are under 13 years old.
Discord is a communication application, popular among gamers, that allows players to chat in real time.
“In about 2023, the child met an adult predator who actually posed as a fellow child, who groomed and manipulated that young girl on the Roblox platform,” stated Attorney Sara Beller with The Dolman Law Group representing the family.
Beller indicated that after the predator targeted the young girl on Roblox, the grooming continued on Discord.
“That also presented itself as a safe and trusted platform. On that platform, the predator sent graphic messages and manipulated the girl into sending sexually explicit images of herself,” Beller stated.
Beller alleges the communication continued for months until the 13-year-old’s father discovered his daughter sending sexually explicit pictures to an adult man posing as a peer.
The Dolman Law Group indicates the teen has been traumatized by the experience.
Attorneys stated the family did not file a police report at the time of the incident, saying the family was too distraught to engage with law enforcement. They added that this does not diminish the seriousness of the incident, nor does it prevent the family from seeking accountability through the civil process.
Beller claims Roblox and Discord do not have effective protections in place to keep predators out. Beller also alleges the companies are putting profit over the safety of the most vulnerable.
The Dolman Law Group stated that through its own internal investigation, it uncovered hundreds of user-generated worlds on Roblox titled “Diddy Party,” “Survive Diddy,” “JeffEpsteinSupporter” and “Escape to Epstein Island.”
Beller is accusing Roblox of not catching “experiences” that expose children to sexual and inappropriate themes and conversations.
“They are just now boasting safety measures that they have implemented very, very recently, most likely in response to lawsuits like this one,” Beller stated.
The law firm is seeking unspecified monetary damages for the family, but it also says it wants systemic change at Discord and Roblox.
“If Roblox is going to continue to hold itself out as safe, they need to stop predators that come on their platform and pretend to be children,” Beller stated.
Asked about parental responsibility, and whether parents who let young children onto these platforms should also be accountable for keeping them safe, Beller argued that the companies are misrepresenting the safety of their online platforms.
“Roblox needs to inform parents about the dangers of their platform,” Beller stated.
A Roblox spokesperson provided the following statement:
“We are deeply troubled by any incident that endangers our users. Roblox aims to set the bar for online safety, which is why our policies are purposely stricter than those found on many other platforms. We limit chat for younger users, don’t allow user-to-user image sharing, and have filters designed to block the sharing of personal information.
“Our Community Standards explicitly prohibit the portrayal of sensitive real-world events, and we take swift action against any content or users found in violation. The majority of these ‘real-world event’ experiences are quickly removed from the platform or reported. Unlike our most popular games that have tens of thousands of concurrent players, these violations typically involve a small number of users and are not found organically on the platform.
“We understand that no system is perfect, which is why we are constantly working to improve our safety tools and platform restrictions. We have launched 145 new safety initiatives this year alone. We also recognize this is an industry-wide issue and are working to develop collaborative standards and solutions. For instance, Roblox is implementing an industry-leading policy requiring sophisticated facial age estimation for all users who access our communications features, helping prevent older users from contacting children inappropriately.
“We encourage anyone to report content or behavior that may violate our Community Standards using our Report Abuse feature. We partner with law enforcement and leading child safety and mental health organizations worldwide to combat the sexual exploitation of children and are a founding member of the Tech Coalition’s Lantern project and the nonprofit Robust Open Online Safety Tools (ROOST).”
Discord also provided the following statement:
“Discord is deeply committed to safety and we require all users to be at least 13 to use our platform. We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies. We maintain strong systems to prevent the spread of sexual exploitation and grooming on our platform and also work with other technology companies and safety organisations to improve online safety across the internet.”