Kentucky Attorney General Russell Coleman has filed a major lawsuit against Roblox, alleging that the gaming platform has failed to protect children from sexual predators and exposure to violent and graphic content, including user-generated depictions of Charlie Kirk’s assassination.
The complaint, submitted Monday, paints Roblox as a “playground for pedophiles” and claims the company has systematically neglected safeguards for underage users, according to OANN.
Roblox reports more than 111 million daily active users as of August; roughly two-thirds of American children between the ages of nine and 12 are estimated to use the platform.
The lawsuit asserts that predators exploit weak age-verification systems, creating accounts that mimic minors in order to lure young users into unsafe interactions.
The state argues these failures have led to harassment, trafficking, kidnapping and sexual assault targeting children.
The complaint cites the emergence of “Charlie Kirk assassination simulators” following the conservative activist’s death at Utah Valley University last month.
According to the lawsuit, these user-created games allowed children as young as five to view animated depictions of the shooting, exposing extremely young users to graphic violence.
Coleman stressed the systemic nature of Roblox’s shortcomings.
“Roblox is designed to allow predators easy access to children and to use Roblox to groom and lure children from virtual contact to physical meetings, leading to harassment, kidnapping, trafficking, violence, and sexual assault of minors,” he said, according to The Post Millennial.
Kentucky mother of three Courtney Norris shared her experience at a press conference, describing how she initially believed Roblox was safe for children.
“I came to realize, later than I would like to admit, that it is essentially the ‘Wild West’ of the internet, aimed at kids,” she said.
This legal action follows similar cases in other states.
Louisiana claims Roblox has not adequately protected minors from sexual predators.
An Iowa family alleges that their 13-year-old daughter was trafficked across several states and sexually abused after contact with a predator on the platform.
In North Carolina, a mother alleges that Roblox allowed a predator to solicit sexually explicit images of her teenage daughter in exchange for the in-game currency, Robux.
Roblox responded by highlighting its safety measures, including advanced AI systems, a dedicated 24/7 moderation team and the implementation of 100 new safeguards this year, such as facial age-estimation technology.
“No system is perfect, and our work on safety is ongoing. We continue to innovate to protect our community,” the company said.
State attorneys maintain that these precautions remain insufficient, with dangerous content still reaching young users.
The lawsuit seeks to force Roblox to enhance parental controls, improve content filters, and strengthen verification protocols to prevent children from encountering harmful material.
Legal analysts note the case could set a significant precedent for holding tech companies accountable for child safety.
With millions of American children actively using Roblox, state authorities are intensifying oversight to ensure the platform does not serve as a conduit for criminal activity.
The outcome may influence safety practices across the broader gaming and social media industry.