Roblox has long faced criticism over child safety on its platform. Now it has started settling with state attorneys general over the issue, and the total is climbing fast.
On April 21, Alabama Attorney General Steve Marshall announced a $12.2 million settlement with the child-focused online gaming platform. The State of West Virginia also settled for $11 million the same day. Those came a week after Nevada Attorney General Aaron Ford got the company to hand over $12 million.
Their problem with Roblox is clear from the settlement documents: they believe it hasn’t been adequately protecting children from predators on its platform.
As part of Alabama’s settlement, Roblox must now run age checks on everyone via facial age estimation or a government ID starting May 1. That applies to both new and existing accounts. The company must now also monitor account behavior to catch users who lied about their age.
Adults and under-16s won’t be able to talk with each other at all unless they’re on a “trusted friend” list, added via QR code or a phone-contact import, and users who don’t undergo age verification can’t chat with anyone.
Communication involving any minor cannot be encrypted, so law enforcement can read it during investigations. West Virginia’s settlement also insists that Roblox alert minors the first time they enter a private chat, so children understand how to communicate safely.
Roblox already stopped people from chatting without age verification as of January this year, but under the new measures it will also start restricting access to games for those who don’t complete the process. Starting in June, the platform will split into three tiers: Roblox Kids, for ages 5–8, will forbid chats entirely and only allow access to games labeled ‘minimal’ or ‘mild’ on its maturity scale. Users who don’t complete age verification will face the same restrictions. The other two account levels are Roblox Select for 9–15 year-olds, and standard accounts for those 16 and up.
Three settlements in eight days totaling more than $35 million must hurt, but it’s just the beginning. Texas, Florida, Louisiana, Iowa, Nebraska, Kentucky, and Tennessee are all pursuing similar claims: that Roblox exposed children to risk and then misled parents about its safeguards.
In February, LA County sued Roblox, accusing the platform of choosing profit over safety and leaving kids exposed to grooming and explicit content.
Roblox is also separately dealing with nearly 80 federal lawsuits filed by families in California alone. Meanwhile, Australia’s eSafety Commissioner has issued legally enforceable transparency notices to Roblox and other tech companies, forcing them to detail what they’re doing to protect children. Those notices are backed by fines of A$825,000 a day (about US$590,783) for non-compliance.
The $12.2 million from Alabama’s settlement funds school resource officers through the state’s Safe School Initiative. Nevada’s is earmarked for the Boys & Girls Club and “nondigital activities,” plus a law-enforcement liaison and an online-safety awareness campaign. West Virginia will invest $500,000 in safety education workshops for parents and children, create a $1.5 million three-year public safety campaign, and spend $2.4 million on a dedicated internet safety specialist for six years.
There’s a predictable rhythm to how big tech companies deal with state attorneys general. First comes pushback, then rhetoric about shared values, and then the cash starts changing hands.
It is a step forward that Roblox is agreeing to new safeguards, but questions remain.
In its own lawsuit against Roblox launched last month, Nebraska complained that the company’s existing age-check technology was inadequate. From the complaint:
“Rather than meaningfully protecting children, the system has repeatedly misclassified users’ ages, placing adults in child chat groups and minors in adult categories, while age-verified accounts for young children have already been traded on third-party marketplaces, undermining any purported safety benefits.”
What happens when the age-estimation AI guesses wrong on a 14-year-old who looks 17, or when a “trusted friend” QR code gets passed around a group chat somewhere it shouldn’t?
The company’s Persona age-check tool has also turned out to do more than check ages: researchers say they found an exposed frontend showing the system was also running facial recognition against watchlists.
Settlements address past concerns, but they don’t guarantee future safety. Parents must still do the work to ensure that they know what their kids are signing up for and who else they might be playing with.
For more information about the safety of Roblox and other services, check out our research: How Safe are Kids Using Social Media?
*** This is a Security Bloggers Network syndicated blog from Malwarebytes authored by Malwarebytes. Read the original post at: https://www.malwarebytes.com/blog/news/2026/04/roblox-clamps-down-on-chats-and-age-checks-as-legal-pressure-builds