According to TechSpot, Roblox is implementing strict age verification requirements that will limit chat access between different age groups starting in December. The system will analyze facial features from player selfies to assign users to six specific age brackets: under 9, 9-12, 13-15, 16-17, 18-20, and 21 and over. Children under 9 will have chat disabled entirely, while other groups can only communicate with strangers in their same age range. The rollout begins in December in Australia, New Zealand, and the Netherlands before expanding globally in January. This comes as Texas Attorney General Ken Paxton and Louisiana officials have filed lawsuits alleging Roblox enables predators, with one case involving a 13-year-old girl manipulated into sharing explicit content.
How age verification works
Here’s how Roblox’s new system actually functions. Players will need to submit a selfie, which the platform’s AI will analyze to estimate their age group. It’s not about getting your exact birthday right – it’s about placing you in the correct bracket. Once you’re slotted into your age group, your chat permissions automatically adjust. Kids under 9 get no chat at all, which honestly seems like the safest approach. Older teens can chat within their bracket, and there’s this Trusted Connections feature for communicating with vetted family members.
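The rules described above are simple enough to sketch in a few lines. The following is a minimal illustration of the bracket-and-permission logic as the article describes it, not Roblox's actual implementation; the bracket names, function names, and the `trusted` flag are all hypothetical.

```python
# Age brackets as reported: under 9, 9-12, 13-15, 16-17, 18-20, 21+.
BRACKETS = [
    ("under_9", 0, 8),
    ("9_12", 9, 12),
    ("13_15", 13, 15),
    ("16_17", 16, 17),
    ("18_20", 18, 20),
    ("21_plus", 21, 150),
]

def bracket_for(age: int) -> str:
    """Map an estimated age to its bracket name."""
    for name, lo, hi in BRACKETS:
        if lo <= age <= hi:
            return name
    raise ValueError(f"invalid age: {age}")

def can_chat(age_a: int, age_b: int, trusted: bool = False) -> bool:
    """Strangers may only chat within the same bracket.
    Under-9s get no chat at all, and a vetted 'Trusted Connections'
    contact (modeled here as a simple flag) bypasses the bracket rule."""
    a, b = bracket_for(age_a), bracket_for(age_b)
    if "under_9" in (a, b):
        return False  # chat disabled entirely for under-9s
    if trusted:
        return True   # vetted family contacts can cross brackets
    return a == b     # otherwise, same bracket only
```

Note that in this sketch the under-9 restriction is checked before the trusted-contact exception, since the article says chat is disabled entirely for that group.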
The safety backlash
Roblox chief safety officer Matt Kaufman told The Guardian this should make chat more trustworthy, but let’s be real – the platform’s reputation has taken some serious hits. We’re talking multiple lawsuits from state attorneys general accusing Roblox of putting “pixel pedophiles and profits over the safety” of children. One Florida attorney has filed 28 separate lawsuits against the company. That’s not just a few isolated incidents – that’s a pattern. And the case of the 13-year-old girl being groomed? That’s every parent’s nightmare.
Privacy and practical concerns
So they’re going to analyze our kids’ faces? I get the safety angle, but collecting biometric data from children raises some serious privacy questions. How secure is that facial data? What happens if the system misclassifies a 12-year-old as 16? Basically, you’re trusting an algorithm to get it right every time. And while Roblox emphasizes their moderators actively monitor chats and filter text by age group, we’ve seen how these content moderation systems can fail before. The company says they’re setting a “gold standard” for safety, but is facial analysis really the answer?
Broader implications
Look, this move by Roblox, detailed in their official announcement, could set a precedent for other gaming platforms dealing with similar safety issues. But here’s the thing: age-gating chat doesn’t solve everything. Predators can still lie about their age or find other ways to exploit the system. And while restricting communication between adults and kids makes sense, does it actually prevent grooming within the same age groups? The platform’s trying to balance safety with functionality, but after so many lawsuits and controversies, they might be playing catch-up rather than leading the way.
