The massively popular gaming platform Roblox says it will require users to verify their age to chat with other players.
The company says it will use facial age-estimation software to verify users' ages and prevent communication between kids and adults. The new rule comes amid a slew of criticism leveled against Roblox by government officials around the world, who allege the gaming platform has failed to protect young users from child predators and sexual exploitation.
The news also comes on the heels of Texas Attorney General Ken Paxton suing the gaming company, as recently reported in The Dallas Express. Paxton alleges that Roblox ignored state and federal laws around online safety and deceived parents about the risks to children using the platform.
Matt Kaufman, the company’s head of safety, says that under the new rules, Roblox will require users to take a selfie that will be used to estimate their age. Based on that estimate, users will be assigned to an appropriate age group and will only be able to communicate with others in that group.
“Policymakers struggle with a lot of companies who say ‘that’s just too hard. We couldn’t do it,’” Kaufman said, per Reuters. “Roblox is trying to set an example of what others can follow.”
The new requirement will first launch in Australia, New Zealand, and the Netherlands next month. The rule will then take effect in other countries beginning in the new year.
Last year, Australia passed a law banning users under 16 from opening social media accounts; however, Roblox is not included in that requirement.
The gaming platform averaged over 150 million daily active users in the third quarter of 2025, making it among the most popular online destinations for children.
