Gaming giant Roblox has unveiled a new, mandatory facial age-estimation system for all users wishing to access its chat features, a dramatic move aimed at creating an age-appropriate communication environment and isolating minors from potential adult predators.
The platform, which has faced intense global scrutiny and multiple lawsuits in the US over its alleged failure to protect young users from exploitation, will soon require all players to submit a video selfie for age estimation before they can communicate with others.
Roblox's head of safety, Matt Kaufman, stated the platform is "trying to set an example of what others can follow" for an industry grappling with age assurance challenges.
New age-grouped chat system
The new requirement, powered by third-party vendor Persona, works by having the user take a selfie video which is analysed to estimate their age. Users will then be assigned to one of six specific age groups, ranging from Under-9 to 21+.
The check underpins a strict, age-based chat system that sharply limits cross-group communication. For instance, a user estimated to be 12 will only be able to chat with other players aged 15 and younger, while a user aged 18 will be restricted to talking with those aged 16 and over. Roblox says the images and video used for the check are deleted immediately after processing.
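Taken together, the two examples above are consistent with a simple "same or adjacent band" rule. The sketch below illustrates that interpretation in Python; the six band boundaries and the adjacency rule are assumptions inferred from the article's examples, not Roblox's published specification.

```python
# Hypothetical sketch of an age-banded chat-eligibility rule.
# The band boundaries and the "adjacent band" rule are inferred
# from the article's two examples, not taken from Roblox's spec.

BANDS = ["under-9", "9-12", "13-15", "16-17", "18-20", "21+"]

def band_index(age: int) -> int:
    """Map an estimated age to the index of its band in BANDS."""
    if age < 9:
        return 0
    if age <= 12:
        return 1
    if age <= 15:
        return 2
    if age <= 17:
        return 3
    if age <= 20:
        return 4
    return 5

def can_chat(age_a: int, age_b: int) -> bool:
    """Allow chat only within the same band or an adjacent one."""
    return abs(band_index(age_a) - band_index(age_b)) <= 1
```

Under this reading, a 12-year-old (band 9-12) can reach the under-9, 9-12 and 13-15 bands, i.e. everyone aged 15 and younger, while an 18-year-old (band 18-20) can reach the 16-17, 18-20 and 21+ bands, matching both examples in the article.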
The rollout will begin in select markets—Australia, New Zealand, and the Netherlands—in December, with the requirement expanding globally for all users by January. Players who decline the face scan will still be able to use the platform but will have their chat features disabled.
Mounting legal pressure
The aggressive policy shift comes as Roblox battles severe criticism from government officials and a flurry of legal actions. The platform is currently facing lawsuits from the Attorneys General of Texas, Kentucky, and Louisiana, alongside a growing number of private plaintiffs, all alleging that the company's design and lack of effective screening have made children "easy prey for paedophiles."
Lawyers for affected families contend the company has "recklessly and deceptively" run its business, leading to the systemic sexual exploitation of minors. Despite having industry-leading protocols on paper, critics argue the company has been slow to address pervasive predatory behaviour.
By imposing these mandatory facial age checks, which Roblox calls a "new industry benchmark," the company aims to reassure parents and policymakers that it is taking decisive steps to secure its platform, where millions of children spend substantial time.