Roblox introduces new age-based accounts as it fights to alleviate safety concerns
- Users under nine will be given Roblox Kids accounts, while players aged nine to 15 will have Select access.
- Parents will be able to control game and chat access.
- Later this year, Roblox will transition to the International Age Rating Coalition framework.
Roblox is set to roll out new age-based accounts in June for under 16s that will restrict certain platform features in an effort to alleviate concerns around child safety.
The new account system includes Roblox Kids for users aged under nine, Select for those aged nine to 15 and regular accounts for 16+.
Kids accounts will be assigned to users aged five to eight, as determined by Roblox’s own age-check tech or by a verified parent. These accounts will be limited to games carrying a minimal or mild content maturity label that have gone through the company’s three-step selection process, which includes AI asset scanning, ongoing user reports and moderation. All communication is disabled by default.
Select accounts for users aged nine to 15 will see chat gradually introduced with safeguards, the company said. Access to games will be limited to games with content maturity labels up to and including moderate.
Accounts will automatically progress between tiers as users age. Any user who hasn’t completed an age-check will be limited to games rated minimal or mild, and all communication will be unavailable.
Parental controls
Roblox said it will also be extending parental controls, including the ability to block individual games and manage direct chat settings up to the age of 16.
Though Roblox Kids accounts will have public chat blocked, Select accounts will have filtered public communication. During a press briefing, PocketGamer.biz asked why this wasn’t turned off by default.
“We believe that in the Roblox Select age range between nine and 15 it becomes more age-appropriate to have access to communication, which is filtered,” Roblox chief safety officer Matt Kaufman told us.
“Our default communication is all filtered. There’s no sharing of images, there’s no sharing of video and we monitor everything proactively for any harms and we actively report any issues that we find and take actions on accounts which are problematic.
“I believe that sets us apart from just about every other platform in the world that has any type of communication.
“The other thing to note is that users are restricted to only chatting with others of similar ages to themselves, unless they create something called Trusted Friends, which does require parental consent in order to create, for certain ages in the Roblox Select accounts.”
Roblox VP of safety product policy Eliza Jacobs added: “We want to build a system where they can safely chat and play with their friends. Chatting is part of gameplay in a lot of these games, where they have to co-ordinate on teams in order to be able to play together. So we want to allow that kind of pro-social communication within our safety systems.”
Restricting content
The company said the update brings age-checks, account-level defaults, content ratings, ongoing moderation and parental controls together into a unified framework.
Later this year, Roblox will transition to the International Age Rating Coalition framework, a globally recognised standard for content classification. This includes ESRB in the US and PEGI in much of Europe and the UK.

Scrutiny around Roblox's safety and parental control features has intensified in recent years as countries such as Egypt, Russia, Türkiye and others have blocked the platform.
In February, it was reported that Los Angeles County had filed a lawsuit against Roblox, alleging the platform has “failed to protect children from predatory behaviour”. Last week, a new BBC report cited a case in which a 14-year-old girl was groomed by a man she met on the platform.
Roblox CEO David Baszucki engaged in a heated debate over the issue last year, stating it was "enormously important" for the platform to pre-empt risks around adults talking to minors.