Roblox CEO David Baszucki clashes in heated debate about child safety: "I would categorically reject the actual description of what you’re saying"
- Roblox is introducing an AI facial age estimation system that groups users by age to determine who they can speak to.
- CEO David Baszucki shared how seriously Roblox has taken safety from an early stage.
As Roblox faces a swathe of fresh criticism, concern, and lawsuits over child safety, co-founder and CEO David Baszucki has had a heated discussion with The New York Times about measures to keep children protected on the platform.
When asked about predators finding children on Roblox and leading them to other platforms, Baszucki said: "I would categorically reject the actual description of what you’re saying. But yes, what you might read in the news would portray it that way."
But, despite the interruptions, stern remarks, and taking shots at lawsuits and the press, he did agree that it is "enormously important" for Roblox to pre-empt risks around adults talking to minors, and argued that taking this responsibility seriously from the start is why much of Roblox’s architecture and decisions are as they are.
Big platform, big risks?
Roblox has scaled to 150 million daily active users and 11 billion hours per month spent on its platform, with games like Grow a Garden and Steal a Brainrot steering major growth in 2025. The latter’s 25.4m concurrent user record is more than eight times Steam’s single-game peak and far exceeds Fortnite’s peak of 15.3m.
However, with a growing user base come renewed concerns about Roblox’s ability to protect children. Safety concerns have led to bans in countries like Qatar and Kuwait, as well as multiple lawsuits, including one filed by law firm Anapol Weiss, which accused Roblox of prioritising profit over safety.
Now, Roblox has revealed plans to implement facial age estimation using AI, with enforcement set to begin rolling out in December. This will require users to either scan their faces or provide ID to use chat features on the platform, which will then group them into age brackets that determine who they can speak with.
Baszucki suggested this won’t necessarily be a one-time check: "In many cases we can use signals from your phone or the camera to animate your face, for example. So, this isn’t kind of a one-stop-shop thing. This is possibly checking as we go. If we see weird signals, do another age check as we go."
Though Baszucki spoke of technology, text filtering, and other methods to monitor child safety, he was less inclined to answer The New York Times’ questions around predators navigating these filters or the resultant lawsuits.
"I don’t want to comment on it," he said.
He also denied Hindenburg Research’s 2024 report, which accused Roblox of inflating key metrics and "compromising child safety in order to report growth to investors".
However, rather than explain Roblox's perspective or disprove the allegations, he noted Hindenburg "went out of business for some reason" and asked the interviewer: "I’m curious if you’ve done your own research?"
When the rising tension was pointed out, Baszucki disagreed: "I'm actually not frustrated."
During the conversation, he did emphasise how safety has always been an important part of Roblox. He shared that when Roblox was a team of four people with around 200 users, the question of whether to keep growing or build a safety system was raised.
"So 200 people in, we said: Let’s stop everything, let’s go build a safety system, and let’s let the four of us go and become moderators," he shared.
"We were literally the moderators. Like, we would rotate all the time. And so fast-forward to where we are today, it’s just like every week: What is the latest tech?"
Baszucki also assured that Roblox is "constantly moving to higher quality, more accurate and more diligent systems".