
Roblox asks "vigilante groups" to leave child protection to experts, AI and law enforcement

The platform is partnered with law enforcement and uses its AI-based Sentinel system to detect signs of exploitation
  • Roblox has warned "vigilante" users they will be removed or banned if found to violate policies.
  • The platform has highlighted its relationship with law enforcement as a proper alternative.

Roblox has highlighted the presence of "vigilante groups or individuals" entrapping bad actors on its platform.

Though acknowledging that these groups appear to have the public interest at heart, Roblox has argued that they may inadvertently increase the risk of more users being exposed to bad actors, particularly when confrontations are turned into "sensationalised" social media content.

Roblox also suggested that vigilantes' actions can delay its own efforts, and those of law enforcement, to crack down on bad actors. The platform will therefore remove or ban users found to violate its policies, even when they acted with good intent.

"Law enforcement professionals are best positioned to investigate potential crimes," said Roblox chief safety officer Matt Kaufman.

"Instead of taking matters into their own hands, we encourage the community to report any inappropriate behaviour directly to Roblox or law enforcement."

Protection through partnerships

Through its partnership with law enforcement, Roblox uses "advanced detection and rapid response protocols" to report potentially illicit activity, violent threats and other real-life harms to the relevant authorities.

The company also participates in events like the Crimes Against Children Conference and has a relationship with law enforcement at international, federal, state and local levels.

This partnership sees Roblox’s dedicated internal teams support requests from law enforcement and help identify and disrupt bad actors on or off the platform, as many such users are also active elsewhere.

"Our stringent safety measures may frustrate bad actors and push them to operate on other platforms. This is positive - we don’t want them on Roblox - but we also know many of our users are active across multiple platforms and we want to contribute to their safety even when they are not on Roblox," Kaufman explained.

"We maintain direct communication channels with organisations such as the FBI and the National Centre for Missing and Exploited Children (NCMEC) for immediate escalation of serious threats that we identify."

Roblox’s safety systems are said to include thousands of trained professionals, as well as its AI-based Sentinel, which has now been open-sourced.