
“We remain highly concerned by ongoing reports regarding the exploitation of children on the Roblox service and exposure to harmful material,” eSafety Commissioner Julie Inman Grant said in a statement late Monday.
Beyond its usual monitoring, the regulator will specifically test Roblox’s implementation of nine safety commitments the company made last year, including the introduction of tools to stop adults from contacting users under 16 without parental consent. Inman Grant said she wants “first-hand insights into this compliance.”
Roblox is under increasing scrutiny around the world as governments attempt to reduce online harm to children. In cases of non-compliance, eSafety can seek penalties of as much as A$49.5 million (US$35 million) against the company.
In a statement, Roblox said it has “advanced safeguards” that monitor for harmful content and communications. “While no system is perfect, our commitment to safety never ends, and we continue to strengthen protections to help keep users safe,” Roblox said.
The Australian regulator last week blasted major technology companies including Meta Platforms Inc, Apple Inc and Google for failing to stamp out child sexual exploitation and abuse on their services, even after repeated calls to address shortfalls. The country late last year also enacted a world-first social media ban for under-16s.