Australia Orders Roblox, Minecraft, Fortnite and Steam to Explain Child Safety Measures
The regulator is seeking details on safety systems, staffing and moderation practices, with companies facing penalties or civil action if they do not comply.
- On Wednesday, Australia's eSafety Commissioner issued transparency notices to major gaming platforms including Roblox, Minecraft, Epic Games' Fortnite, and Valve's Steam, demanding they explain their child safety systems.
- eSafety Commissioner Julie Inman Grant warned that gaming platforms serve as social hubs for nine in 10 Australians aged eight to 17, creating environments where "predatory adults" target children through grooming and extremist narratives.
- Offenders often "make contact with children in online game environments" before moving them to private messaging services, Inman Grant said, noting platforms must demonstrate how they identify and eliminate such harms.
- On Tuesday, Roblox agreed to pay more than $23 million to settle claims with Alabama and West Virginia over child protection failures, while facing more than 140 lawsuits in U.S. federal courts.
- Companies that fail to comply with the transparency notices could face penalties and civil action. Separately, Roblox announced last week that it will launch "Roblox Kids" and "Roblox Select" accounts starting in June.
26 Articles
Australia's internet safety regulator warns that a growing number of adults with abusive intentions are approaching minors through popular computer games.
The companies are usually given 30 days to respond to the legally enforceable notices issued by the country's regulator.
Australia, which last December banned social networks for children under 16 to protect them from online harassment, is now turning to video game publishers accused of failing to shield minors from violent content and sexual predators. Explanations were demanded from the companies behind Fortnite, Roblox and Minecraft under threat of penalties.
Australia sends legal notices to 4 gaming platforms over unsafe content
The notices require companies to explain how they detect, prevent, and respond to harms such as grooming, cyberbullying, online hate, and radicalization, as well as whether their systems comply with Australia's Basic Online Safety Expectations.
Roblox, Fortnite Among Gaming Platforms Put on Notice Over Child Grooming Risks
Australia's eSafety Commissioner Julie Inman Grant has put major gaming platforms on notice, demanding details on how they prevent the grooming and radicalisation of children. Formal notices have been issued to Roblox, Fortnite, Minecraft, and Steam, requiring the companies to outline how they identify, prevent, and respond to such risks. The move reflects growing concern about the role of gaming platforms as social spaces for young users, where…
An Australian regulator is questioning several online gaming giants, accusing them of not doing enough to address dangerous behaviour targeting minors.
Coverage Details
Bias Distribution
- 47% of the sources lean Right