Wednesday, 22 April 2026
Australia targets Roblox, Minecraft, Fortnite and Steam: Gaming giants forced to explain how they protect kids from predators and extremists

By The South Asia Times

 

SYDNEY – Australia’s eSafety Commissioner has issued legally enforceable transparency notices to four of the world’s largest online gaming platforms -- Roblox, Minecraft, Fortnite and Steam -- amid mounting evidence that sexual predators and extremist groups are exploiting the platforms to target children.

 

The landmark action, announced Wednesday, requires the gaming giants to disclose how they are identifying, preventing and responding to grooming, child sexual extortion, youth radicalisation, cyberbullying and online hate.

 

eSafety Commissioner Julie Inman Grant warned that online games have become a “point of first contact” between offenders and young people, with predators often luring children on gaming platforms before moving them to encrypted messaging services.

 

“What we often see after these offenders make contact with children in online game environments, they then move children to private messaging services,” Ms Inman Grant said.

 

The commissioner cited eSafety’s own research showing that approximately nine out of ten Australian children aged 8 to 17 have played online games -- a vast audience that predatory adults are actively targeting.

 

“Gaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialise and communicate,” she said.

 

“Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms.”

 

Ms Inman Grant detailed specific cases that have prompted regulatory action:

  • Roblox has reportedly hosted Islamic State-inspired games and recreations of mass shootings

  • Minecraft has seen far-right groups recreate fascist imagery

  • Fortnite has featured games gamifying the horrific events of the WWII Jasenovac concentration camp and the January 6th U.S. Capitol Building riots

  • Steam has been described as a hub for a number of extreme-right communities

 

“We’ve seen numerous media reports about grooming taking place on all four of these platforms as well as terrorist and violent extremist-themed gameplay,” the commissioner said.

 

The transparency reporting notices demand that each provider explain how their systems, staffing and “safety by design” choices align with the Australian Government’s Basic Online Safety Expectations.

 

The goal, eSafety said, is to increase pressure on technology companies to adopt Safety by Design principles -- engineering out harms before they occur -- while also providing parents with clear information about safety risks and existing mitigations.

 

“These companies must take meaningful steps to prevent their services becoming onramps to abuse, extremist violence, radicalisation or lifelong harm,” Ms Inman Grant said.

 

Under Australia’s Unlawful Material Codes and Standards, Roblox has already committed to a number of key changes this year, including:

  • More stringent age assurance

  • Making accounts belonging to under-16s private by default

  • Introducing tools to prevent adult users from contacting under-16s without parental consent

eSafety said it will directly test the implementation of these commitments to validate their effectiveness.

 

 

Compliance with a transparency reporting notice is mandatory. If companies fail to respond, eSafety can seek financial penalties of up to $825,000 per day.

 

Beyond transparency notices, online game platforms must also comply with minimum obligations under the Online Safety Codes and Standards. A breach of a direction to comply with a code or standard can result in penalties of up to $49.5 million per breach.

 

Australia’s Age-Restricted Material Codes additionally create new obligations focused on preventing children’s access to high-impact violence and other age-inappropriate content across the online industry.

 

The eSafety Commissioner said the ultimate aim is simple: ensure all users, especially children, can enjoy the benefits these platforms offer without experiencing avoidable harms.

 

“These online game and gaming-adjacent platforms are used by millions of children and so it is imperative that they take every possible step to protect them and continue to improve safeguards,” Ms Inman Grant said.

 

The companies now have a legally defined period to respond to the transparency notices. Their answers -- or silence -- will determine whether Australia moves to impose the full weight of financial penalties.
