Protect Australian children from risks on Roblox

Recent signers:
Kelly Podolan and 19 others have signed recently.

The issue

Why is a platform that has been repeatedly linked to grooming, coercion, predatory behaviour, sexualised content, and offshore biometric data storage not included in Australia’s under-16 social media ban?

I have been questioning this app ever since I watched my own child delete Roblox after it began to affect his mood, behaviour and self-regulation in ways I couldn’t ignore. The day he removed it, everything shifted. His anxiety eased. His nervous system settled. And I started to see the platform for what it truly was, long before headlines confirmed it: not a game, but a vulnerable and dangerously unregulated social network for children.

Years later, we were featured in the Herald Sun and Daily Telegraph (https://www.dailytelegraph.com.au/news/national/let-them-be-kids/inside-roblox-the-dark-side-of-australias-most-popular-kids-game/news-story/05e6dbfa6bd80ed1f7cb54da5756b85d), which ran a double-page spread in the Herald Sun: an investigation exposing the risks so many families had been noticing in silence. Their reporting detailed grooming attempts, simulated sexual behaviour, adults contacting children in private servers, coercive chat, and financial exploitation, all inside a platform marketed as harmless play. Seeing our experience reflected in print is what pushed me to investigate why Roblox, of all platforms, would be exempt from the new ban.

“A playground where predators can walk straight in disguised as children.”

The Daily Telegraph later published its own national investigation titled “Inside Roblox: The Dark Side of Australia’s Most Popular Kids’ Game”, revealing that Roblox is now the number one gaming platform for Australian children, more popular than Minecraft, Fortnite and YouTube combined. Their reporting exposed sexualised role-play, simulated strip clubs, “dating” games where adults interact with children, and private rooms designed for explicit behaviour that children can access within a few clicks. Experts quoted in the piece called Roblox “a playground where predators can walk straight in disguised as children,” warning that the platform’s lack of real age verification and easily bypassed moderation create a “catastrophic design flaw.” The investigation detailed grooming patterns that begin inside seemingly innocent games before children are moved into private experiences and then pushed onto external apps like Discord or WhatsApp. The Telegraph’s coverage echoed what so many parents had quietly been experiencing: Roblox looks safe, but it is not.

That question has only grown louder as the evidence has stacked up.

In 2024, Bloomberg published “Roblox’s Pedophile Problem,” an exposé revealing how predators deliberately exploit Roblox’s weak moderation, child-dominated environments and user-generated worlds to access minors. The investigation also revealed that Roblox introduced an AI facial-recognition age-verification system, requiring children to scan their faces on camera. Those biometric scans are processed by an offshore third-party vendor.

"With 78 million daily active users today, Roblox has become social media for the youngest generation. Every second, according to Roblox, it processes more than 50,000 chat messages—Hey loser, cute outfit, let’s be friends—through its moderation protocols, a combination of artificial intelligence technology and human workers that the company says scans all user content, including audio and text. Roblox has about 3,000 moderators, significantly fewer than TikTok, which has three times the number of daily users but employs 40,000 moderators. (Roblox says the number of moderators isn’t an indicator of quality.)"

"Since 2018, police in the US have arrested at least two dozen people accused of abducting or abusing victims they’d met or groomed using Roblox, according to data compiled by Bloomberg Businessweek. Some were already on sex offender registries or had been accused of abusing minors; there were also a sheriff’s deputy, a third-grade teacher and a nurse."

“Roblox spends so much time, effort and money convincing parents that their platform is safer than it actually is”

Source: https://www.bloomberg.com/features/2024-roblox-pedophile-problem/

Every parent should feel uneasy reading that. Biometric data isn’t like a password; when it leaks, it leaks forever.

And we already know how this story plays out. Discord implemented a similar age-verification tool, and its biometric provider was hacked. Thousands of face scans, identity documents and ages were leaked across the dark web. The BBC’s reporting was blunt:
“You cannot un-leak a face.”

Source: https://www.bbc.com/news/articles/c8jmzd972leo

Roblox’s own security record offers no comfort.
The platform has a long, well-documented history of data breaches:
• 2016 breach: 50,000+ users’ emails, IPs, purchase logs exposed
• 2022 internal breach: employee account hacked, sensitive documents leaked
• multiple third-party vendor leaks across 2023–2024
(References: HaveIBeenPwned, Onerep Security Analysis)

And still, children are asked to hand over their biometric data.

The Paradigm Shift study by International Justice Mission and Childlight (UNSW), one of the most comprehensive investigations ever conducted, found that offenders are migrating into gaming platforms like Roblox because that is where children gather and where safety systems are weakest.
• 1.8% of Australian men surveyed admitted to livestreaming sexual acts with children
• another 4.7% said they would if offered

Source (summary): https://ijm.org.au/studies/paradigm-shift

Full PDF: https://assets-sea.ijm.org/documents/Childlight_IJM_PreventingLivestreamedAbuseofChildren.pdf

Meanwhile, police agencies worldwide, including the FBI, the AFP, and forces in Malaysia, the UK and across Europe, have confirmed that Roblox is repeatedly used as an entry point for grooming. In the U.S., a federal lawsuit describes a young girl groomed and assaulted by a man she first met through Roblox chat. The Texas Attorney General has sued Roblox for “putting profits over children’s safety.”

Even in Australia, our eSafety Commissioner has forced Roblox to adopt anti-grooming measures under the Online Safety Codes, a step reserved only for platforms where significant harm has been proven.

So again, the question returns:

Why is Roblox exempt?
Why are TikTok, Instagram, Snapchat and Reddit restricted for under-16s, yet a platform with far more documented child-safety failures remains untouched?

Minister Anika Wells is responsible for the framework of this ban. With all the evidence available, police investigations, data breaches, academic studies, media exposés, lawsuits, and regulator action, parents deserve clarity on why the most dangerous child-facing platform was given a free pass.

Because Roblox is not “just a game.” It is a vast, unregulated social ecosystem where children and adult strangers interact freely behind avatars, voice chat, private messages and user-generated worlds.

The issue is simple and deeply uncomfortable: If the goal is protecting children, how can the platform with the highest documented risk be the only one left unregulated?

Two years ago, my son felt something was wrong before I fully understood it. His instincts were right. The evidence now confirms it.

And the question remains: If not Roblox, then what exactly is this ban trying to protect children from?

Let me be clear: I don't agree with the ban. I think it's a band-aid fix that doesn't get to the root cause of the issues surrounding kids and the lack of parent-child connection. But if it's going to happen, let's include the main app that openly houses paedophiles and predators.

My name is Jacintha Field. I'm a Family and Child Counsellor, a mum, and the founder of Happy Souls Kids, on a mission to help keep children safe and to help over 100,000 children learn self-regulation, manage their emotions, and cope with anxiety by 2027.

If you have any further information you feel should be added, please email me at hello@happysoulskids.com, or you can find me at @jacinthafield (Instagram, TikTok, LinkedIn, etc.) or @happysoulskids.

LLG,

Jxo


Petition created on 30 November 2025