Roblox's “Safety Update” Protects Roblox, Not Children


The Issue

In 2025, five U.S. states sued Roblox over its failure to protect children from predatory behaviour, harmful content, and exploitative monetisation. Weeks later, Roblox announced a sweeping "child safety update" centred on age-segregated communication and mandatory ID verification.

The timing raises serious questions about motive. And the update itself won't make children safer; it will make them harder to protect while fracturing the communities that millions of players depend on.

Child safety on Roblox is a real problem that demands real solutions. This update isn't one.


Why This Update Fails Children

Roblox's approach rests on two assumptions: that segregating players by age reduces risk, and that ID verification screens out bad actors. Neither holds up.

Segregation removes oversight, not danger. Roughly 40% of Roblox users are under 13, while 60% are 13 or older, including the roughly 41% of all users who are adults. This isn't a children's platform with a few adults wandering in; it's a genuinely mixed-age community. Development teams, learning groups, creative studios, roleplay clans: many include experienced teens and adults who serve as informal moderators.

Under the new system, younger players will be pushed into age-restricted spaces where these stabilising presences cannot see or intervene. Fewer eyes mean less accountability. Predatory behaviour doesn't disappear when you remove witnesses; it becomes harder to detect. Segregating minors doesn't eliminate risk categories; it may concentrate those risks.

Verification doesn't verify intent. ID checks confirm that someone is the age they claim to be. They do not confirm that person is safe. A verified adult with predatory intentions passes every check. Meanwhile, legitimate players get locked out: younger users, people with privacy concerns, and those in regions with limited ID infrastructure.

The system screens for paperwork, not behaviour. That's not safety. That's compliance theatre.

Moderation tools are being restricted, not strengthened. The update limits developers' and group leaders' ability to review message logs, monitor chat, or track reported behaviour.

You cannot protect children while simultaneously blinding the people responsible for supervision. Effective moderation requires visibility — with appropriate privacy safeguards — not enforced ignorance.
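What "visibility with privacy safeguards" could look like is not mysterious. Below is a minimal sketch in Python, with entirely hypothetical names and no relation to any actual Roblox API, of a review queue in which group leaders see reported behaviour with pseudonymised identities and bounded context, while their own access is audit-logged:

```python
import hashlib
import time
from dataclasses import dataclass, field

@dataclass
class Report:
    """A single player report, as a community leader would see it."""
    report_id: str
    reason: str
    excerpt: str            # only the flagged messages, not full chat history
    subject_pseudonym: str  # stable per-group alias, not the real user ID
    created_at: float

@dataclass
class ReviewQueue:
    group_id: str
    salt: bytes             # per-group secret held by the platform
    reports: list = field(default_factory=list)
    audit_log: list = field(default_factory=list)

    def pseudonymise(self, user_id: str) -> str:
        # Same user -> same alias within this group, so leaders can spot
        # repeat behaviour, but the alias links to nothing outside the group.
        digest = hashlib.sha256(self.salt + user_id.encode()).hexdigest()
        return f"user-{digest[:8]}"

    def file_report(self, subject_id: str, reason: str, flagged: list) -> str:
        report = Report(
            report_id=f"r-{len(self.reports) + 1}",
            reason=reason,
            excerpt="\n".join(flagged[-5:]),  # bounded context, not full logs
            subject_pseudonym=self.pseudonymise(subject_id),
            created_at=time.time(),
        )
        self.reports.append(report)
        return report.report_id

    def review(self, leader_id: str) -> list:
        # Supervision is itself supervised: every leader access is recorded,
        # so oversight of minors' conversations is accountable, not invisible.
        self.audit_log.append((leader_id, time.time()))
        return list(self.reports)
```

The specifics are assumptions, but the point stands: "seeing reported behaviour" and "protecting identities" are compatible design goals, not opposites.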

 
Why This Update Exists

Roblox generated $2.8 billion in gross profit in 2024. The company has the resources to build robust, human-supported moderation with trained staff, transparent enforcement, meaningful appeals, and effective developer tools.

Instead, it appears to have chosen the cheapest path to reduced legal liability: automate verification, segregate users by age, and shift responsibility away from the platform.

This update protects Roblox from lawsuits. It does not protect children from harm.

 
The Privacy Problem No One Asked For

The update requires facial scans and government ID, processed not by Roblox, but by a third-party company most users have never heard of.

Players are being asked to hand over sensitive biometric data to a corporation they didn't choose, on a platform with a documented history of inconsistent enforcement. Biometric data collection is not necessary for effective moderation, yet it's being imposed as a condition of basic participation.

This creates a new category of risk unrelated to child safety: mass collection of children's and adults' biometric information by an outside vendor, with limited transparency about storage, access, or breach protocols.
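None of this is an inevitable cost of checking ages. A common data-minimisation pattern, sketched below in Python with hypothetical names (nothing here describes Roblox's or its vendor's actual system), derives a single retained attestation from the check and persists nothing biometric:

```python
import secrets
import time
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class AgeAttestation:
    """All the platform needs to keep: the claim, not the evidence."""
    token: str        # random handle; links to no image, document, or face
    over_13: bool
    issued_at: float

def verify_and_discard(raw_input: bytes,
                       estimate_age: Callable[[bytes], int]) -> AgeAttestation:
    # `estimate_age` stands in for whatever check is performed (document
    # parse, vendor call). The raw scan exists only inside this function;
    # nothing biometric is stored or returned. Retaining the scan would be
    # a policy choice, not a technical requirement.
    age = estimate_age(raw_input)
    return AgeAttestation(
        token=secrets.token_hex(16),
        over_13=age >= 13,
        issued_at=time.time(),
    )
```

In other words, verification and mass biometric retention are separable; choosing both is exactly that, a choice.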

Parents concerned about their children's safety online should be equally concerned about where their children's facial scans are going.

 
What Will Actually Happen to Communities

Over 15 million players participate in all-ages Roblox groups: military simulations, racing teams, development studios, learning communities, and creative collectives. These aren't just games. They're where players meet, collaborate, mentor each other, and build the experiences that keep the platform alive.

When members can't communicate across age barriers, these groups cannot coordinate events, train new members, or function as teams. Mixed-age collaboration, the foundation of Roblox's creator ecosystem, becomes impossible.

Starting in 2026, even Roblox Studio and Team Create will be affected. Over 90% of Roblox's top 1,000 experiences are built by developers 18 or older, but these teams frequently include younger collaborators learning alongside them. This update breaks those workflows and threatens the very pipeline of talent that Roblox's future depends on.

This isn't a minor inconvenience. It's the slow collapse of the structures that make Roblox more than a collection of isolated games.

 
What We Demand

Roblox must withdraw this update and commit to safety measures that actually protect children:

  1. Hire and train human moderators at scale. Commit to a public staffing ratio appropriate to platform scale, with transparent reporting on response times and resolution rates.
  2. Give developers real moderation tools. Community leaders need visibility into reported behaviour and chat logs for review, with appropriate privacy safeguards for both children and adults, and the ability to act on warnings before harm occurs.
  3. Create a transparent enforcement system. Publish clear, specific moderation guidelines. Provide meaningful appeals with human review. End the current pattern of vague bans and unexplained asset removals.
  4. Make verification optional and privacy-respecting. If age verification is offered, it should be one tool among many, not a mandatory gate that locks out legitimate players and collects biometric data from children.
  5. Report publicly on safety outcomes. Publish quarterly metrics on moderation actions, response times, appeals outcomes, and detected violations. Let the results speak for themselves; a sketch of what such a report could contain follows below.
     
Children deserve protection that works. Communities deserve to exist. Roblox has the resources to deliver both, if it chooses accountability over legal cover.
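
To show that demand 5 asks for nothing exotic, here is a minimal sketch in Python, with hypothetical field names, of how the requested quarterly figures could be derived from one record per moderation action:

```python
from dataclasses import dataclass
from statistics import median

@dataclass
class ModerationAction:
    reported_at: float   # when the report was filed (epoch seconds)
    resolved_at: float   # when a moderator closed it
    outcome: str         # e.g. "actioned" or "dismissed"
    appealed: bool
    appeal_upheld: bool  # True if the appeal overturned the decision

def quarterly_metrics(actions: list) -> dict:
    """Aggregate the figures a platform could publish each quarter."""
    if not actions:
        return {}
    appeals = [a for a in actions if a.appealed]
    return {
        "reports_handled": len(actions),
        "median_response_hours": median(
            (a.resolved_at - a.reported_at) / 3600 for a in actions
        ),
        "violations_actioned": sum(a.outcome == "actioned" for a in actions),
        "appeals_filed": len(appeals),
        "appeal_overturn_rate": (
            sum(a.appeal_upheld for a in appeals) / len(appeals)
            if appeals else 0.0
        ),
    }
```

None of this requires new technology, only the willingness to be measured.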

 


Cue: I wanted to choose a picture that represented what my community means to me, but I realised that no single picture could, even partially, capture what it truly is. Years and years of daily memories that helped me grow up into who I am today. Friends who, despite being far away in other countries, value me as much as I value them. This isn't "just a game". It is part of life, a hobby or second nature for kids and adults alike who have found a beautiful bond with their game and their community. Don't let Roblox take all this away just to shield itself from lawsuits while avoiding the harder, responsible choices.

Wyn W., Petition Starter


The Decision Makers

David Baszucki, CEO of Roblox Corporation
Roblox Corporation

Petition created on December 10, 2025