Return ROBLOX® to the OG Days

Recent signers:
Anaya Malhotra and 9 others have signed recently.

The Issue

Video games have long been places of creativity, social connection, and self-expression for young people. Platforms such as ROBLOX, Minecraft, Fortnite, Steam, and other online games were built on the idea that players could participate safely without surrendering their identity, biometric data, or constant surveillance.

That is now changing.

Across the UK, US, and other regions, game companies are increasingly introducing AI-driven age verification systems, including facial scans, camera checks, biometric age estimation, and digital ID requirements. These technologies are being framed as “safety features,” but in practice they are invasive, unreliable, and dangerous, especially for children.

Facial recognition and biometric data collection pose serious risks:

Biometric data cannot be changed if leaked or stolen

Centralized storage increases the impact of data breaches

Camera access introduces the risk of unauthorized access, misuse, or exploitation

AI age estimation is not accurate and can wrongly block or restrict users

Forcing or pressuring players to use cameras or submit biometric data creates privacy-based discrimination, excluding those who:

Do not own a camera

Refuse camera access for privacy reasons

Are uncomfortable sharing biometric information

This is especially concerning in games primarily used by children and teenagers.

Roblox and the Removal of Player Choice

On ROBLOX in particular, the situation is worsening. The removal of classic 2D avatar faces and the push toward Dynamic Faces is not just a design decision — it pressures players toward systems that normalize camera use and facial analysis.

In addition:

Previously purchased avatar items have been removed or altered, harming consumers

Players are losing control over how they present themselves

Features are being changed without adequate consent or compensation

Safety should never be achieved by deleting paid items, forcing surveillance-linked features, or eroding player choice.

Safety Does Not Require Surveillance

Concerns about inappropriate interactions in game chats are often used to justify these invasive systems. However, most major platforms already provide:

Parental controls

Chat filters

Account restrictions

Monitoring and reporting tools

When these tools exist, their use is the responsibility of parents or guardians, not a justification for mass biometric surveillance of all players. Punishing every child with facial scanning because safeguards were not used is neither fair nor effective.

A Dangerous Precedent for All Games

If biometric age verification becomes normalized in games, it will not stop with ROBLOX. It will spread to Minecraft, Fortnite, Steam, and future platforms, turning entertainment spaces into identity checkpoints.

Children should not have to trade:

Their face

Their identity

Their privacy

Their purchased digital property

just to play a game.

What We Are Asking For

We call on governments, regulators, and game companies to reject biometric surveillance in gaming, protect children’s digital rights, and promote non-intrusive, privacy-respecting safety solutions.

Games should remain places of creativity and fun — not testing grounds for surveillance technology.

We just want to play games and enjoy them, not live under a privacy nightmare.

Owen R., Petition Starter: "I hate big money greedy corporations."

The Decision Makers

Valve Corporation: Steam Platform & Policy Team

Epic Games: Executive Leadership & Board of Directors

European Commission: Justice & Consumers / Digital Policy

Information Commissioner's Office (ICO): UK Data Protection Authority

United States Government: Federal Trade Commission (FTC) & Congress

Petition created on 29 January 2026