Stop Facial Recognition Bias Now

The Issue

Facial recognition technology is spreading across the globe—in airports, schools, workplaces, and police forces. It promises safety and efficiency—but for people with dark skin, it too often brings fear, false accusations, and life-altering consequences.

Research shows that facial recognition systems misidentify darker-skinned individuals far more often than lighter-skinned individuals, sometimes hundreds of times more frequently. These are not harmless errors—they can lead to wrongful arrests, harassment, job loss, and lasting damage to personal reputation. Innocent people are being punished, monitored, and treated as suspects simply because of the color of their skin.

Technology is not neutral. Facial recognition systems reflect the biases in the data used to train them. But this bias is not inevitable. We already have alternative technologies that can identify people safely and fairly without discriminating based on skin tone. We must demand that tech companies and governments prioritize solutions that protect everyone equally.

We call for immediate and decisive action:

Independent, public audits of all facial recognition systems to identify and fix bias.
Transparent reporting of accuracy rates broken down by skin tone, gender, and age.
Human oversight in every decision that could affect someone’s freedom, safety, or livelihood.
A pause on law enforcement use of biased facial recognition technology until systems are proven fair and accurate for all.
Investment in equitable alternative technologies that can enhance security without harming vulnerable populations.

No one should live in fear of being falsely identified by an algorithm. Dark-skinned individuals everywhere deserve protection, fairness, and justice. This is not just a technical problem—it is a matter of human rights.

Sign this petition. Share it. Demand change now. Technology must work for everyone, not just some. Every signature brings us closer to a world where justice is not determined by the color of your skin—or the bias of a machine.

The Decision Makers

Christopher Bramwell
Chief Privacy Officer / Director of the Utah Office of Data Privacy
Blake Moore
U.S. House of Representatives - Utah 1st Congressional District
Mike Lee
U.S. Senate - Utah
Doug Fiefia
Utah House of Representatives - District 48
Spencer Cox
Utah Governor

Petition created on October 30, 2025