Protect Our Voices from AI Scammers


The Issue
AI voice cloning is getting out of hand, and it's affecting all of us, not just Céline Dion, Rihanna, or Scarlett Johansson.
"When I heard his voice I thought for sure it was Darryl," Donna Letto of St. John's told CBC News. Scammers called her in 2023 using an AI-generated clone of her son's voice, claiming he had been in an accident and needed $10,000. She paid.
In January, President Trump revoked an executive order aimed at AI safety, leaving us vulnerable to scams and identity theft. We’re calling on him to reinstate these crucial protections immediately.
What’s Going On?
AI technology has advanced to the point where anyone’s voice can be cloned with just a short audio clip. This isn’t just a problem for celebrities; it’s a threat to everyone.
Scammers are using AI to impersonate people’s voices, tricking family members and even banks into handing over money. Many fraudsters use cloned voices to authorize bank transfers, leading to significant financial losses.
1 in 10 people surveyed in a McAfee study said they had received a message from an AI voice clone, and 77% of those targeted said they lost money as a result.
Artists Are Under Attack Too
Celebrities are also falling victim to this technology. Recently, Céline Dion had to warn her fans about AI-generated songs that falsely claimed to feature her. These unauthorized recordings not only mislead fans but also infringe on artists’ rights and tarnish their reputations.
Sony Music reported removing over 75,000 AI-generated deepfake recordings featuring popular artists like Harry Styles. This highlights how widespread and unchecked the misuse of AI in the music industry has become.
Why We Need Action Now
Without proper regulation, AI voice cloning is spreading like wildfire. The executive order revoked in January provided essential guidelines for controlling the use of this technology. Reinstating it is a critical step toward protecting individuals and artists from exploitation.
Sign this petition to urge President Trump to bring back the executive order on AI safety.
Petition created on 11 March 2025