A Code to Consent: Do Not Train — Protect Artists from AI Theft
The Issue
Artificial intelligence has changed how music is made, but not how it respects the people who create it.
Every day, songs and performances are scraped from the internet and used to train AI models that copy human voices, mimic production styles, and release entire tracks without consent.
For creators like me, this is not innovation; it is extraction. Years of work, skill, and identity are being repurposed by systems that never asked permission. Music should be a voluntary collaboration, not involuntary replication. Yet it has become painfully clear that no meaningful barriers were ever built to prevent this exploitation.
Protection must begin where music begins. Waiting for governments or legislation to act is no longer enough. The power to protect creators lies with the companies that design the tools we trust.
Who Is Affected and How:
- Independent artists who share music online lose control of their sound.
- Producers and composers whose stems are being repurposed for AI models.
- Vocalists and performers whose voices can be cloned or sold.
- Audiences and cultures that lose trust in what is real and human.
A Code to Consent:
That is why we are calling for A Code to Consent, a built-in safeguard inside every Digital Audio Workstation (DAW) that gives creators the ability to see how their music is used, control its access, and prevent it from being trained or replicated without consent.
Protection must begin where music begins.
What Could This Look Like?
A Code to Consent isn't merely an idea; it is a set of real, implementable tools that can protect creators and place consent at the core of music technology. In doing so, it establishes a new ethical standard within every DAW.
Here’s what this could include:
- A “Do Not Train” consent flag embedded at the project or export level. This gives creators a clear way to signal that their work should not be used in AI training or replication.
- A traceable digital watermark invisibly woven into the audio file. It carries consent status and identifying metadata to help creators track whether their music was misused.
- A protective anti-scraping layer that actively resists unauthorized AI systems trying to analyze, mimic, or extract musical features from a file.
Together, these tools turn consent into action. They give artists the ability to say “no,” to be heard when they do, and to follow up when that “no” is ignored.
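To make the first two tools concrete, here is a minimal sketch of what a "Do Not Train" consent record could look like at export time: a small manifest that carries the creator's consent choices alongside a fingerprint of the audio. Every field name here (the `code-to-consent/0.1` schema, `ai_training`, `fingerprint`) is illustrative, not an existing DAW standard; this is one possible shape, not a specification.

```python
# Hypothetical sketch of a "Do Not Train" consent manifest.
# All field names and the schema label are illustrative only --
# no DAW currently implements this format.
import hashlib
import json

def make_consent_manifest(audio_bytes: bytes, artist: str,
                          allow_training: bool = False) -> str:
    """Return a JSON manifest binding a consent decision to one export."""
    manifest = {
        "schema": "code-to-consent/0.1",     # hypothetical schema name
        "artist": artist,
        "consent": {
            "ai_training": allow_training,   # the "Do Not Train" flag
            "sampling": False,
            "redistribution": False,
        },
        # A SHA-256 fingerprint ties the manifest to this specific audio,
        # so the consent status can be checked against the file later.
        "fingerprint": hashlib.sha256(audio_bytes).hexdigest(),
    }
    return json.dumps(manifest, indent=2)

# Example: flag a (toy) export as "Do Not Train".
audio = b"\x00\x01" * 1024  # stand-in for real exported audio data
print(make_consent_manifest(audio, artist="Example Artist"))
```

In a real DAW this record would be embedded in the file itself (for example in the metadata chunks that WAV and MP3 already support, or bound cryptographically the way provenance standards such as C2PA do), rather than printed, so that the consent decision travels with the music wherever it goes.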
Protection must begin where music begins.
The Solution:
We are calling on all Digital Audio Workstation (DAW) developers, including Steinberg (Cubase/Nuendo), Apple (Logic Pro), Avid (Pro Tools), Ableton (Live), Image-Line (FL Studio), Bitwig, PreSonus (Studio One), Reason Studios (Reason), MOTU (Digital Performer), and others, to take responsibility for the tools that define modern music creation and lead with a unified Code to Consent.
This would be a built-in feature that lets every creator decide how their work can be used, whether for AI training, sampling, or redistribution, and ensures that consent travels with the music wherever it goes.
By integrating this protection at the beginning of every song, we can safeguard ownership before the damage is done.
Music deserves built-in consent, not after-the-fact protection.
Together, these standards address both sides of digital ethics: truth for audiences and consent for creators, with explicit creator consent built in at the point of creation.
Calls to Action:
We are aiming for 1,000 signatures to show that creators will not stay silent.
- Add your name.
- Share this message.
- Join the movement for fairness, trust, and digital dignity in music.
By signing this petition, you’re standing for ethical innovation and digital respect.
Tell DAW companies:
- Write consent into the codebase.
- Build protection in.
- Preserve the right to decide.
Consent in creativity is not optional. It is a digital extension of human authorship and dignity.
Protecting it is protecting the humanity in music itself.
Supporters:
Sign and share under the banner "A Code to Consent: Do Not Train."
Use hashtags like:
#CodeToConsent, #DoNotTrain, and #EthicalDAW.
Developers:
Your platforms built the modern music world. Now protect it.
Embed consent. Preserve authenticity. Give artists the power to choose.
Disclaimer:
This petition advocates ethical innovation and creative rights.
It seeks a global standard within every DAW, enabling artists and developers to build a responsible creative future together.

Petition created on November 13, 2025