Let users truly delete their data on Discord - protect minors and abuse victims.


The Issue
To the leaders at Discord, and to all who care about the safety of individuals, especially minors:
We are living in a time where our lives, our memories, and our personal experiences are increasingly tied to the digital world. We share our thoughts, our feelings, and sometimes our most vulnerable moments with the click of a button, without realizing how deeply those interactions can shape our lives - and not always for the better.
And yet, we trust platforms like Discord to protect us, to protect our children. We trust them to delete our data when we ask, to give us control over our digital footprint, and to ensure that our mistakes or moments of vulnerability are not left to haunt us forever.
But Discord is not doing that.
When you delete your account on Discord, you might think your data is gone. You might think you've taken the necessary steps to remove your past - perhaps to escape online harassment, or to move on from a mistake. But in reality, Discord keeps your data exposed. The process it calls "anonymization" is little more than a buzzword - a false promise. It removes your username and profile picture, but leaves genuinely everything else - all private messages, all photos, all videos, all conversations - sitting there for anyone to access, including child predators or abusive individuals the user is trying to avoid, who have unfortunately been all too common on the app.
Ironically, deleting your account also strips you of the ability to remove any of your data yourself. Yes, you read that right: "deleting your account" on Discord doesn't delete any of your personal data - it deletes your ability to delete your personal data.
This is not just a technical issue - it's a moral one, and an extremely dangerous one.
I've spoken with people who've experienced firsthand the devastating consequences of Discord's failure to protect users. One person described how they were blackmailed with sexually explicit photos taken of them as a minor, because their private messages - erased in name but not in substance - were still visible to others. These are not isolated cases. They are real people, real lives, affected by this system.
And perhaps most importantly, many of these people are minors. Young children and teenagers, vulnerable and naïve, who are exposed to this flawed system, thinking they can erase their mistakes.
If we don't act, this is going to get worse - far, far worse - as younger generations grow even more dependent on social media.
Parents need to know that when they trust their children with apps like Discord, they are placing them in an environment where their sensitive data may not be safe. They need to know that when their kids delete an account, they are not truly freeing themselves from their past. That vulnerable photo, that private message, that conversation - it could still be out there, and there's nothing anyone can do about it.
This isn’t just a failure of a single platform. It is a failure of all of us, as a society, to protect the most vulnerable among us - the children and the individuals who are exposed to the dark side of the internet. It is a failure of trust.
We are asking for more than just privacy rights. We are asking for the fundamental right to be forgotten - to be able to walk away from a digital life that no longer serves us, to be free of our past mistakes and our vulnerabilities. This is not just about data. It is about dignity, safety, and human respect. We cannot allow platforms like Discord to continue to negligently expose children and everyday people to harm, simply because they failed to do what they promised.
Here’s what needs to happen:
Total Deletion of All Data After Account Deletion: When someone deletes their Discord account, all of their data - everything - should be permanently removed. This includes all messages, images, videos, and conversations. No remnants. No traces. No lingering data that could later be exploited, used to embarrass, or leveraged for blackmail. This is why Discord must honor requests for complete data removal from past “Deleted Users.”
However, this policy should apply specifically to users who have personally and explicitly requested that their data be deleted. Many users are banned under false pretenses or incorrectly flagged by automated or AI-driven systems, and applying blanket deletion in those cases could result in the irreversible loss of important or exculpatory data. For this reason, permanent data deletion should occur by default only when a user has directly requested it, unless a determination is made by Discord Support that an account’s data must be permanently removed for legal or safety reasons (for example, in cases involving CSAM).
It is incredibly easy to view any data created by a Deleted User, because none of it is deleted in any way, shape, or form - except for the user's profile picture. In fact, none of it is "anonymized" from other users either, especially if the other user has already DMed them (e.g. an abusive ex, a child predator).
Currently, Discord's policy deletes the user's ability to delete their data - not the data itself. This, and other practices mentioned on this list, can reasonably be read as potential violations of the GDPR - under which Discord has already faced complaints and enforcement from the French data-protection authority.
Future Deletion Policies Must Be Absolute: Moving forward, when someone chooses to delete their account, they must have the ability to completely delete everything associated with it. No exceptions. It should not be up to Discord to determine what data stays and what data goes.
Allowing Users to Mass Delete Messages in Old Servers or DMs Without Deleting Their Account: No Discord user should be forced to rely on third-party programs or browser extensions - or to negotiate with unsavory individuals - to remove their own messages from old servers or direct messages, especially in situations involving abusive ex-partners or communities with harmful or abusive individuals. While Discord's existing "data package" system technically allows users to access their historical messages, the process is confusing, inaccessible, and effectively limited to the small subset of users with technical expertise and familiarity with tools like Visual Studio Code. Discord should provide a clear, built-in interface that allows users to view and delete their own messages from servers or DMs they are no longer part of. Enabling this functionality would meaningfully reduce power imbalances and blackmail, prevent ongoing harassment or exploitation, and give users real control over their personal data without requiring them to delete their entire account. Discord currently has zero meaningful protections for this issue.
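To illustrate why the data package is inaccessible to non-technical users, here is a minimal sketch of the kind of script a user currently has to write just to see their own message history. It assumes a simplified package layout of `messages/c<channel_id>/messages.json`, each file holding a JSON list of message objects - the real package format has varied over time, so the layout and field names here are assumptions, not Discord's documented schema.

```python
import json
from pathlib import Path

def summarize_package(package_dir):
    """Count messages per channel in an exported data package.

    Assumed layout (hypothetical, for illustration):
        <package_dir>/messages/c<channel_id>/messages.json
    where each messages.json is a JSON list of message objects.
    """
    counts = {}
    messages_root = Path(package_dir, "messages")
    # Channel folders are assumed to be named "c" + channel ID.
    for channel_dir in messages_root.glob("c*"):
        msg_file = channel_dir / "messages.json"
        if msg_file.is_file():
            messages = json.loads(msg_file.read_text(encoding="utf-8"))
            counts[channel_dir.name] = len(messages)
    return counts
```

That an ordinary user must write or find code like this - rather than click a button in the app - is exactly the accessibility gap the demand above describes.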
The ability to "update" metadata: To save server space and speed up loading, platforms like Discord cache copies of linked content - image previews and text summaries of pages, often described as "metadata" - and keep them long after the original has been deleted from its source. For example, if someone posts a picture hosted on an image-hosting site, Discord's cached mirror of that image can remain visible long after the content has been removed from the original website. Users should be able to submit a link to "update" its metadata - prompting Discord's servers to check whether the source still exists and, if it has been deleted, to permanently purge the cached copy from Discord's servers and, most importantly, from access by any user. This should apply not only to images but also to the cached text summaries generated for linked pages. This would especially help in cases of child pornography / CSAM content, doxxing, and private data of any kind, which sleuths often retain long after deletion. Like the others mentioned on this list, Discord currently has zero meaningful protections for this issue.
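The proposed "update metadata" flow could be sketched as follows. This is a hypothetical illustration, not Discord's API: `refresh_cached_preview`, the `cache` store, and the injected `check_status` callable are all stand-ins invented here, with HTTP 404/410 taken to mean the source content is gone.

```python
def refresh_cached_preview(url, cache, check_status):
    """Hypothetical 'update metadata' flow: purge the cached copy if the source is gone.

    url          -- the link a user submits for re-checking
    cache        -- dict mapping URL -> cached preview data (stand-in for the platform's store)
    check_status -- callable(url) -> HTTP status code of the original resource
    """
    status = check_status(url)
    if status in (404, 410):
        # Source was deleted: permanently drop the mirrored copy
        # so no user can access it anymore.
        cache.pop(url, None)
        return "purged"
    # Source still exists: keep the cached preview as-is.
    return "kept"
```

The key design point is that the check is user-initiated: anyone holding the link can trigger a re-verification, so deleted originals do not linger in the mirror indefinitely.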
Additionally, it's important to note that Discord has already faced legal pressure and a serious fine (800,000 Euros) from the French data-protection authority and related EU officials - for negligence that likely caused even less damage than this. [2] Continuing to hold data without users' consent also costs Discord substantial storage space, and exposes it to serious privacy and abuse lawsuits even before counting government action. Discord is a company that needs money, and this raises not only serious ethical concerns but financial ones as well.
This is not just a tech issue. This is a human issue. This is about the safety of our children, the dignity of every individual who has ever made a mistake, and the collective responsibility we all have to protect one another in the digital age.
Every time we choose to trust a platform, we are placing a piece of our humanity in their hands. It is time for Discord to prove they are worthy of that trust. It is time for them to fully honor their users by respecting their right to be forgotten.
This is not just about Discord - it’s about what we, as a society, stand for. Do we stand for the protection of vulnerable people, for the security of personal data, and for the integrity of the online spaces we entrust with our lives? Or do we stand for a future where our digital selves are held hostage by platforms who fail to protect us?
I urge you to sign this petition. Stand up for privacy. Stand up for the safety of our children. Stand up for what’s right. Let’s demand that Discord finally take responsibility for the safety and well-being of its users. The time to act is now.
Petition created on January 1, 2026