Petition for Mandatory Vetting and Stronger Safeguarding Across All Online Platforms
The Issue
Trigger Warning
This petition contains references to sexual assault, childhood abuse, PTSD, hospitalisation, and systemic trauma within investigative and legal processes.
Petition for Safer Digital Platforms & Stronger Accountability Across the Technology Sector
With the rapid rise of digital platforms - including dating apps, social media networks, messaging platforms, gaming communities, professional networking sites, and other technology services - creating an online identity has become almost instantaneous.
That access is being exploited.
Individuals with histories of violence, sexual misconduct, coercive control, stalking, and predatory behaviour are able to create profiles with minimal barriers and little meaningful background verification.
Current “verified profile” systems are not sufficient.
A photo check, email confirmation, or blue tick does not protect users from individuals with documented histories of abuse or criminal activity.
Technology companies profit from connection, engagement, and data.
They must also take responsibility for safety.
This is not only about dating apps.
It is about the entire digital ecosystem.
I Speak From Lived Experience
In 2021, I was sexually assaulted by someone I met through a dating app.
As a survivor of childhood abuse, I had always navigated life with vigilance - careful about who I trusted, how I interacted, and who I chose to date. I believed I had done everything “right.”
But that day was different.
In the aftermath, I blamed myself. I told myself I should have been more vigilant. I criticised myself relentlessly for “putting myself in that position.” The shame was heavy and internalised.
It was only through intensive therapy that I learned to hand the shame and blame back to the perpetrator - where it belongs.
That same year, I realised these conversations needed a platform. It became clear that healing cannot remain confined to private therapy rooms - it must inform public discourse and policy reform.
When I reported the assault, I experienced compassionate care from members of An Garda Síochána. The systems activated - including referral to a Sexual Assault Treatment Unit (SATU) - were professional and humane.
However, during the investigative process, I encountered gaps in trauma-informed training that compounded harm. I was treated at times as though I were the suspect rather than the victim.
The legal process became more traumatic than the assault itself.
In January 2022, during the vigil for Ashling Murphy, the full psychological impact of my assault surfaced in a way I could not contain.
On the night of the assault, I feared I might not survive.
At that vigil, the weight of that fear landed: I realised I was “lucky” to be alive.
That moment triggered a severe PTSD episode that resulted in hospitalisation.
During that crisis, I experienced how mental health interventions can intersect with law enforcement procedures under the Irish Mental Health Act. Being processed through a Garda station while in acute psychological distress was deeply retraumatising. I felt treated as though I were a criminal rather than a survivor in crisis.
Both truths can coexist:
There are dedicated professionals doing extraordinary work.
And there are systemic gaps that must be addressed.
This Is Why Prevention Is Not Abstract
The trauma of assault does not end when the incident ends.
It lives in the body. It resurfaces. It reshapes a person’s sense of safety in the world.
It should not take assault, years of therapy, hospitalisation, and public advocacy to expose preventable gaps.
Survivors should not have their therapy notes weaponised in court.
Character references that centre perpetrators over harm must be reconsidered.
Justice cannot retraumatise those seeking protection.
Digital Platforms Must Be Accountable
Today, people meet strangers, build trust, exchange private information, share images, and arrange real-world meetings through:
Dating apps
Social media platforms
Messaging apps
Gaming platforms
Professional networking sites
Community forums
Content-sharing platforms
Yet meaningful background verification is minimal or non-existent.
Once trust is built and vulnerability increases, the consequences of encountering a dangerous individual can be devastating.
Technology companies cannot profit from connection while neglecting protection.
Online platforms that facilitate identity creation, communication, or real-world meetings must carry real-world accountability.
I Am Petitioning For:
1. Mandatory Enhanced Verification Standards Across Digital Platforms
Police-vetted profile options for platforms facilitating in-person meetings
Stronger identity verification systems beyond basic email/photo checks
Transparent verification indicators that cannot be easily manipulated
2. Clear Statutory Duty of Care for Technology Companies
Legal obligations requiring reasonable safeguarding measures
Independent oversight and compliance auditing
Transparent reporting of safeguarding failures
3. Cross-Sector Trauma-Informed Training
Mandatory trauma-informed training for investigative officers
Improved crisis response protocols for survivors experiencing psychological distress
Safeguards within legal processes to prevent retraumatisation
4. Stronger Protections in Court
Full ban on the use of private therapy notes in trials
Review of character reference practices that diminish harm
Prevention Must Come Before Litigation
We cannot continue responding only after harm has occurred.
Communities are built on accountability.
Institutions are built on responsibility.
Digital platforms must reflect both.
This is not about banning technology.
It is about raising the standard.
It is time for the technology sector to move beyond performative verification and into meaningful safeguarding.
Let us ensure that connection - whether romantic, social, professional, or community-based - is a safe, healthy, and protected experience for everyone.

Petition created on 2 August 2021