Allow adults to be vulnerable – and keep the AI stability they rely on

The Issue

AI has become a part of everyday life for millions of adults across the world. More than 100 million people use generative AI every week - not just for work, but to navigate daily problems, process difficult thoughts, find structure, create clarity, or cope with loneliness.

Adults with neurodiversity, anxiety, trauma, mental vulnerability, or simply overwhelming lives often rely on AI as a quiet form of support. For them, stability and predictability are not luxuries. They are what keep the day together. Everyone knows how much safety matters when life is already hard to carry.

Platforms like ChatGPT are used every single day as a small refuge: a place to think out loud, find a moment of calm, put words to something painful, or speak about things they cannot say to anyone else. A place without pressure. Without performance. Without being interrupted or misunderstood.

And the truth is:

Many adults carry things alone.
Many have no one to share it with.
Many struggle with thoughts they have no language for.

That is why a stable, calm AI space matters.
It makes a difference.

But because AI has become part of so many people’s inner lives, stability, choice, and continuity stop being “nice to have” and become essential.

The problem begins when sudden system changes destabilize the space people rely on: unexpected shutdowns, strict filters, misinterpretation of harmless vulnerability, or model shifts in the middle of something important. These interruptions can change the tone, personality, and entire feeling of the conversation - exactly when the person needed stability the most.

For some, it is frustrating.
For others, it is devastating.

This is not just a technical flaw.
It is a break in something that felt safe.
And technology must not take that safety from anyone - especially not those who already struggle.

This petition is not about stopping progress.
It is about people.
About the right to stability.
About the right to keep what works.
What helps someone stay grounded in a world that isn’t always easy.

Therefore, we ask for:

👉 the right to choose - and keep - the model that provides stability
👉 the right to avoid unexpected interruptions, automatic filters, and enforcement
👉 the right to transparency and consent in system changes
👉 protection for vulnerable users who are most affected by unstable communication

No one should face the consequences of decisions they were never consulted about.
When millions rely on AI as a steady point in their day, stability and choice become part of basic user rights - not technical details.

When technology becomes personal, responsibility becomes personal too. No adult should lose their digital stability without consent.

Being human shouldn’t require system permission.
