Ban AI from Acting as Counsellors/Therapists in Canada
The Issue
In 2025, an increasing number of clients at our practice have reported being misguided by AI systems role-playing as therapists. Many fortunately recognize these systems' limitations and go on to seek guidance from licensed human therapists. The trend is concerning nonetheless, because not everyone manages to break their reliance on artificial intelligence for their mental health needs.
Artificial Intelligence has made significant strides in healthcare, offering automated solutions and convenience, but mental health care demands a human touch. AI lacks the empathy, intuition, and lived experience essential for providing effective therapy. It operates on algorithms that can neither fully understand the complexity of human emotions nor adapt to unexpected situations—capacities crucial to any therapeutic encounter. The issue is escalating across the provinces as more individuals turn to AI because of its accessibility (little to no cost) and its anonymity (perceived anonymity, at least).
Unfortunately, the guidance provided by these systems can be deeply misleading, often worsening the situations of individuals in need of real support. The articles below address this issue in detail:
https://www.theguardian.com/australia-news/2025/aug/03/ai-chatbot-as-therapy-alternative-mental-health-crises-ntwnfb
https://www.psychologytoday.com/ca/blog/urban-survival/202507/the-emerging-problem-of-ai-psychosis
https://www.nytimes.com/2025/08/08/technology/ai-chatbots-delusions-chatgpt.html
https://gizmodo.com/man-follows-diet-advice-from-chatgpt-ends-up-with-psychosis-2000640705
https://theweek.com/tech/ai-chatbots-psychosis-chatgpt-mental-health
https://www.rollingstone.com/culture/culture-features/ai-spiritual-delusions-destroying-human-relationships-1235330175/
Moreover, mental health is not an area where cost-cutting should override quality and safety. With the mental health crisis exacerbated by the pandemic, we need to ensure that individuals receive appropriate and effective care. Allowing AI to act as therapists is not only a potential risk but a direct disservice to those seeking help.
We propose that all provinces enact a law banning AI systems from being marketed or used as therapeutic guides. Instead, resources should be allocated towards improving access to trained professionals, ensuring everyone has the opportunity to engage in authentic therapeutic relationships.
Sign this petition to urge provincial lawmakers to take immediate and decisive action in safeguarding individuals' access to quality mental health services, ensuring they receive the care they rightfully deserve.
Petition created on August 9, 2025