(SB-903) Regulate AI in Mental Health Services

Recent signers:
Tracy Clark and 19 others have signed recently.

The Issue


The increasing presence of artificial intelligence in mental health services is not just a technological debate; it's a heartfelt concern for many mental health care professionals. As California State University, Fullerton MSW students, we are invested in the integrity and ethics of mental health care and are witnessing the growing influence of AI systems in therapy settings. The potential for bias, the risk of misinforming patients, and the fear of diminishing job opportunities for dedicated mental health professionals and social workers compel us to take action!

AI systems lack the emotional intelligence and empathy that are fundamental to genuine therapeutic relationships. These technologies, though advanced, cannot replicate the nuanced understanding that human therapists provide. A misdiagnosis or an improperly crafted treatment plan can have lasting adverse effects, harming vulnerable populations who depend on accurate, empathetic interventions.

Evidence suggests that AI models can inherit and perpetuate biases present in their training data. This is concerning in mental health spaces, where sensitivity to cultural, social, and individual contexts is vital. If a black-box AI system forms conclusions based on incomplete, inaccurate, or biased data, it could lead to misjudgments that unfairly or negatively impact clients and communities.

With AI taking on more roles traditionally filled by humans, we risk losing the valuable human touch in therapy sessions and social work settings. There is a unique healing power in human connection, a power that cannot be algorithmically reproduced. By overly relying on AI tools, we inadvertently devalue the work of professional therapists and social workers who dedicate their lives to mental well-being.

Therefore, we urge policymakers, mental health institutions, and technology developers to set clear boundaries on AI's role within mental health services. We need to restrict AI from providing clinical diagnoses, forming treatment plans, and conducting emotional assessments. Instead, AI should serve as a supplementary tool to enhance, not replace, human-driven therapeutic processes.


Join us in advocating for Senate Bill 903 and a future where human values guide mental health care. Sign this petition in support of SB 903 to demand regulations that prioritize ethical standards, ensure human involvement, and protect the integrity of mental health services. Together, we can shape a future that values both technological advancements and the profound impact of human connection. Your signature can help build a mental health care system where safety, empathy, and professionalism remain paramount. Please sign the petition today!


For more information on SB 903: https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=202520260SB903 

 


The Decision Makers

U.S. Senate
2 Members
Adam Schiff
U.S. Senate - California
Alex Padilla
U.S. Senate - California


Petition created on April 21, 2026