Petition update: Regulate the Use of AI in Talent Software
Upcoming Meeting with the ACLU. Sign, share, and leave your story in the comments.
Maria Rocha, PA, United States
3 Jan 2025

Happy New Year 🎊 

I know some of us are entering with heaviness on our minds and bodies but we can only continue to hope, keep our drive, and look forward to the light at the end of this journey. 

An update: I received a response to the email I sent to the ACLU (American Civil Liberties Union), and I have a meeting scheduled with them for next week.

I’m looking forward to sharing more about their efforts. I’ve seen many articles that point out how this “might” be happening — “AI could be screening out candidates,” etc. — but I haven’t seen any personal, human connection in those articles.

I think there are a few reasons why:

  1. Many don’t realize it’s happening. Statistically, job seekers submit 100–200 applications on average before receiving an offer, with slight shifts across demographic groups. And that’s the high end. 3,000 applications would be a statistical anomaly... yet it’s occurring en masse.

  2. Protected classes include race, sex, age, and disability status. Many algorithms can infer race (from name bias, a race you selected, or your LinkedIn profile). Age can be inferred from your employment history. Disability can be inferred if you’ve checked the box or have gaps in employment. If you are qualified for the role, employment gaps are prohibited from being the sole reason for rejection.

  3. Maybe you went back and changed any of the information above, or chose not to answer those inquiries. However, you likely set up an account or checked the box allowing the company to keep your information on file for future employment matches. That information has been retained.

AI and machine learning are a complex web, and even more data is used in the machine learning process.
