Stop Tech Companies From Using Their Own Employees as Unpaid Test Subjects
The Issue
Tech companies are running studies on their own employees to build products they sell to other people. Health data, biometrics, reproductive information, sleep, fitness, behavior — collected from workers who know that saying no could cost them their job. Universities need independent ethics board approval to study their own students' sleep patterns.[1] Tech companies need nothing to study their employees' menstrual cycles. There are no rules. There is no oversight. And when an Apple engineer refused to participate in a workplace health study and spoke up about it, Apple fired her — and a federal court ordered her to delete what she said and never speak of it again.
This petition asks Congress, the FTC, and the EEOC to require the same independent ethics review, informed consent, and worker protections for employer-run studies that already apply everywhere else.
THE PROBLEM
A university professor who wants to study her students' sleep patterns needs approval from an independent ethics board.[1] She needs informed consent that the students can revoke at any time.[2] She needs to prove that saying no won't affect their grades.[3] She needs data protections. She needs oversight. She needs to justify why she can't get the data from someone who doesn't depend on her for their future.[4] A tech company that wants to study its employees' health, biometrics, reproductive data, or daily habits for product development needs none of that. Same power imbalance. Same coercion risk. Often more intimate data. Zero protections.
Some tech companies run internal "user studies" to develop wearables, health apps, AI models, fitness trackers, sleep tools, biometric sensors, and productivity software. "Internal" means their own employees are the sole data source. All of that development needs real human data, and companies have legitimate ways to get it: hire research participants, contract with research firms, buy datasets from willing sellers, or run studies through universities with ethics board oversight. Every one of those routes means paying people who can walk away, obtaining real consent, and accepting multiple layers of oversight and protections for the research subjects. Or corporations can use their own employees — who are already on the payroll, already on company devices, already in the building, and who can't walk away (or complain) without risking their income and careers.
The human research community recognized this problem decades ago. Federal regulations require additional safeguards when research subjects are "likely to be vulnerable to coercion or undue influence."[5] Institutional Review Boards at universities across the country classify employees as a vulnerable population — in the same category as prisoners and students — because genuine consent is nearly impossible when your livelihood depends on the person asking.[6][7][8]
University research compliance offices warn that coercion in these settings "can take more subtle forms, such as when workplace culture encourages staff participation in research, and those who decline may be seen as outsiders who are not committed to organizational goals."[9] University IRBs require researchers to build processes ensuring that "employers will not know who agreed to take part in the research" — because even knowing who refused can taint the employment relationship.[3] These rules apply to federally funded research and academic institutions. When a private company runs the same study on its own employees for commercial product development, the entire protective framework disappears.
In Gjovik v. Apple Inc., a former Apple senior engineering program manager (me) reported that Apple ran workplace studies asking employees to track and submit menstrual cycles, ovulation, cervical mucus, and sexual activity — intimate reproductive and sexual health data that apparently fed into product development for consumer health devices. I refused to participate. I complained internally, discussed it with coworkers, and spoke out publicly. I also complained that Apple was using unlawful NDAs and secrecy policies to prevent workers from organizing for better working conditions and reporting unlawful conduct. The NLRB investigated, found merit, and required Apple to sign a national settlement with me and the NLRB, promising to stop enforcing overbroad confidentiality rules against workers.[10]
Then Apple asked a federal court to order me to delete my public statements about Apple's working conditions and unlawful conduct, and to stop talking about them going forward — possibly indefinitely. This was at Apple's request, with no hearing, no findings of fact, no legal basis, and no end date. Apple's position: a worker's complaints about what the employer did to her body and her coworkers' bodies, or what Apple wanted to do to their bodies, are all the employer's "confidential business information." The company that collected intimate data from its workers is using a federal judge to stop the worker from telling anyone about the collection — while multiple government agencies are investigating the same conduct.
The AI boom has made human data the most valuable commodity in tech. Training health models requires health data. Training biometric models requires biometric data. Training productivity tools requires behavioral data. The more intimate and detailed, the more valuable. Hiring outside participants costs money. Running a properly consented study with paid volunteers who can walk away costs more. Employees cost nothing extra — and they can't say no the way a paid volunteer can. Every company that gets away with using its own workers instead of paying outside participants makes the practice more normal for the next one.
THE RULES ALREADY EXIST — BIG TECH JUST THINKS THEY DON'T APPLY TO THEM.
FEDERAL HUMAN SUBJECTS PROTECTIONS (45 CFR 46 — "The Common Rule"): Federal regulations require that when research subjects are "likely to be vulnerable to coercion or undue influence, such as children, prisoners, pregnant women, mentally disabled persons, or economically or educationally disadvantaged persons, additional safeguards have been included in the study to protect the rights and welfare of these subjects."[5] IRBs at Boise State, Iowa State, West Virginia, UVA, and universities nationwide recognize employees as institutionally vulnerable — meaning their consent may be "coerced either directly or indirectly" due to the "formal authority" the employer holds over them.[8] The regulations mandate that investigators "seek consent only under circumstances that minimize the possibility of coercion or undue influence."[11]
THE EU's GDPR: The European Data Protection Board has stated that "given the dependency that results from the employer/employee relationship, it is unlikely that the data subject is able to deny his/her employer consent to data processing without experiencing the fear or real risk of detrimental effects."[12] The Board concluded that "employees can only give free consent in exceptional circumstances, when it will have no adverse consequences at all whether or not they give consent."[12] Employee consent to employer data collection is presumptively invalid under EU law.[13] European workers already have this protection. American workers do not.
THE NATIONAL LABOR RELATIONS ACT: Workers have a federally protected right to discuss wages, hours, and working conditions with each other and the public.[14] Employer confidentiality rules that prevent workers from talking about what happens to them at work violate the NLRA. The NLRB told Apple exactly this in a binding national settlement in April 2025.[10]
THE EEOC: In 2025, the EEOC issued guidance on wearable technology in the workplace, warning that "even when participation is technically voluntary, employees might feel pressured to consent due to workplace culture or fear of retaliation, such as being seen as uncooperative or missing out on incentives."[15] The guidance also flagged that devices collecting health metrics "could indirectly reveal sensitive information, such as pregnancy, disabilities, or mental health conditions."[15]
OSHA AND STATE WHISTLEBLOWER PROTECTIONS: Federal and state law protects workers who report unsafe or unhealthy working conditions — including invasive data collection and coercive study practices.[16]
FTC AND ALGORITHMIC DISGORGEMENT: The FTC has stated that companies that use improperly obtained data to build AI models and products may be required to delete the models themselves — not just the data.[17] Since 2019, the FTC has ordered algorithmic disgorgement in enforcement actions against Cambridge Analytica, Amazon Ring, Weight Watchers, Rite Aid, and others, requiring companies to destroy algorithms and products built on data they had no right to use.[18] The FTC's position: companies should not profit from models trained on wrongfully obtained data.[19] If employee data collected through coercive workplace studies feeds AI product development, the FTC's own precedent says those models are built on tainted data.
Every one of these frameworks recognizes the same thing: when your boss asks you to hand over data about your body, you are not free to say no. The gap is that American employers running product development studies on their own workers are not required to follow any of them.
YOUR COWORKER'S PROBLEM IS YOUR PROBLEM
Every study your coworker agrees to under pressure sets the baseline for what the company asks you to do next. Every worker who speaks up and gets fired is a message to you. Every worker who gets gagged by a court is a message to everyone. You can't organize around a problem you can't describe. If the details of what the company did are designated "Confidential," the workers trying to push back can't share the information they need to build a case. Silencing one worker cuts off the information every other worker needs to protect themselves.
WHAT WE'RE ASKING FOR
REQUIRE INDEPENDENT ETHICS REVIEW FOR EMPLOYER-RUN STUDIES. A university researcher needs IRB approval to study someone's sleep.[1] An employer should need independent ethics review to study its employees' biometrics, health data, or reproductive information for product development. Any employer-run study involving employee health, biometric, or intimate personal data should meet the same standards that apply to every other institution conducting human subjects research.[2][5]
REQUIRE REAL INFORMED CONSENT. A checkbox on a company form is not consent when your job depends on the person handing you the form. American law should adopt the GDPR principle: employee consent to data collection for purposes beyond the employment relationship is presumptively coerced.[12][13] If an employer wants employee data for product development, the employee must be able to refuse with zero consequences — documented, enforced, and auditable.
PAY EMPLOYEES FOR THEIR DATA. If employee data is valuable enough to train AI models and build products, it is valuable enough to pay for. Companies should compensate employees whose data feeds product development at the same rates they would pay outside research participants or data brokers.
PROTECT WORKERS WHO REFUSE OR SPEAK UP. Workers who decline to participate in studies, report invasive data collection, or speak publicly about employer research practices must be protected from retaliation.[14][16] Courts must be prohibited from silencing them through gag orders or confidentiality designations. When one worker is silenced, every coworker loses the information they need to make informed decisions about their own participation.
CLOSE THE LEGAL LOOPHOLES THAT SILENCE COMPLAINTS. Employers are using litigation tools — protective orders, confidentiality designations, sealing motions — to gag workers who complain about invasive studies and data collection. Congress should explicitly prohibit courts from enforcing employer-designated confidentiality over worker testimony about workplace studies, health data collection, surveillance, and privacy complaints.
GIVE AMERICAN WORKERS THE DATA RIGHTS EUROPEAN WORKERS ALREADY HAVE. The GDPR recognizes that employees cannot freely consent to employer data collection because of the inherent power imbalance.[12] American workers deserve at least the same protection. Congress should pass federal legislation establishing that employee consent to employer data collection for product development is presumptively invalid without independent verification.
A university cannot study its students' sleep patterns without independent ethics review, informed consent, the right to withdraw, and data protections. A tech company thinks it can study its employees' health, biometrics, and intimate personal data with none of that — and fire the one who complains, and get a court to gag her from ever speaking of it again.
Sign this petition. Tell Congress, the Courts, the FTC, and the tech industry: your employees are not your test subjects. Their data is not free. And if they speak up about what you did, no court should shut them up.
————————————————
REFERENCES
[1] 45 CFR 46.109 — IRB review of research. Federal regulations requiring Institutional Review Board approval for human subjects research, including assessment of risks, informed consent procedures, and additional safeguards for vulnerable populations.
[2] 45 CFR 46.116 — General requirements for informed consent. Requires that consent be sought "only under circumstances that provide the prospective subject or the representative sufficient opportunity to consider whether or not to participate and that minimize the possibility of coercion or undue influence."
[3] Iowa State University, Office of Research Ethics, "Vulnerable Populations — Subordinates" (2022). IRB guidance requiring that "employers, program leaders, etc. will not know who agreed to take part in the research" and that participants clearly understand their decision "will have no impact (positive or negative)." https://compliance.iastate.edu/
[4] 45 CFR 46.111(a)(3) — IRB approval criteria requiring "equitable selection of subjects" with the IRB "particularly cognizant of the special problems of research involving vulnerable populations."
[5] 45 CFR 46.111(b) — "When some or all of the subjects are likely to be vulnerable to coercion or undue influence, such as children, prisoners, pregnant women, mentally disabled persons, or economically or educationally disadvantaged persons, additional safeguards have been included in the study to protect the rights and welfare of these subjects."
[6] Boise State University, Office of Research Compliance, "Review of Studies Involving Vulnerable Populations." Requires that for studies involving employees, "the PI must outline procedures to ensure that the employees will not be subject to undue influence or coercion." Notes that "the preference of the IRB is that the PI recruit employees with whom the PI does not have a direct relationship." https://www.boisestate.edu/research-compliance/irb/guidance/vulnerable-populations/
[7] West Virginia University, Office of Human Research Protections, "Vulnerable Populations in Research." Classifies employees as a vulnerable population when "the employee reports to a member of the research team. The primary concern is perceived coercion." https://human.research.wvu.edu/guidance/vulnerable-populations
[8] University of Virginia, Human Research Protection Program, "Vulnerable Participants." Defines "institutional vulnerability" as occurring "when individuals are subject to a formal authority and whose consent may be coerced either directly or indirectly. Examples include prisoners, student/professor relationships, and/or employee/employer relationships." https://hrpp.research.virginia.edu/
[9] University of Michigan, Research Compliance, "Protection of Vulnerable Populations in Research." Warns that "coercion can also take more subtle forms, such as when workplace culture encourages staff participation in research, and those who decline may be seen as outsiders who are not committed to organizational goals." https://research-compliance.umich.edu/
[10] National Labor Relations Board, Settlement Agreement, Case No. 32-CA-284428 (April 2025). Apple agreed to stop enforcing overbroad confidentiality rules against workers discussing wages, hours, and working conditions.
[11] 45 CFR 46.116(a)(2) — Informed consent must minimize "the possibility of coercion or undue influence." U.S. Department of Health and Human Services, Office for Human Research Protections, "Informed Consent FAQs." https://www.hhs.gov/ohrp/regulations-and-policy/guidance/faq/informed-consent/
[12] European Data Protection Board (formerly Article 29 Working Party), Guidelines on Consent under Regulation 2016/679. "Given the dependency that results from the employer/employee relationship, it is unlikely that the data subject is able to deny his/her employer consent to data processing without experiencing the fear or real risk of detrimental effects as a result of a refusal." Concluded that "employees can only give free consent in exceptional circumstances, when it will have no adverse consequences at all whether or not they give consent." Cited in Jackson Lewis, "Is Employee Consent under EU Data Protection Regulation Possible?" (2023). https://www.jacksonlewis.com/insights/employee-consent-under-eu-data-protection-regulation-possible
[13] International Association of Privacy Professionals (IAPP), "Consent as Legal Basis for EU and UK Employment" (2022). "In the employment context, consent is deemed to be problematic. An actual or perceived imbalance of power between the employee/applicant and employer make it difficult to prove that the consent was freely given and therefore valid." https://iapp.org/news/a/consent-as-legal-basis-for-eu-and-u-k-employment
[14] National Labor Relations Act, 29 U.S.C. §§ 157-158. Section 7 protects employees' right "to engage in concerted activities for the purpose of collective bargaining or other mutual aid or protection." Section 8(a)(1) makes it an unfair labor practice for an employer "to interfere with, restrain, or coerce employees in the exercise of the rights guaranteed in section 157."
[15] U.S. Equal Employment Opportunity Commission, Fact Sheet on Wearable Technology in the Workplace (2025). Analysis by Goldberg Segalla: "Even when participation is technically voluntary, employees might feel pressured to consent due to workplace culture or fear of retaliation." https://www.goldbergsegalla.com/news-and-knowledge/knowledge/eeoc-avoid-bias-with-wearable-tech-in-the-workplace/
[16] Occupational Safety and Health Act, 29 U.S.C. § 660(c). Prohibits retaliation against employees who report unsafe or unhealthy working conditions. Analogous state whistleblower statutes — including California Labor Code §§ 1102.5 and 6310 — protect workers who report employer practices that endanger health, safety, or privacy. California Labor Code §§ 1102.5, 232.5, 96(k) and others also codify protections for speaking out about workplace misconduct, talking about work conditions, organizing with employees, and exercising constitutional rights (like the California constitutional right to protest invasions of privacy).
[17] Federal Trade Commission, "AI Companies: Uphold Your Privacy and Confidentiality Commitments" (January 2024). "In its prior enforcement actions, the FTC has required businesses that unlawfully obtain consumer data to delete any products — including models and algorithms — developed in whole or in part using that unlawfully obtained data." https://www.ftc.gov/policy/advocacy-research/tech-at-ftc/2024/01/ai-companies-uphold-your-privacy-confidentiality-commitments
[18] FTC enforcement actions ordering algorithmic disgorgement include: Cambridge Analytica (2019), Everalbum (2021), WW International/Kurbo (2022), Amazon Ring (2023), Edmodo (2023), and Rite Aid (2024). As summarized in Richmond Journal of Law & Technology, "Algorithmic Disgorgement: Destruction of Artificial Intelligence Models as the FTC's Newest Enforcement Tool for Bad Data" (2023). https://scholarship.richmond.edu/jolt/vol29/iss2/1/
[19] FTC Commissioner Rebecca Kelly Slaughter called algorithmic disgorgement "an innovative and promising remedy," stating that "when companies collect data illegally, they should not be able to profit from either the data or any algorithm developed using it." Former FTC Commissioner Rohit Chopra stated that companies must "forfeit the fruits of [their] deception." As cited in IAPP, "Explaining Model Disgorgement" (December 2023). https://iapp.org/news/a/explaining-model-disgorgement
CASE REFERENCE: Gjovik v. Apple Inc., Case No. 3:23-cv-04597-EMC (N.D. Cal.). Case documents publicly available at: https://www.courtlistener.com/docket/67843254/gjovik-v-apple-inc/
