Stop Computer-Generated NDIS Plans

The issue
Stop “Computer-Generated NDIS Plans” — Demand Human Oversight and Fair Appeals
The National Disability Insurance Agency (NDIA) plans to replace human judgment with computer-generated NDIS plans using the I-CAN assessment tool.
Under this system:
- NDIA staff cannot override the computer’s decision,
- participants cannot provide independent medical or personal evidence,
- appeals are heard using the same automated tool,
- complex and fluctuating disabilities are systematically mis-measured,
- people with disability are left unsupported, unheard, and at risk.
This is injustice by design.
Australia has already committed—internationally—to protecting people from harmful automated decisions.
We adopted the OECD AI Principles in 2019, which require:
- human oversight,
- fairness,
- transparency,
- accountability,
- and protections for vulnerable groups.
The NDIA’s proposal breaks every one of these principles.
It also violates Australia’s obligations under the UN Convention on the Rights of Persons with Disabilities, which requires fair process, reasonable adjustment, and meaningful appeal.
No Australian should be denied essential supports because a computer made a mistake — and no-one should be forced into a system that refuses to correct those mistakes.
We call on the Minister for the NDIS and the Australian Government to:
- Stop the rollout of computer-generated NDIS plans until proper safeguards are in place.
- Guarantee human oversight, with the power to amend or overturn automated decisions.
- Restore the right to submit independent evidence.
- Create a genuine appeals pathway — not a closed loop using the same computer tool.
- Ensure all disability assessment tools meet international standards for fairness, transparency, and safety.
People with disability deserve dignity, justice, and real choice — not a rigid automated system that cannot understand their lives.
Sign the petition and help stop injustice by design in the NDIS.

6,929 signatures
Petition created on 3 December 2025