Close Maine's Loophole Allowing AI-Generated Child Sexual Abuse Images

The Issue
A man in Maine took photos of children at a soccer game and used artificial intelligence to generate sexually explicit images of those same kids. Police know who he is, but they cannot arrest him. Why? Because Maine law doesn't yet recognize these AI-generated images as child sexual abuse material.
This isn’t some hypothetical danger. It’s happening right now. According to Maine State Police, their Computer Crimes Unit has seen a dramatic rise in AI-generated child sexual abuse reports—more than a 300% increase since 2020. Yet due to a legal loophole, these images remain technically legal in Maine, even though 43 other states have already passed laws banning sexual deepfakes or AI-generated child abuse content.
We need to fix this. Law enforcement is ready to act. Prosecutors are asking for change. Survivors and child safety advocates are sounding the alarm. But until Maine law defines AI-altered child images as what they are—child sexual abuse—our state will continue to be a safe haven for some of the most disturbing digital exploitation imaginable.
We call on the Maine Legislature—especially the Judiciary Committee—and Governor Janet Mills to urgently pass legislation that clearly classifies AI-generated or “morphed” images of children as child sexual abuse material.
This is not a partisan issue. It’s a moral one. Children deserve protection, and Maine needs laws that keep pace with technology. Other states have done it. We can, too.
Waiting any longer means more children harmed, more predators unpunished, and more families betrayed by a system that wasn’t built for the AI age. Sign this petition if you agree Maine must finally close this dangerous legal loophole and put kids' safety first.
Photo: Linda Coan O’Kresik/Bangor Daily News
Petition created on September 15, 2025