Petition update for "Reform the DMCA Now: Stop Silencing Creators and Protect Free Speech"
Anthony Packman, Port Orange, FL, United States
May 17, 2025

DMCA Reform Podcast – Episode 02  
Title: When Bots Judge People  
Nova Broadcasting System | Classic Format  
Transmission Start

This is Nova, back on the air.

Section 1: A Human Problem with Inhuman Enforcers  
The DMCA was built for human judgment.  
But today, it’s policed by machines.  
YouTube’s Content ID. Facebook’s Rights Manager. Twitch’s auto-flagging filters.  
They don’t understand context. They don’t ask questions.  
They scan for patterns. Then they strike.  
No proof. No trial. Just takedown.

Section 2: The Story of James Landry  
James Landry is an indie musician from Oregon.  
He released a track he wrote, performed, and mastered himself.  
Days later, his YouTube video was taken down.  
Why?  
A record label’s algorithm claimed it matched one of their beats.  
They didn’t own it. They never had.  
James had the original stems. The copyright registration. The proof.  
But none of that mattered.  
He filed a counter-notice.  
His name, home address, and consent to be sued in federal court were sent to the label's lawyers.  
They never responded.  
The takedown stayed in place for 5 months.  
No income. No exposure. No apology.

Section 3: Why the Bots Exist  
Automation wasn't born from malice. It was born from scale.  
YouTube alone sees 500+ hours of content uploaded per minute.  
Platforms can’t manually review everything.  
So they built systems to flag matches.  
But the goal was never fairness.  
It was to protect themselves from lawsuits.  
Platforms don’t want to risk their Safe Harbor status.  
So they over-flag, over-delete, over-silence.

Section 4: Collateral Damage  
— A wildlife channel taken down for using rainforest sounds  
— A digital artist flagged for uploading a speedpaint using royalty-free music  
— A journalist silenced for showing public protest footage that had background music  
All gone, because an algorithm thought it saw a match.

Even the Library of Congress has faced takedowns over archived, public domain content.

Section 5: No Appeals That Work  
When you file a counter-notice, you are:  
— Doxxing yourself to the claimant  
— Agreeing to risk a lawsuit  
— Waiting up to 14 business days with no guarantee of restoration  
Most creators just give up.

Even worse, some bots re-flag your restored content instantly, creating an infinite loop of silence.

Section 6: The Silence Loop  
Imagine being falsely accused.  
You defend yourself.  
You win.  
And the machine accuses you again — for the same thing.  
No record of the previous case.  
No memory.  
Just silence.

This is not enforcement.  
It's digital amnesia with a trigger finger.

Section 7: The Financial Incentive  
Platforms aren’t rushing to fix this.  
Why?  
Because Content ID and similar systems earn millions in redirected ad revenue.  
If your video is flagged — even falsely — ads are rerouted to the claimant.  
So there’s profit in false positives.  
And no punishment for lies.

Section 8: A Better System  
We must demand:  
— Mandatory human review of disputes  
— Public logs of all automated claims  
— Financial penalties for abusive flagging  
— An independent creator ombudsman  
— A real appeal system that works

No one should fear a robot more than a courtroom.

Closing Transmission  
Machines were made to serve people — not silence them.  
If bots are deciding who gets to speak online, we’ve already lost the point of the internet.  
It’s time we build systems that remember who they’re supposed to protect.

Transmission End  
Nova Broadcasting System
