Reform Section 230 to protect small businesses from false reviews
The Issue
Small business owners across America are under attack—not by competitors, but by anonymous internet users posting false, defamatory reviews on platforms like Google Reviews, Yelp, TripAdvisor, and Facebook. Under Section 230 of the Communications Decency Act, these multibillion-dollar corporations face zero accountability for the lies posted on their platforms.
WHAT'S HAPPENING TO SMALL BUSINESSES
Thousands of small business owners have seen their professional reputations damaged by false reviews that these platforms refuse to remove. When clearly defamatory content is reported, business owners are met with automated responses and bot-driven "moderation" that accomplishes nothing.
This is happening to small businesses everywhere:
- Restaurants destroyed by fake health complaints
- Medical practices defamed by fabricated patient experiences
- Service providers libeled by people who were never customers
- Local shops losing business to coordinated false-review attacks
- Professional services firms damaged by malicious lies
WHY PLATFORMS LIKE GOOGLE AND YELP WON'T HELP
Google, Yelp, and other tech giants have the resources to properly moderate their platforms. Google's parent company Alphabet has a market cap exceeding $1.5 trillion. Yelp generates hundreds of millions in annual revenue. These aren't struggling startups—they're multinational corporations with virtually unlimited resources.
So why won't they invest in real moderation? Simple: Section 230 gives them broad immunity from liability for user posts, so there's no legal incentive to spend money on actual human oversight. It's cheaper to let bots handle everything and let small businesses suffer.
THE BOT PROBLEM
Here's what actually happens when a small business owner tries to dispute a false or malicious review:
- You report the review through the platform's system
- An automated algorithm scans the content in seconds
- A bot sends you a generic response: "This review doesn't violate our policies"
- No human being ever looks at your case
That's right—there is no human review process. Your business's reputation, your livelihood, your years of hard work—all judged by an algorithm in milliseconds. The bot can't understand context. It can't recognize obvious lies. It can't detect coordinated attacks. It simply applies crude keyword filters and moves on.
When you appeal? Another bot reviews it. Same result.
THE ECONOMIC REALITY
Why do platforms rely exclusively on bots? Because Section 230 means they face zero consequences for getting it wrong.
- Hiring human moderators costs money - money these big tech review platforms plainly have
- Training staff to evaluate complex disputes takes resources
- Creating real accountability systems requires investment
But when the law gives you complete immunity, why bother? These platforms have done the math: It's cheaper to ignore small businesses than to hire the staff needed for proper moderation.
Google, Facebook, and Yelp make billions in profit annually. They could easily afford teams of human moderators to review flagged content. But without legal incentive or oversight, they simply won't invest in the human resources needed to protect small businesses from false and defamatory attacks.
The message is clear: Your reputation - and more importantly, your livelihood - isn't worth the cost of a human reviewer.
WHAT SECTION 230 DOES
Section 230 (codified at 47 U.S.C. § 230) grants online platforms sweeping protection from liability for user-generated content. In practice, this means:
✗ Platforms can host defamatory lies with impunity
✗ False reviews stay up indefinitely, destroying reputations and livelihoods of small business owners
✗ Business owners have virtually no legal recourse
✗ Platforms prioritize profit over truth and accountability
WHAT WE'RE DEMANDING
We call on Congress and state legislators to:
- Reform or repeal Section 230 to hold platforms accountable for knowingly hosting false, fake, or defamatory content
- Require human moderation review for reported content, not just automated systems that rubber-stamp everything
- Create liability for platforms that fail to remove demonstrably false content after proper notification
- Establish a reasonable appeals process for small businesses to challenge false reviews, with appeals reviewed by human moderators rather than bots
- Ensure platforms with massive resources (like Google and Yelp) meet higher standards of care than small platforms
WHY THIS MATTERS
When Section 230 was written in 1996, the internet was new. Platforms were small. The law made sense then.
But today? The internet is dominated by some of the world's richest corporations. They can afford proper moderation—they just choose not to because the law doesn't require it.
Meanwhile, small businesses—the backbone of our economy—are left defenseless against anonymous attacks that can destroy years of hard work overnight.
THE BIPARTISAN SOLUTION
Politicians from both parties agree: Section 230 needs reform. This isn't about left or right—it's about fairness, accountability, and protecting small businesses from corporate negligence.
SIGN THIS PETITION IF YOU BELIEVE:
✓ Tech giants should be held accountable for the content they profit from
✓ Small businesses deserve protection from false and defamatory attacks
✓ Platforms should invest in real human-led moderation, not just bots
✓ Truth and reputation matter
Together, we can demand change. Sign now and share this petition with every small business owner you know.
Petition created on October 9, 2025