No More Fake News Act
The Issue
Summary of the Petition: “No More Fake News Act”
Every day, Americans are bombarded with false or misleading information from partisan media, viral social media accounts, and unchecked political statements. This flood of misinformation undermines our democracy, divides our communities, and puts people’s safety at risk.
It’s time to demand accountability and transparency in the media. That’s why we support the No More Fake News Act, proposed legislation that would restore trust by holding media outlets, social media platforms, and influential accounts to basic standards of honesty and disclosure.
What the Act Does
Transparency
Media outlets must disclose who owns and funds them.
Influencers and platforms must label sponsored content and AI-generated media.
Foreign-funded outlets must openly declare their ties.
Accountability
Repeated spread of false information requires clear corrections and retractions.
Platforms must notify users when their viral content qualifies for these rules — so individuals aren’t punished without warning.
Politicians can still speak freely, but when their false claims are broadcast or go viral, outlets and platforms must attach corrections so the public gets the truth.
Satire, parody, and political opinion remain fully protected.
Small creators won’t be punished for first mistakes — they’ll get education and support instead.
All major enforcement actions require bipartisan approval to prevent political abuse.
The Act invests in media literacy programs so citizens can recognize misinformation themselves.
Why This Matters
False news stories reach millions of Americans within hours, while corrections rarely catch up.
Politicians and pundits are free to speak, but the public deserves corrections when lies are spread as fact.
Social media companies profit from amplifying outrage, but leave users with the blame. This bill makes platforms responsible for stopping viral falsehoods.
Call to Action
We, the undersigned, call on Congress to pass the No More Fake News Act immediately. This Act protects free speech while ensuring the American people get truth, transparency, and accountability from the media and platforms we rely on every day.
Sign this petition if you believe the truth matters and it’s time to put an end to fake news.
The No More Fake News Act balances accountability with free speech, requiring truth in media without silencing opinions.
Here is the full text of the Act:
Media Transparency & Accountability Act
Restoring Media Accountability and Transparency
SECTION 1. TITLE.
This Act may be cited as the “No More Fake News Act”.
SECTION 2. FINDINGS AND PURPOSE.
(a) Findings.
1. The spread of false or misleading information by private and partisan media outlets, social media platforms, and online accounts undermines democratic institutions, public trust, and individual safety.
2. Transparency in ownership, funding, content labeling, and algorithmic amplification is necessary for citizens to evaluate credibility.
3. Social media platforms enable individuals and small accounts to achieve sudden, large-scale influence, making accountability essential regardless of follower count.
4. Citizens require media literacy education to critically evaluate information in an era of digital and algorithm-driven content.
(b) Purpose: The purpose of this Act is to:
1. Strengthen accountability for private media and online platforms.
2. Require disclosure of funding, ownership, and content standards.
3. Impose remedies for repeated or deliberate dissemination of disinformation.
4. Ensure transparency in social media amplification and engagement.
5. Support educational programs to improve media literacy.
6. Preserve constitutional protections of free speech and the free press.
SECTION 3. DEFINITIONS.
For purposes of this Act:
1. “Misinformation” means demonstrably false or misleading content presented as fact, regardless of intent.
2. “Disinformation” means demonstrably false or misleading content disseminated with knowledge of its falsity and intent to deceive.
3. “Media outlet” means any entity regularly engaged in publishing or broadcasting news or commentary to the public, including but not limited to:
• Television networks and cable channels,
• Radio broadcasters,
• Print newspapers and magazines,
• Digital and online news publishers,
• Hybrid or multimedia platforms that provide news or commentary across multiple formats.
4. “Social media platform” means any online service with more than 1 million active monthly U.S. users that permits user-generated content and algorithmic promotion.
5. “Influential Account” means any individual or entity account that meets the thresholds in Section 7.
6. “Independent Oversight Board” means the body established under Section 8.
SECTION 4. SAFEGUARDS AND SPECIAL PROVISIONS.
(a) First Amendment Protection. Enforcement under this Act shall be narrowly tailored and may not extend to lawful expression of opinion, political criticism, or satire.
(b) Oversight Integrity.
1. Oversight Board appointments shall be bipartisan and staggered.
2. Board proceedings must be public and decisions accompanied by written justification.
(c) Foreign Influence Disclosure. Any media outlet, platform, or Influential Account receiving foreign funding or direction must disclose such ties publicly.
(d) Platform Compliance.
1. Platforms must undergo independent third-party audits to verify compliance.
2. Employees of platforms shall have whistleblower protections if exposing noncompliance.
(e) Small Creators. First violations by small accounts triggered under Section 7 shall result in education and compliance assistance, not penalties, unless repeated or intentional.
(f) Threshold Adjustments. The Oversight Board may recommend adjustments to impression and follower thresholds every 3 years to reflect evolving platform dynamics, subject to approval.
(g) Funding Transparency. Concealing sponsorships, shell company financing, or PAC-backed funding in media or influencer content shall constitute a violation of this Act.
(h) Bipartisan Safeguard. Major enforcement actions, including fines exceeding $10 million, shall require a supermajority vote of the Oversight Board.
SECTION 5. DISCLOSURE REQUIREMENTS.
(a) Ownership Transparency.
Media outlets and platforms must disclose:
1. Ownership structures and controlling interests.
2. Major financial backers, including foreign entities.
(b) Content Labeling.
1. All opinion or commentary must be clearly labeled and distinct from factual reporting.
2. AI-generated, altered, or synthetic content must be clearly disclosed.
(c) Algorithmic Transparency.
Social media platforms shall provide public reporting on algorithmic promotion and trending content, including explanations for amplification of high-engagement posts.
SECTION 6. ACCOUNTABILITY AND CORRECTIONS.
(a) Oversight: The Independent Oversight Board shall review cases of repeated dissemination of misinformation or disinformation.
(b) Remedies: Upon finding violations, the Board may require:
1. Prominent corrections or retractions in the same format as the original content.
2. Monetary fines proportionate to audience reach and monetization.
3. Temporary suspension of algorithmic amplification for repeat offenders.
(c) Civil Remedies. Individuals demonstrably harmed by disinformation may bring private action for damages.
SECTION 7. INFLUENTIAL ACCOUNTS AND VIRAL CONTENT.
(a) Influential Accounts.
Any account with 100,000 or more followers, or an average monthly reach exceeding 1 million impressions, shall be designated an Influential Account under this Act.
(b) Viral Content Trigger:
Regardless of follower count, any single post, video, or other content that achieves
1. more than 1 million views, shares, or engagements within a 30-day period, or
2. demonstrable impact on public safety, elections, or financial markets,
shall be subject to the same obligations as Influential Accounts.
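Read as a decision rule, the thresholds in subsections (a) and (b) can be sketched as follows. This is a hypothetical illustration only; the constant names and function signatures are not part of the Act, and the "demonstrable impact" prong of subsection (b)(2) is modeled as a simple boolean rather than any defined legal test.

```python
# Illustrative sketch of the Section 7(a)-(b) designation thresholds.
# Names and signatures are hypothetical, not drawn from the Act itself.

FOLLOWER_THRESHOLD = 100_000              # Sec. 7(a): "100,000 or more followers"
MONTHLY_IMPRESSION_THRESHOLD = 1_000_000  # Sec. 7(a): reach "exceeding 1 million impressions"
VIRAL_ENGAGEMENT_THRESHOLD = 1_000_000    # Sec. 7(b)(1): "more than 1 million" in 30 days

def is_influential_account(followers: int, avg_monthly_impressions: int) -> bool:
    """Sec. 7(a): meeting either threshold designates an Influential Account."""
    return (followers >= FOLLOWER_THRESHOLD
            or avg_monthly_impressions > MONTHLY_IMPRESSION_THRESHOLD)

def triggers_viral_obligations(engagements_30d: int, demonstrable_impact: bool) -> bool:
    """Sec. 7(b): a single piece of content qualifies regardless of follower count."""
    return engagements_30d > VIRAL_ENGAGEMENT_THRESHOLD or demonstrable_impact
```

Note that the follower threshold is inclusive ("100,000 or more") while the impression and engagement thresholds are exclusive ("exceeding" / "more than"), a distinction the sketch preserves.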
(c) Obligations:
Once designated under subsection (a) or (b), account holders must:
1. Clearly label factual reporting versus commentary.
2. Disclose sponsorships, funding sources, and material connections.
3. Identify AI-generated, altered, or manipulated content.
4. Issue corrections or retractions for demonstrably false factual claims, in the same format and prominence as the original content.
(d) Enforcement:
1. Social media platforms must notify users when their content qualifies under subsection (b).
2. The Independent Oversight Board may initiate review for violations of this section.
(e) Platform Responsibility for Compliance.
1. When a post, video, or other content meets the thresholds outlined in subsection (b), the hosting social media platform shall:
• Provide written notification to the account holder that their content has been designated as Viral Content under this Act;
• Provide clear, accessible guidance regarding disclosure and correction obligations required by this Act;
• Allow a reasonable compliance period of not less than 15 days before penalties may be imposed.
2. No account holder shall be subject to civil penalties under this Act for Viral Content unless they have been provided with notification and a compliance period under paragraph (1).
3. Platforms that fail to notify account holders of their obligations shall bear liability for any resulting violations.
(f) Free Speech Protection.
Satire, parody, and personal opinion not presented as factual reporting are exempt from obligations in this section.
SECTION 8. INDEPENDENT OVERSIGHT BOARD.
(a) Establishment.— There is established an Independent Oversight Board composed of 12 members appointed equally by the President, the Speaker of the House, and the Senate Majority Leader.
(b) Membership.— Members shall include experts in law, journalism, technology, and civil liberties.
(c) Duties.— The Board shall:
1. Publish annual reports on cases, enforcement actions, and outcomes.
2. Establish clear, transparent standards for misinformation review.
3. Make proceedings publicly accessible.
SECTION 9. MEDIA LITERACY PROGRAMS.
(a) Grants. The Secretary of Education shall administer grants to states for inclusion of media literacy curricula in secondary and higher education.
(b) Public Campaigns. The Federal Communications Commission shall coordinate public service campaigns to educate citizens on identifying misinformation and disinformation.
SECTION 10. ENFORCEMENT AND PENALTIES.
(a) Civil Penalties. Media outlets, platforms, or Influential Accounts found liable for repeated dissemination of disinformation may be fined up to 5 percent of annual revenue or equivalent earnings.
(b) Judicial Review. All enforcement decisions are subject to review in U.S. district courts.
False Political Statements and Corrections
SECTION 11. FALSE POLITICAL STATEMENTS AND CORRECTIONS.
(a) Scope. This section applies when an elected official, candidate for public office, or government representative makes a statement that is demonstrably false when presented as fact, and such statement is disseminated through a media outlet or social media platform.
(b) Media Outlet Obligations.
1. Where a false statement under subsection (a) is broadcast or published as factual reporting by a media outlet, the outlet shall issue a correction or clarification within 48 hours of verification, with equal or greater prominence than the original dissemination.
2. The correction shall clearly identify the false statement and provide accurate, verified information.
(c) Social Media Platform Obligations.
1. When such a statement is disseminated on a social media platform and verified as false by independent, nonpartisan fact-checking sources:
(A) The platform shall affix a context label to the content indicating the statement is disputed, with a link to verified information;
(B) The original statement shall remain visible to preserve free political expression;
(C) The statement shall not be algorithmically promoted as factual reporting until corrective information is attached.
2. If the statement exceeds the viral threshold set forth in Section 7(b), the platform shall also display a side-by-side correction card within feeds where the content appears, citing data or findings from independent, authoritative sources.
(d) User Protections.
1. Individual users who share or comment on such content shall not be penalized under this Act unless they knowingly monetize or materially alter the content to misrepresent corrections.
2. The compliance burden rests with the media outlet or platform, not the speaker or user.
(e) Transparency.
Media outlets and platforms subject to this section shall publish quarterly reports detailing:
1. The number of false political statements corrected or labeled;
2. The corrective actions taken; and
3. Appeals submitted and their outcomes.
SECTION 12. PROTECTION OF FREE SPEECH.
(a) Limitation. Nothing in this Act shall be construed to authorize censorship of lawful speech, parody, satire, or political opinion.
(b) Safeguard. Enforcement applies only to demonstrably false factual claims published with reckless disregard or deliberate intent to mislead.
SECTION 13. REPORTING REQUIREMENTS.
The Independent Oversight Board shall issue an annual public report including:
1. Number of misinformation cases reviewed.
2. Remedies applied and penalties imposed.
3. Trends in algorithmic amplification and viral misinformation.
SECTION 14. EFFECTIVE DATE.
This Act shall take effect 18 months after enactment.
Petition created on September 14, 2025