Protect Kids From Adult Content Loopholes
The Issue
Platforms Must Close the Gaps Now
Kids are being exposed to adult-content promotion long before they understand what they’re seeing — and long before their parents even realize how often it appears in their feeds.
Research shows:
Roughly 1 in 5 minors report being shown sexual content online they did not search for.
Studies of TikTok and Instagram reveal that teens can encounter sexualized content within minutes, even with “restricted mode” enabled.
According to ofstats.net, the top 1% of adult-content creators allegedly earn nearly one-third of all revenue, creating an ecosystem where aggressive, algorithm-optimized promotion is the norm — often spilling into youth spaces.
Creators promoting platforms like OnlyFans are increasingly using sophisticated, hard-to-detect tactics to bypass content rules:
Creator loopholes that reach minors:
Coded language (“spicy content,” emojis, euphemisms) that signals adult services to teens but looks harmless to adults.
SFW teasers that comply with nudity rules but still function as recruitment funnels.
Link obfuscation, including multi-step redirects, “blank” link-in-bio pages, and buried URLs.
Algorithm hacking, where creators use trending audio, posting timing, or engagement bait to stay on teen-heavy feeds.
Shadow promotion, where explicit content is hinted at but not shown, intentionally bypassing moderation systems.
These tactics are working. Platform moderation systems were not designed for this level of sophistication — or for creators who test algorithmic boundaries as part of their business model.
Lawmakers have acknowledged this gap through Bill C-63 (the Online Harms Act), which introduces new duties of care, transparency requirements, and protections for minors.
But until Bill C-63 advances — and until platforms close the loopholes creators actively exploit — children will remain exposed to content no parent would knowingly allow.
We are calling for immediate action:
Stricter enforcement of adult-content promotion rules across all mainstream platforms
Automatic age-gating for accounts linking directly or indirectly to adult services
Detection of coded language, euphemisms, and link-evading tactics used by adult creators
Transparency tools for parents, including slang explainers, content summaries, and algorithmic insights
Advancement and strengthening of Bill C-63, ensuring it includes meaningful protections related to adult-content promotion and exposure of minors
Child protection can’t rely on creators policing themselves or platforms catching up after the fact.
Sign this petition to demand proactive, systemic safeguards for kids online.
Petition created on December 1, 2025