Regulate social media algorithms to protect Canadian children


The Issue
Social media is being designed to keep children hooked, and Canada is failing to protect them!
As a parent, I am deeply concerned about the amount of time youth spend on social media and the impact it has on their mental and emotional well-being. Many young people struggle to disengage from platforms that are deliberately designed to capture and hold their attention, often at the expense of their health, sleep, and self-esteem.
Smartphones are no longer just communication tools; they are constant gateways to algorithm-driven platforms that can expose Canadian children and youth to harmful or developmentally inappropriate content. According to MediaSmarts, most children in Canada own a smartphone by age 12, placing them in digital environments that were not designed with their developmental needs in mind.
Social media platforms rely on sophisticated algorithms optimized to maximize engagement and time spent on the platform, using features such as infinite scroll and personalized recommendation systems.
Research shows that this kind of engagement-prolonging design exploits principles of human psychology to keep users scrolling longer, often triggering compulsive use that adults and youth alike struggle to control.
For users under 18, this design can increase exposure to content that negatively affects mental health, including material that fuels anxiety, depression, and poor self-image.
Rising rates of anxiety, depression, and low self-esteem among Canadian youth are placing growing pressure on families, schools, healthcare services, and social supports. While social media is not the sole cause, its role in amplifying these harms cannot be ignored.
Canada needs clear, enforceable regulations that address how social media platforms design and deploy their algorithms for children and youth under 18. This should include greater transparency around how content is prioritized, limits on engagement-driven design features, and strong default protections such as meaningful parental controls and built-in time limits for under-18 users, enabled by default and designed to reduce excessive use rather than encourage it.
Our government has a responsibility to protect young people from preventable harm. Regulation must ensure that social media companies prioritize the well-being of children and youth over profit and are held accountable for the impacts of their design choices.
By signing this petition, you are calling on the Canadian government to take concrete action to regulate social media platforms in ways that protect children and youth under 18.
What We Are Asking For
We are calling on the Government of Canada to introduce federal legislation establishing age-appropriate design standards for social media platforms used by children and youth under 18.
Specifically, we are asking for:
1. Age-Appropriate Design Standards
Platforms should be required to design youth accounts with safety and well-being as the default. This includes the highest privacy settings, limits on data collection, and protections that reflect the developmental needs of minors.
2. Transparency and Independent Oversight of Algorithms
Social media companies should be required to disclose how their recommendation systems prioritize content for under-18 users and submit to independent audits assessing the risks those systems pose to youth mental health and well-being.
3. Restrictions on Manipulative Engagement Design for Minors
Features designed to maximize time spent, such as autoplay, infinite scroll, streak mechanics, and persistent behavioural nudges, should be limited or disabled by default for youth accounts.
4. Default Digital Well-Being Protections
Platforms should implement built-in safeguards for minors, including time-awareness prompts, limits on late-night notifications, and meaningful parental oversight tools enabled by default.
5. Enforceable Compliance and Accountability
Legislation must include meaningful enforcement mechanisms, including independent regulatory oversight and significant financial penalties for companies that fail to meet youth protection standards. Protections for children must be more than voluntary guidelines; they must be enforceable.
This is not a call to ban social media. It is a call to ensure that platforms serving children and youth are designed responsibly, with health and safety prioritized over engagement metrics and profit.

Petition created on January 22, 2026