Stop YouTube's Algorithm from Suggesting Politically Charged Content to Young Users

The Issue

I found myself, along with many others, a victim of YouTube's seemingly harmless yet profoundly impactful recommendation algorithm. All it takes is a single video, often not even overtly political, for the cycle to start. The videos the algorithm then recommends echo only one corner of the political compass, exposing viewers to a deluge of bias and misinformation. The misrepresentation of facts, the targeted hostility toward certain groups, the manipulation - all of it shaped my political views. Those views did not align with the ones I later developed from a balanced, unbiased perspective. I was fortunate enough to escape these rabbit holes, but it's critical to acknowledge the truth: not everyone is as lucky.



Algorithms like YouTube's that suggest politically charged content to underage viewers create a feedback loop of radicalization. Young minds are particularly susceptible to such manipulation, which can lead them to develop hateful views of certain groups and even drift toward extremism. This is not an uncommon problem: across the internet there are examples of young people spreading hate and misinformation that was most likely fed to them by political content creators.

We urge YouTube to revise the algorithms that recommend politically charged content to underage viewers. This is an immediate step in the right direction to halt digital radicalization and provide our youth with an unbiased information platform. Let's not allow their susceptibility to be used against them; instead, let's protect their right to balanced information. Sign the petition and help us get YouTube to make this vital change.

