Stop YouTube from Taking Down YouTubers with 'Kid-Friendly Content'

Katie Mealey started this petition to YouTube.

YouTube is about to embark on its most significant changes to kids videos yet. Google's massive online video site is re-engineering how it treats clips for children following last week's record $170 million penalty for violating kids' data privacy: YouTube pledged to disable comments, notifications and personalized ads on all videos directed at children. And its machine learning will police YouTube's sprawling catalog to keep kids videos in line, the company said. 

One problem: YouTube's machine learning was already supposed to be suspending comments on videos featuring young minors. It hasn't.

Comment-enabled videos prominently depicting young kids are still easy to find on YouTube. A single YouTube search for one kids-focused subject -- "pretend play" -- returned more than 100 videos with comments enabled, all prominently featuring infants, preschoolers and other children young enough to still have their baby teeth. 

After CNET contacted YouTube with a list of these videos, comments were disabled on nearly half of them. 

"We invest significantly in the teams and technologies that allow us to provide minors and families the best protection possible," YouTube spokeswoman Ivy Choi said. "We've suspended comments on hundreds of millions of videos featuring minors in risky situations and implemented a classifier that helps us remove two times the number of violative comments. We continue to disable comments on hundreds of thousands of videos a day and improve our classifiers."

YouTube is the world's biggest online video source, with 2 billion monthly users -- so big, in fact, it's the world's top source for kids videos too. Kids content is one of its most-watched categories, but YouTube has come under fire for a range of scandals involving children. That $170 million penalty addressed the data YouTube collects on kids without parents' consent. But YouTube has also faced scandals involving videos of child abuse and exploitation and nightmarish content in its YouTube Kids app, pitched as a kid-safe zone. 

YouTube's difficulty managing children's content is only one problem in a parade that the Google-owned site has faced in the last few years, including claims it allows hate speech to proliferate, spreads conspiracy theories and discriminates against some creators. Google's one of the big tech companies facing increasing questions about the power it wields, with the Justice Department kicking off an antitrust investigation into big tech.

In February, YouTube said it would disable comments on videos with young kids following an outcry over a softcore pedophile ring operating through comments on such videos. Some videos featuring young children included comments with predatory links. Clicking on the links would transport viewers to other moments in YouTube videos with a minor in a sexually suggestive position. And once viewers fell down that rabbit hole, YouTube's recommendation algorithm appeared to feed them more of the same.

So YouTube said it would suspend comments on videos featuring minors who were 13 and younger, as well as on videos featuring older minors who could be at risk of attracting predatory behavior. The changes would take place "over the next few months," YouTube said then. YouTube would make an exception for "a small number of channels that actively moderate their comments and take additional steps to protect children," the company said at the time.

"We announced some really significant changes, one of which is that we are no longer going to allow comments on videos that are featuring young minors anymore and older minors that are engaged in risky behavior," YouTube CEO Susan Wojcicki said on stage the day after the policy was announced. 

It was a move that she expected would anger innocent young creators and parents who rely on comments for genuine feedback, she said. But "that was a tradeoff that we made because we felt like we wanted to make sure that protecting children was our No. 1 priority," she added.

The scale of YouTube, where 500 hours of video are uploaded every minute, adds to the challenge. Even if the baseline success rate of YouTube's machine learning is high, the number of videos it fails to catch will still be significant.
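To put rough numbers on that, here is a back-of-the-envelope sketch. Only the 500-hours-per-minute upload rate comes from the article; the 10-minute average video length and the 99 percent catch rate are assumptions made purely for illustration.

```python
# Back-of-the-envelope: how many videos could slip past an automated
# classifier at YouTube's scale? Only the upload rate (500 hours of video
# per minute) comes from the article; the average video length and the
# catch rate below are assumptions for illustration.

HOURS_UPLOADED_PER_REAL_MINUTE = 500   # figure cited in the article
ASSUMED_AVG_VIDEO_MINUTES = 10         # assumption
ASSUMED_CATCH_RATE = 0.99              # assumption: 99% of problem videos are flagged

hours_of_video_per_day = HOURS_UPLOADED_PER_REAL_MINUTE * 60 * 24
videos_per_day = hours_of_video_per_day * 60 / ASSUMED_AVG_VIDEO_MINUTES
missed_per_day = videos_per_day * (1 - ASSUMED_CATCH_RATE)

print(f"Uploads per day (at a {ASSUMED_AVG_VIDEO_MINUTES}-minute average): {videos_per_day:,.0f}")
print(f"Missed per day at a {ASSUMED_CATCH_RATE:.0%} catch rate: {missed_per_day:,.0f}")
# Under these assumptions: roughly 4.3 million uploads and about 43,000 misses a day.
```

That is the "more haystacks" problem Shelton describes below: a tiny miss rate multiplied by an enormous upload volume still leaves a lot of uncaught video.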

"You may still only have one needle in a haystack. But add more and more haystacks, and it'll be easier for someone somewhere to find it," said Christian Shelton, a professor of computer science at the University of California, Riverside. "The technology will never be perfect. No other solution would also be perfect, but you shouldn't let the technology off the hook."

Google and YouTube's scale works in its favor, in some respects. Algorithms need data to learn, and YouTube has more video and data about it than anyone else. Machine learning for video, which essentially looks at videos as collections of still frames, also requires a level of computational power that's more feasible for a company with Google's resources. 
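As a rough illustration of that frames-based approach, here is a minimal sketch that samples frames from a video file with OpenCV and scores each one with a placeholder per-frame classifier. The file name, the sampling interval and the classify_frame function are all hypothetical; this is not YouTube's pipeline.

```python
# Minimal sketch: treat a video as a collection of still frames and score
# each sampled frame. classify_frame() is a placeholder; a real system
# would run a trained image classifier (and far more machinery) here.
import cv2  # pip install opencv-python


def classify_frame(frame) -> float:
    """Placeholder: return a score for how likely this frame is to
    prominently feature a young child."""
    return 0.0


def scan_video(path: str, sample_every_n: int = 30) -> float:
    cap = cv2.VideoCapture(path)
    scores = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:                       # end of file (or unreadable file)
            break
        if index % sample_every_n == 0:  # ~1 frame per second for 30 fps video
            scores.append(classify_frame(frame))
        index += 1
    cap.release()
    return max(scores, default=0.0)      # judge the video by its most-flagged frame


if __name__ == "__main__":
    print(scan_video("example_upload.mp4"))  # hypothetical file name
```

Even this toy version hints at the computational cost: a ten-minute clip at 30 frames per second is 18,000 frames, and a real classifier has to process some sample of them for every one of the millions of daily uploads.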

And one of YouTube's policy changes announced last week could help its machine learning improve. As part of its settlement with the FTC, YouTube will require uploaders to identify videos that are "made for kids," it said, effectively introducing more labels on its data. 

Algorithms need annotations like these to learn, and the more content that's being processed, the more necessary the annotations become, according to Arnav Jhala, a computer science professor at North Carolina State University. Algorithms find patterns and correlations between the labels and the visible features in the frames.

"The more labels they have, the higher correlation they will have, and on unlabeled video, the algorithms will have a higher accuracy," he said. "But you are dealing with almost an adversary on the other side."

That is, some uploaders have motives to misidentify their videos. 

Trolls, for example, could mislabel inappropriate content as kids videos, aiming to sneak sensitive images in front of children's eyes. YouTube has previously seen instances of kids videos with self-harm tips spliced into them. A trend of supposedly "child-friendly" YouTube videos that showed familiar kids characters engaged in bizarre, violent or disturbing behavior earned its own moniker: Elsagate. Mislabeled data would pollute the information used to train a machine-learning algorithm.
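To make the labeling discussion concrete, here is a small sketch on synthetic data, not YouTube's system: a logistic-regression classifier is trained once on clean "made for kids" labels and once after a pretend troll campaign has relabeled 40 percent of the non-kids training videos as kid-friendly. The features, class separation and mislabeling rate are all invented for illustration.

```python
# Sketch: a classifier learns the correlation between "frame features" and a
# made-for-kids label, and adversarially mislabeled uploads pollute what it
# learns. All numbers here (features, separation, 40% flip rate) are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)


def make_videos(n):
    """Synthetic 'videos': label 1 = made for kids, 0 = not. Each class
    clusters around a different region of a 4-D feature space."""
    labels = rng.integers(0, 2, size=n)
    features = rng.normal(loc=labels[:, None] * 1.0, scale=1.0, size=(n, 4))
    return features, labels


X_train, y_train = make_videos(20_000)
X_test, y_test = make_videos(5_000)


def false_kids_rate(model):
    """Share of genuinely non-kids test videos the model tags as made for kids."""
    return model.predict(X_test[y_test == 0]).mean()


# 1) Clean labels: the model picks up the true label/feature correlation.
clean = LogisticRegression().fit(X_train, y_train)
print("clean labels    | accuracy:", round(clean.score(X_test, y_test), 3),
      "| non-kids mis-tagged as kids:", round(false_kids_rate(clean), 3))

# 2) Polluted labels: pretend trolls relabel 40% of the non-kids training
#    videos as "made for kids" before training.
y_polluted = y_train.copy()
not_kids = np.where(y_train == 0)[0]
flipped = rng.choice(not_kids, size=int(0.4 * len(not_kids)), replace=False)
y_polluted[flipped] = 1

polluted = LogisticRegression().fit(X_train, y_polluted)
print("polluted labels | accuracy:", round(polluted.score(X_test, y_test), 3),
      "| non-kids mis-tagged as kids:", round(false_kids_rate(polluted), 3))
# Expect the polluted model to wrongly tag noticeably more non-kids videos as
# made for kids, even though the footage itself hasn't changed.
```

In this toy setup, the polluted model ends up waving far more non-kids footage through as "made for kids," which is why an adversary who controls the labels is such a problem for a system that leans on them.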

Beyond that, video that targets kid audiences is "pretty freaking vague" as a directive to give an algorithm, Wilson said. 

 

We don't want this change to happen! Otherwise, we YouTubers could easily get banned and demonetized.
