Prevent anonymous non age-restricted YouTube flagging
TV has long been scorned as a brainwashing medium, pouring out a controlled supply of censored content with shallow diversity.
By contrast, YouTube has evolved into a socially significant resource for self-expression and entertainment, and is also becoming a preferred way to learn and study, among other things. What sets YouTube apart from old-school television is choice, freedom and diversity.
No wonder so many users upload hours upon hours of new content daily; YouTube's in-house team is clearly too overwhelmed by the volume of content to provide any human filtering beyond automated algorithms.
This is where the YouTube community steps in. Anyone watching a YouTube video has the option to flag and report it if they find it inappropriate.
YouTube will immediately take action by limiting or taking down the video and issuing a warning (strike) to the channel that posted it. Each strike limits channel functions: one strike strips the channel's ability to live stream, two strikes disable the ability to upload new content, and three strikes take down the entire channel with all of its videos.
The strikes are issued automatically, without any human intervention from YouTube staff. Despite what YouTube may state, no human personnel reviews the flagged video: YouTube is an automated system, and it takes down videos by any creator regardless of subscriber count, views, upload count or channel age. This is done just in case, as if sacrificing a creator to deflect heat from YouTube as a company were acceptable collateral damage. Such an attitude is deeply demotivating to core YouTube creators who literally live to make new content, only to have it flagged and removed because a generic soccer mom finds it inappropriate, a viewer disagrees with the opinion presented, or someone simply doesn't like the YouTuber.
Moreover, the practice is abused: anyone can report a YouTube video, with a high chance of taking down the entire channel, and do so simply because it is so easy. There is no penalty for false accusations, and the reports are completely anonymous. The practice is also heavily abused by competing channels in the same niche to gain an advantage over the disabled channel.
While these reports may be useful for taking down one-time uploaders of genuinely inappropriate content (pornography, graphic violence, etc.), in reality honest YouTube creators who comply with the YouTube Guidelines suffer the most, as flagging is mostly done for bogus reasons by mean-spirited people who find it fun and entertaining to make others suffer, especially when it is anonymous and carries no repercussions. And yet most claims are satisfied automatically, putting penalties on innocent creators. Sometimes the flagging happens by accident, especially by minors who tend to click all over the interface at random. Since no hard evidence of a Guidelines violation is required to flag a video, reports contain no specifics about which aspect of the video was deemed inappropriate, and YouTube is a notoriously silent company that never explains the violation or anything else whatsoever.
Strikes take at least three months to expire, a ridiculously long time for a fast-paced place like YouTube. Creators can dispute a claim by filing an appeal, which goes to review by human personnel, but YouTube Creator Support is a small team overwhelmed by hours upon hours of new video content each day, and it takes weeks or even months for an appeal to go through. Sometimes strikes expire on their own before the appeal is reviewed. Three months without new content sends even the most popular channel into oblivion, and small-to-medium channels suffer the most.
This puts YouTube creators, the core of YouTube, in a particularly disadvantaged position: there is nothing they can do to protect their channels from malicious viewers who abuse the YouTube Community Guidelines flagging system to take down videos that do not violate the Community Guidelines in the first place. They cannot fight back or bring the offender to justice, since the reports are anonymous.
So while the inappropriate-video reporting option is intended to make YouTube better, in fact it is making it worse and hurting YouTube users on a personal level. It hurts the entire YouTube community, both creators and their audiences, because taking down videos and entire channels makes video creation and viewing impossible, and YouTube is, after all, an online video service.
Here's where YouTube should step in and amend the Community Guidelines flagging system to make accountability, including legal recourse, possible, i.e.:
1) Disable anonymous flagging: the user must complete verification, including real name and address, before filing a report.
2) Prevent minors from flagging: the individual must be of legal age to file a report.
3) Grant at least some immunity to established YouTube creators, i.e. channels with a certain number of subscribers (e.g. 10,000), views (e.g. 100,000 a month), age (e.g. at least a year), activity, etc.: flagged videos stay online and no channel functions are disabled until the dispute is reviewed by a human YouTube employee and results in a channel strike. In addition, a video must be flagged multiple times (e.g. by at least 100 users) for the report to count as genuine.
Today: Izzy is counting on you
Izzy Laif needs your help with “Google, Inc: Prevent anonymous non age-restricted YouTube flagging”. Join Izzy and 1,633 supporters today.