In a move that could be described as questionable, YouTube has announced a program that will allow users to moderate creators' content.
The program is called YouTube Heroes and it will allow users – known as Heroes – to do useful things such as add captions and subtitles to videos, share knowledge with other users on the YouTube forums and flag inappropriate content.
After joining the program, Heroes can earn points by flagging content (1 point), sharing their knowledge (up to 10 points if it's the “Best Answer”) and adding captions to videos (1 point).
As a Hero accumulates points they unlock perks, which grant them access to exclusive workshops, the ability to mass-flag content and a direct line to YouTube staff.
Of course not everybody that applies to be a Hero will become one. Users will need to have a valid YouTube channel and be at least 18 years old to qualify for the program.
Is this the best YouTube can do?
It’s an open secret that YouTube’s system is broken.
The first issue that springs to mind is the Content ID system. Content ID has long been used by big firms to wrest advertising revenue from creators who use their products in fair use in videos.
More recently it was revealed that YouTube had been silently demonetising videos on some channels without giving creators a way to challenge an algorithm’s rating of a video.
Creators are now able to challenge demonetisation strikes, which is nice, but creators won’t soon forget how they lost money without even knowing they had done something wrong.
The overwhelming opinion among creators is that YouTube needs real people running the important things, like flagging and demonetising content and, most importantly, policing Content ID.
YouTube Heroes seems to address some of those problems by putting real people into the mix. The trouble is that users often abuse such systems simply because they can.
For instance, a Hero who has never seen Ozzy Man Reviews might flag that content as inappropriate because the creator has a foul mouth, not understanding that Ozzy’s content caters to a specific audience rather than a general one. An even worse example would be a ‘Tuber having their content flagged as inappropriate en masse as part of a witch-hunt.
YouTube does have terms and conditions in place to monitor abuse of the system, but then the question becomes: who watches the Watchmen, or in this case, who moderates the moderators?
If you think users moderating content is a good move, we implore you to cast an eye over reddit.
While things are calm and civil for the most part, the site is not perfect and flame-wars often erupt. On reddit those flare-ups don’t cost anyone money; on YouTube they could.
In the interest of fairness, at least YouTube is doing something about the tens of thousands of accounts that violate its community guidelines, but whether this user-centric approach to moderation will work remains to be seen.

[Source – YouTube]