YouTube to start vetting content from popular channels
Right now, a bunch of YouTube creators are probably casting an eye in the direction of Logan Paul and saying, “thanks a heap, Paul!”
This is because, in the wake of Paul’s actions at the start of this year – which made him about as popular as a fart in a lift, and got him booted from Google Preferred, the company’s premium advertising programme – YouTube looks set to start vetting content from its more popular creators.
According to a report from Bloomberg, Google has told investors it plans to hire around 10,000 human moderators to help its AI flag content deemed inappropriate for adverts.
“We built Google Preferred to help our customers easily reach YouTube’s most passionate audiences and we’ve seen strong traction in the last year with a record number of brands,” a spokesperson for Google parent Alphabet said.
“As we said recently, we are discussing and seeking feedback from our brand partners on ways to offer them even more assurances for what they buy in the Upfronts [marketing spots that partners buy ahead of time].”
To be frank, this move is long overdue. Leaving aside shenanigans from the likes of Paul and YouTube’s most popular creator, Felix “PewDiePie” Kjellberg – who, incidentally, has been using the fallout from Paul’s controversy to drive traffic to his own channel – several companies pulled their ads last year after finding them placed on videos promoting extremist right-wing content, as well as on videos tagged as kid-friendly that turned out to be rather gruesome.
Who knew? Humans might be more discerning than algorithms. Or Paul’s audience, for that matter.