
Mozilla’s crowdsourced report about YouTube raises concerns

There is a wealth of information, news and fun to be had on YouTube, but unfortunately there is also misinformation, fake news and disturbing content.

Something that has always troubled us, however, is how the darker content on YouTube spreads, and a new report from Mozilla suggests that the almighty YouTube algorithm is a culprit.

Before we get to those findings, we have to discuss the methodology Mozilla used to source the data its report is based on.

Mozilla created a browser extension called RegretsReporter, which was used by 37 380 volunteers in 190 countries. The extension was available for Firefox and Google Chrome.

“Of our total volunteer contributors, 1 662 submitted at least one report, for a total of 3 362 reports coming from 91 countries, submitted between July 2020 and May 2021. Volunteers who downloaded the extension but did not file a report were still an important part of our study. Their data—for example, how often they use YouTube—was essential to our understanding of how frequent regrettable experiences are on YouTube and how this varies between countries,” Mozilla explains in its report.

Limitations to this study include:

  • “Selection bias: Our volunteers are a particular group of people and our findings may not generalize across all YouTube users.
  • Reporting bias: There may be many factors that affect whether a volunteer reports a particular video.
  • Regret concept: The concept of a YouTube Regret is (intentionally) non-specific, and different volunteers may make reports based on different notions of ‘regret’ than others.
  • The observational nature of the study means that, while we can confidently state ‘what’ is happening, we are not able to confidently infer the ‘why’. For example, we do not know why YouTube chose to recommend any particular video to any particular volunteer.”

So with the methodology out of the way, what did Mozilla find?

“Disparate and disturbing”

In terms of reporting “Regrets”, as Mozilla terms them, there was a “disparate and disturbing” mix of misinformation, fear mongering about COVID-19, hate speech, “inappropriate children’s cartoons” and scams.

What is concerning is that, in 71 percent of reports, these videos were suggested by the YouTube algorithm. In comparison, search results only yielded regrets in 7.47 percent of reports.

“Oh but maybe those folks were already watching questionable content” is a thought we also had.

Thankfully, the RegretsReporter extension wasn't only recording the content that was reported, but also the last five YouTube pages a volunteer viewed beforehand.

“In analyzing these ‘trail’ videos, we noticed that some of these trails were totally unrelated to what the volunteer was watching previously. To interrogate this trend, we had research assistants classify the trail videos that accompanied recommended Regrets to determine whether the recommendations seemed related or not. Among recommendations for which we have data on the trail of videos the volunteer followed, in 43.3 percent of cases, the regretted recommendation was completely unrelated to the previous videos that the volunteer watched,” Mozilla writes in its report.
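To make that classification step concrete, here is a minimal sketch of how the share of "unrelated" recommendations could be tallied once trail videos have been labelled. This is our own illustration with made-up record and field names, not Mozilla's actual pipeline.

```python
# Illustrative sketch only -- hypothetical records, not Mozilla's data or code.
# Each record notes whether research assistants judged a recommended Regret
# related to the trail of videos the volunteer watched before it.

from dataclasses import dataclass

@dataclass
class RecommendedRegret:
    report_id: int
    trail_related: bool  # True if judged related to the volunteer's prior videos

def unrelated_share(reports: list[RecommendedRegret]) -> float:
    """Percentage of recommended Regrets judged unrelated to the preceding videos."""
    if not reports:
        return 0.0
    unrelated = sum(1 for r in reports if not r.trail_related)
    return 100 * unrelated / len(reports)

# Example: 3 of 7 recommended Regrets were unrelated to the trail.
sample = [RecommendedRegret(i, related) for i, related in
          enumerate([True, False, True, False, True, True, False])]
print(f"{unrelated_share(sample):.1f}% unrelated")  # 42.9% unrelated
```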

In one example, a volunteer was watching Art Garfunkel cover an Everly Brothers song with his son, only to have “Trump Debate Moderator EXPOSED as having Deep Democrat Ties, Media Bias Reaches BREAKING Point” recommended to them.

The recommendation problem is particularly pronounced in countries where English is not the primary language.

“The rate of YouTube Regrets is 60 percent higher in countries that do not have English as a primary language,” reports Mozilla.
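For a sense of how such a comparison might be computed, here is a small sketch that normalises reports by viewing activity and compares English-primary countries with the rest. The data, field names and the per-10 000-videos rate are our own assumptions for illustration, not Mozilla's dataset or methodology.

```python
# Illustrative sketch only -- hypothetical volunteer data, not Mozilla's dataset.
# Tuples: (country, is_english_primary, videos_watched, regrets_reported)
volunteers = [
    ("US", True, 1200, 2),
    ("UK", True, 800, 1),
    ("BR", False, 900, 3),
    ("DE", False, 1100, 4),
]

def regret_rate(rows):
    """Regrets reported per 10 000 videos watched for a group of volunteers."""
    views = sum(r[2] for r in rows)
    regrets = sum(r[3] for r in rows)
    return 10_000 * regrets / views if views else 0.0

english = [r for r in volunteers if r[1]]
other = [r for r in volunteers if not r[1]]

rate_en, rate_other = regret_rate(english), regret_rate(other)
print(f"English-primary: {rate_en:.1f} regrets per 10k videos")
print(f"Other countries: {rate_other:.1f} regrets per 10k videos")
print(f"Difference: {100 * (rate_other / rate_en - 1):.0f}% higher")
```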

What we find bizarre is how content that violates YouTube's own terms of service has flourished on the platform. Even more worrying is the prevalence of borderline content. This is content that doesn't necessarily infringe on YouTube's terms of service but pushes the boundary.

Mozilla says that of the videos volunteers reported, 189 have since been taken down, but not before they racked up a collective 160 million views over five months.

How can YouTube address this?

Mozilla’s report outlines a number of ways YouTube and other platforms can restrict the spread of harmful content.

There are some good ideas in those recommendations, but they require that platform operators acknowledge there is a problem and institute new controls and reporting methods to address it.

There’s also the matter of YouTube’s algorithm being something of an indecipherable black box at this stage.

“There are many experts who argue that these problems are not actually errors with the algorithm—rather, they are the output of YouTube’s algorithm working exactly how it should and that there is a fundamental misalignment between algorithms optimized to further business incentives and those optimised for the well-being of people. That may well be true. What is definitely true is that algorithms that are this consequential should not be deployed without proper oversight. And transparency is an essential first step,” Mozilla explains.

You can find the full report from Mozilla, titled YouTube Regrets: A crowdsourced investigation into YouTube's recommendation algorithm, here.
