YouTube algorithm surfaces potentially harmful videos, study finds

YouTube's algorithm recommends objectionable, controversial, or otherwise problematic videos to its users, according to the results of a new crowdsourced study.

Credit: Szabo Viktor

The study, published by Firefox maker Mozilla, documented thousands of instances of YouTube recommending "regrettable" videos, a broad category that includes hate speech, violence, and misinformation.

According to the research, 71% of the videos flagged as problematic were surfaced by YouTube's own recommendations. The flagged videos also tended to be much more popular, suggesting that the recommendation system favors controversial content.

About 9% of the "regrettable" videos were later pulled by YouTube for violating the company's own policies. The investigation also found that users in non-English-speaking countries were recommended problematic content at a 60% higher rate.

To fix the YouTube recommendation problem, Mozilla calls for "common sense transparency laws, better oversight, and consumer pressure."

"Research by Mozilla and countless other experts has confirmed that there are significant harms associated with YouTube," the organization wrote. "YouTube has made it clear that they are taking the wrong approach to managing this responsibility. We will only get closer to the right approach with greater openness, transparency, accountability, and humility."

Mozilla conducted the study over a 10-month period through Firefox and Chrome extensions that let users report "regrettable" content, that is, videos they regretted watching.

In total, Mozilla gathered 3,362 reports submitted by 1,622 individual users across 91 countries between July 2020 and June 2021. After collecting the data, Mozilla enlisted 42 University of Exeter researchers to review the submissions and determine whether the videos violated YouTube's policies.
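
To make the shape of the crowdsourced data concrete, here is a minimal sketch of how such reports might be recorded and tallied into the headline percentages cited above. The record fields and function names are hypothetical illustrations; Mozilla has not published its extension's actual schema here.

```python
from dataclasses import dataclass

# Hypothetical record for a single crowdsourced "regret" report.
# Field names are illustrative assumptions, not Mozilla's real schema.
@dataclass
class RegretReport:
    video_id: str
    reporter_country: str     # where the reporting volunteer is located
    via_recommendation: bool  # True if YouTube's algorithm surfaced the video
    later_removed: bool       # True if YouTube later took the video down

def summarize(reports: list[RegretReport]) -> dict[str, float]:
    """Compute headline percentages like those cited in the study."""
    total = len(reports)
    return {
        "pct_from_recommendations":
            100 * sum(r.via_recommendation for r in reports) / total,
        "pct_later_removed":
            100 * sum(r.later_removed for r in reports) / total,
    }
```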

As for a more specific definition of a "YouTube Regret," Mozilla's report describes it as "hate speech, debunked political and scientific misinformation, and other categories of content that would likely ... violate YouTube's Community Guidelines." It can also include "borderline content" that skirts the edges of YouTube's policies without technically violating them, content that might lead viewers down "dangerous paths." Mozilla first identified the concept of a "YouTube Regret" in a 2019 crowdsourced study in which respondents self-identified content as "regrettable."

The full results of the study, including the researchers' analysis, are available in Mozilla's published report.


Comments

  • Reply 1 of 7
    spheric Posts: 2,564 member
    We’ve known this for years. Extremism and controversy bind eyeballs: the more extreme, the longer the attention.

    YouTube’s algorithms are designed to automatically lead you down rabbit-holes, often radicalising people on the way. 
    edited July 2021
  • Reply 2 of 7
    Japhey Posts: 1,767 member
    spheric said:
    We’ve known this for years. Extremism and controversy bind eyeballs: the more extreme, the longer the attention.

    YouTube’s algorithms are designed to automatically lead you down rabbit-holes, often radicalising people on the way. 
    For some reason, your comment reminded me of this scene from “Private Parts”:

    https://youtu.be/9G6xu-J_Dmc
  • Reply 3 of 7
    Alex_V Posts: 218 member
    Dear Mozilla. Please build an open-source video platform to take on YouTube. Regards, Alex 
  • Reply 4 of 7
    amar99 Posts: 181 member
    When terms such as "harmful" and "regrettable" are so loosely defined and so subjectively dependent on the viewer's own views, I find it misleading to present such labels as objective. "Do better" winds up meaning "please everyone", a lofty goal indeed. (Not saying there isn't crap on YT, just that different people can have different opinions about the same content.)
  • Reply 5 of 7
    spheric Posts: 2,564 member
    amar99 said:
    When terms such as "harmful" and "regrettable" are so loosely defined and so subjectively dependent on the viewer's own views, I find it misleading to present such labels as objective. "Do better" winds up meaning "please everyone", a lofty goal indeed. (Not saying there isn't crap on YT, just that different people can have different opinions about the same content.)
    Whether somebody turns into a terrorist is objectively quantifiable. 
  • Reply 6 of 7
    It's called clickbait and it works.