
YouTube’s ‘dislike’ barely works, according to new study on recommendations

YouTube’s in-app controls, such as the “dislike” button, are largely ineffective as a tool for controlling recommendations, according to new research from Mozilla.

If you’ve ever felt like it’s difficult to “un-train” YouTube’s algorithm from suggesting a certain type of video once it slips into your recommendations, you’re not alone. In fact, it may be even more difficult than you think to get YouTube to accurately understand your preferences. One major issue, according to new research conducted by Mozilla, is that YouTube’s in-app controls, such as the “dislike” button, are largely ineffective as a tool for controlling suggested content. According to the report, these buttons “prevent less than half of unwanted algorithmic recommendations.”

Researchers at Mozilla used data gathered from RegretsReporter, its browser extension that allows people to “donate” their recommendations data for use in studies like this one. In all, the report relied on millions of recommended videos, as well as anecdotal reports from thousands of people.

Mozilla tested the effectiveness of four different controls: the thumbs down “dislike” button, “not interested,” “don’t recommend channel” and “remove from watch history.” The researchers found that these had varying degrees of effectiveness, but that the overall impact was “small and inadequate.”

Of the four controls, the most effective was “don’t recommend channel,” which prevented 43 percent of unwanted recommendations, while “not interested” was the least effective and only prevented about 11 percent of unwanted suggestions. The “dislike” button was nearly the same at 12 percent, and “remove from watch history” weeded out about 29 percent.

In their report, Mozilla’s researchers noted the great lengths study participants said they would sometimes go to in order to prevent unwanted recommendations, such as watching videos while logged out or while connected to a VPN. The researchers say the study highlights the need for YouTube to better explain its controls to users, and to give people more proactive ways of defining what they want to see.

“The way that YouTube and a lot of platforms operate is they rely on a lot of passive data collection in order to infer what your preferences are,” says Becca Ricks, a senior researcher at Mozilla who co-authored the report. “But it’s a little bit of a paternalistic way to operate where you’re kind of making choices on behalf of people. You could be asking people what they want to be doing on the platform versus just watching what they’re doing.”

Mozilla’s research comes amid increased calls for major platforms to make their algorithms more transparent. In the United States, lawmakers have proposed bills to scale back “opaque” recommendation algorithms and to hold companies accountable for algorithmic bias. The European Union is even further ahead: the recently passed Digital Services Act will require platforms to explain how recommendation algorithms work and to open them to outside researchers.
