YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest

New research shows that a person’s ideological leaning might affect what videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories, and otherwise problematic content.

  • serial_crusher@lemmy.basedcount.com
    1 year ago

    Subjective biases can play a huge part in stuff like this. The researchers behind this story had to go through a bunch of YouTube channels and decide whether each one constitutes extremist right-wing content or not.

    I think it’s a safe assumption that if you took the people consuming that content and asked them whether the video they just watched was right-wing extremist content, most of them would say no.

    So, it’s possible that you don’t think you’re being overwhelmed with right-wing extremist content, but that somebody else looking at your viewing history would think you are.