Researchers say extreme content being pushed on young people and becoming normalised

  • postnataldrip@lemmy.world · +59 · 9 months ago

    It’s well known that these algorithms push topics to drive engagement, and content that makes people angry, frightened, or disgusted is naturally more likely to be engaged with, regardless of what the topic is.

    • kat_angstrom@lemmy.world · +26/−1 · 9 months ago

      When outrage is the prime driver of engagement it’s going to push some people right off the platform entirely, and the ones who stay are psychologically worse off for it.

      • Imgonnatrythis · +15 · 9 months ago

        Social media execs: “We’ve done the math and it’s worth it.”

        • kat_angstrom@lemmy.world · +3/−1 · 9 months ago

          Worth it for them, for short-term profits. Good thing nobody is considering the net effect this has on society or political discourse.

    • JoBo@feddit.uk · +5 · 9 months ago (edited)

      They could certainly do with a control group or three. The point they’re trying to make is that over 5 days of watching recommended videos, the proportion that were misogynistic grew from 13% on day 1 to 52% on day 5. That suggests a disproportionate algorithmic boost, but it’s hard to tell how much of that was caused by the videos they chose to view.

      A real-world trial ought to be possible. You could recruit thousands of kids to just do their own thing and report back. It’s a very hard question to study in the lab, because lab conditions are nothing like the real world.