The lead plaintiff in the class action lawsuit, Fumiko Lopez, alleged that Apple devices improperly recorded their daughter, who was a minor, mentioning brand names like Olive Garden and Air Jordans, and then served her advertisements for those brands in Apple’s Safari browser. Other named plaintiffs alleged that their Siri-enabled devices entered listening mode without them saying “Hey Siri” while they were having intimate conversations in their bedrooms or talking with their doctors.

In their suit, the plaintiffs characterized the privacy invasions as particularly egregious given that a core component of Apple’s marketing strategy in recent years has been to frame its devices as privacy-friendly. For example, an Apple billboard at the 2019 Consumer Electronics Show read “What happens on your iPhone, stays on your iPhone,” according to the lawsuit.

The proposed settlement, filed in California federal district court on Tuesday, covers people who owned Siri-enabled devices from September 17, 2014 to December 31, 2024 and whose private communications were recorded by an unintended Siri activation. Payout amounts will be determined by how many Apple devices a class member owned that improperly activated a listening session.

  • Optional@lemmy.world · 2 days ago

    covers people who owned Siri-enabled devices from September 17, 2014 to December 31, 2024 and whose private communications were recorded by an unintended Siri activation.

    How the fuck would we know that?

    • rockSlayer@lemmy.world · 2 days ago

      If you’ve ever been served an ad for something suspiciously soon after a conversation about that thing, you probably qualify

      • reddig33@lemmy.world · 2 days ago

        There’s no proof Siri shared data with advertisers. Apple settled because recordings from accidental activations, including intimate conversations, were sent to a QA team of human reviewers who listened to the audio. Apple has agreed to be upfront about it, publish guidelines, and add an “opt in” setting on devices.

        • pelespiritOPM · 2 days ago

          Yes there is:

          The lead plaintiff in the class action lawsuit, Fumiko Lopez, alleged that Apple devices improperly recorded their daughter, who was a minor, mentioning brand names like Olive Garden and Air Jordans and then served her advertisements for those brands on Apple’s Safari browser.

          Shortly after The Guardian’s report, Apple temporarily suspended all human grading of Siri responses and acknowledged that “we haven’t been fully living up to our high ideals.” The company said it would resume human grading after releasing software updates and that going forward, graders would be given computer-generated transcripts of conversations, rather than the audio itself, and that only Apple employees, and not third-party contractors, would conduct the grading.