• renohren@partizle.com · 1 year ago

    I find it entirely plausible that on-device scanning could take place on current Bionic chips while Apple remains opposed to any server-side data collection: if even one person gets flagged because of a false positive, Apple’s privacy reputation with the general public goes down the drain; the damage to the brand would be too great. The ability to deny any knowledge of their customers’ activities is also the biggest reason for their push to E2E encryption. Collecting CSAM detection data server-side would undermine that plausible-deniability argument across the board.
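    The false-positive concern above comes down to how such systems are typically designed: a perceptual hash of each photo is compared on-device against a blocklist, and nothing is reported unless the number of matches crosses a threshold. A minimal sketch of that idea (hypothetical hashes and threshold, not Apple’s actual NeuralHash pipeline):

    ```python
    # Hypothetical sketch of threshold-based on-device matching.
    # BLOCKLIST and MATCH_THRESHOLD are made-up illustration values.

    BLOCKLIST = {0b1011_0010, 0b1110_0001}  # hypothetical 8-bit perceptual hashes
    MATCH_THRESHOLD = 3  # matches required before anything is flagged

    def hamming(a: int, b: int) -> int:
        """Count differing bits between two hashes."""
        return bin(a ^ b).count("1")

    def is_match(photo_hash: int, max_distance: int = 1) -> bool:
        """A photo 'matches' if its hash is near any blocklisted hash."""
        return any(hamming(photo_hash, h) <= max_distance for h in BLOCKLIST)

    def scan(photo_hashes: list[int]) -> bool:
        """Flag the account only when the match count crosses the threshold."""
        matches = sum(1 for h in photo_hashes if is_match(h))
        return matches >= MATCH_THRESHOLD

    # A single near-miss photo never triggers a report on its own:
    print(scan([0b1011_0011]))                            # False
    print(scan([0b1011_0010, 0b1110_0001, 0b1011_0011]))  # True
    ```

    The threshold is exactly the knob in tension here: set it low and false positives like the one described above become likely; set it high and the system flags less, but a single mistaken report still carries the reputational cost.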

    • Foxygen@partizle.com · 1 year ago

      Not just damage to the brand, but also, a lawsuit. They flatly say they aren’t phoning home with detection results. If they are, that opens them up to legal remedies from people who were lied to.

      Maybe Anker gets away with flat-out lying (about e2e encryption, for example), but a huge publicly traded company on this side of the Pacific is another matter.

    • dragonfornicator@partizle.com · 1 year ago

      If you think about it, Apple’s privacy reputation doesn’t matter to begin with. First, it’s just a reputation: it’s what they claim, not necessarily what they do. They are a multi-trillion-dollar tech giant; you don’t get there with honesty. But regardless, imagine their reputation does go down the drain. The consensus of “if you have nothing to hide, you have nothing to fear,” and the ability to justify everything as “protecting the children” (or, relatedly, “protecting the country from terrorism”), negate the need for that reputation. As sad as it is, most people don’t think much about privacy, at least not with regard to modern technology. People will still buy their products and stay in the ecosystem. Apple is a luxury brand, their products are used as status symbols, their most loyal customers are essentially a cult, and for many it’s all they know. And that’s all assuming such a case even gets big enough to “go viral.”

      • bouncing@partizle.com · 1 year ago

        I think you’re underestimating it. They just introduced e2e encryption for almost all iCloud content. That’s not something there was that much market pressure to implement.

          • bouncing@partizle.com · 1 year ago

            I don’t know that likes apply at all.

            To my understanding it covers all the metadata, though. What’s not included are contacts, calendar, and email, because there’s no way to implement it over CardDAV, IMAP, etc.

      • mikeymike@partizle.com · 1 year ago

        It’s also what they do.

        • Private relay
        • Tracking protection
        • Full e2e encryption where feasible
        • Device encryption
        • App tracking protection

        It’s a brand, yes, but it is absolutely reasonable to ask why they would want to flush that reputation away.

          • mikeymike@partizle.com · 1 year ago

            Source code to confirm it would be nice, but security researchers crawl all over this stuff.

            Also, they have no real incentive to do otherwise. These features don’t just sell products; they also reduce Apple’s administrative load, since Apple then has fewer data requests to deal with.