Instead of scanning iCloud for illegal content, Apple’s tech will locally flag inappropriate images for kids. And adults are getting an opt-in nudes filter too.
I find it very plausible that the scanning happens on-device with current Bionic chips, and that Apple avoids any server-side data collection: if even one person gets flagged because of a false positive, Apple's privacy reputation with the general public goes down the drain; the damage to the brand would be too great. The ability to deny knowledge of anything to do with their customers' activities is also the biggest reason for their push toward E2E encryption. Server-side data collection for CSAM detection would undermine that plausible-deniability argument across the board.
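Since the thread turns on how on-device matching could even work, here is a minimal sketch of the general idea: a simple perceptual "average hash" compared against a local blocklist by Hamming distance. This is emphatically not Apple's actual NeuralHash algorithm; the function names, the 8x8 image size, and the threshold are all illustrative assumptions.

```python
# Toy sketch of on-device perceptual-hash matching (NOT Apple's NeuralHash):
# an "average hash" over an 8x8 grayscale image, compared against a local
# blocklist by Hamming distance. Names and thresholds are illustrative.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values (0-255) -> 64-bit int hash."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: is it brighter than the image's average?
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(pixels, blocklist, threshold=5):
    """Flag only if the hash is within `threshold` bits of a known hash.
    A nonzero threshold tolerates re-encoding and resizing, but it is
    also exactly where false positives come from."""
    h = average_hash(pixels)
    return any(hamming(h, bad) <= threshold for bad in blocklist)
```

The fuzziness is the whole trade-off: near-duplicate images land within a few bits of a blocklisted hash, which is what makes the matching robust to recompression and, at the same time, what makes a false positive possible at all.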
Not just damage to the brand, but also a lawsuit. They flatly say they aren't phoning home with detection results. If they are, that opens them up to legal remedies from people who were lied to.
Maybe Anker gets away with just flat-out lying (about e2e encryption for example) but a huge publicly traded company this side of the Pacific is another matter.
If you think about it, Apple's privacy reputation doesn't matter to begin with. First, it's just a reputation: it's what they claim, not necessarily what they do. They are a multi-trillion-dollar tech giant; you don't get there with honesty. But regardless, imagine their reputation goes down the drain. The consensus of "if you have nothing to hide, you have nothing to fear" and the ability to frame everything as "protecting the children" (or, relatedly, "protecting the country from terrorism") negate the need for that reputation. As sad as it is, most people don't think much about privacy, at least not with regard to modern technology. People will still buy their products and stay in the ecosystem. Apple is a luxury brand, their products are used as status symbols, their most loyal customers are essentially a cult, and for many it's all they know. That is, if such a case even gets big enough to "go viral".
I think you’re underestimating it. They just introduced e2e encryption for almost all iCloud content. That’s not something there was that much market pressure to implement.
That is good for Apple users. Does that include metadata? Locations, timestamps and the likes?
I don’t know that likes apply at all.
To my understanding it's all the metadata, though. What's not included are contacts, calendar, and email, because there's no way to implement it with CardDAV, IMAP, etc.
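For context on what the thread means by E2E here, a minimal toy sketch of the trust model of client-side encryption: a per-file key is derived from a secret that never leaves the device, so the server only ever stores opaque ciphertext. The keystream construction below is a deliberately simplified stand-in for a real AEAD cipher like AES-GCM, and nothing here reflects Apple's actual implementation.

```python
# Toy sketch of client-side ("end-to-end") encryption for cloud uploads.
# Illustrates the trust model only: a real system would use a vetted AEAD
# cipher (e.g. AES-GCM), not this SHA-256-based toy keystream.
import hashlib
import hmac

def derive_file_key(device_secret: bytes, file_id: bytes) -> bytes:
    # Per-file key via HMAC-SHA256; the device secret stays on the device.
    return hmac.new(device_secret, file_id, hashlib.sha256).digest()

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR the data with a SHA-256 counter keystream.
    # Applying it twice with the same key recovers the plaintext.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# What the "server" stores is ciphertext only -- no key, no plaintext.
device_secret = b"never-uploaded-device-secret"
key = derive_file_key(device_secret, b"photo-1234")
ciphertext = keystream_xor(key, b"some photo bytes and metadata")

# Only a holder of the device secret can recover the plaintext.
assert keystream_xor(key, ciphertext) == b"some photo bytes and metadata"
```

This is also why the protocol-bound services are the exception: with CardDAV or IMAP, the server has to read the actual contact or message data to speak the protocol at all, so it can't be handed ciphertext like this.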
It’s also what they do.
It’s a brand, yes, but it is absolutely reasonable to ask why they would want to flush that reputation away.
If they actually do that, great. Nothing against that. I just have an inherent distrust of Fortune 500 companies.
Source code to confirm it would be nice, but security researchers crawl all over this stuff.
Also, they have no real incentive to do otherwise. As product features, these don’t just sell products, they actually reduce the administrative load on Apple because then Apple doesn’t have to deal with as many data requests.