Good morning. Hello. How are you? #458
Apple's CSAM Detection is potentially hella problematic. Or maybe not. They should explain it a bit more so we can tell.
Good morning there, hello there, how are you there? Is it Friday there? Is there a there there? Sorry I don’t know what I was doing there. Oh god I can’t stop.
Just dealt with a PayPal order getting refunded because the person did not make the order, even though they totally did. It was for an autographed copy of Agency. Did you know you can buy my books at my website? You can! Oo let’s make a button for it:
Custom buttons. What fun.
Single-topic GMHHAY today, people. I am sorry. Bear with me.
So, Apple has announced a bunch of new features for iOS “for the children.”
Two of these features seem fine and cool — you can turn on a service that scans your kid’s iMessages to make sure people aren’t sending them dick pics or something, and there are some new child protection features in Siri. Fine.
The third one, though, is, well, weird. And maybe bad? But maybe not? But it is kind of weird and suspicious that Apple made it so you can’t tell if it’s bad or not. I’d chalk that up to nefarious purposes and assume that makes it bad except, you know, I worked with Apple for nearly a decade and there is plenty of reason to assume they just, sort of forgot to reassure people, or assumed their vague wording was reassuring.
Basically, Apple is going to scan iCloud images (probably) for what the biz calls CSAM but you and I would call child pornography or kiddie porn. This is not uncommon. There’s a database of known kiddie porn imagery that the National Center for Missing and Exploited Children maintains. The technical idea is that when a user uploads a photo to your service, you compare that photo to the database and if you get a bunch of hits, then you might report them. NCMEC maintains the database; the exact details of technical implementation are left to the companies.
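(A sketch of that comparison step, in Python, if it helps. Real systems use perceptual hashes, things like Microsoft’s PhotoDNA, that survive resizing and recompression; the cryptographic hash below only catches exact byte-for-byte copies, and the database entry here is just the hash of an empty file, but the shape of the logic is the same.)

```python
import hashlib

# Hypothetical database of known-bad image fingerprints. Real systems use
# perceptual hashes; SHA-256 here is purely for illustration. This entry
# happens to be the SHA-256 of empty input, so the demo below can match it.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def hash_image(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """Compare the photo's fingerprint against the database."""
    return hash_image(image_bytes) in KNOWN_HASHES

print(matches_database(b""))               # True: fingerprint is in the set
print(matches_database(b"holiday photo"))  # False: unknown image
```

Note that the service only ever needs the fingerprint, never the photo itself, which is the property Apple’s design leans on.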
Now, historically, this service has been used at two types of companies: platforms and storage companies. That is, if you use a social network like Twitter or Facebook, and have decided to post photos up there, they will scan those photos and make sure they’re not illegal. That seems fine. These networks are (somewhat) optional to use in life, but also you choose each photo you put up there. You have some control of this process.
With the storage companies, well, typically you choose what to store with them. I mean, I don’t know if Dropbox does this, but I assume they do. But you don’t need Dropbox; there are other backup options out there, if it bothers you to have some large tech company scan everything you ever created. You could also, you know, only store encrypted tarballs or something, like I do with my Backblaze backups. You have some measure of control.
I am, it should go without saying, against child pornography. I have been involved in these services; we implemented the NCMEC database in my time at Tumblr, when I was (briefly) overseeing the Trust and Safety team. They are, by and large, a good thing. These platforms are private companies: they have terms of service, they are optional in your life, and you conform to their rules, just like you can’t, you know, foment rebellion on them (ahem).
Apple says this new feature is an iCloud feature, so it could be argued that this is like the Dropbox feature (maybe we should be using Microsoft here, I know they do it I am only guessing Dropbox does). But! There’s a twist!
Apple is doing this scanning of your photos on the phone before they’re sent to iCloud.
Now. To be fair, Apple would say a couple things here: first, that they’re not “scanning” your photos, they are “processing” them to make a hash, and that hash is the only thing compared to the NCMEC database, and no one sees your photos, ever, unless you get enough positive hits against the database. That’s fine, that’s good, but it is, if you ask me, or probably most people, scanning photos. To make a hash, a machine needs to go through every pixel of the photo and look at it and do a bunch of calculations; the exact mechanism varies but whatever, it looks at and processes your photos. That is scanning.
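(To make that “enough positive hits” bit concrete, here’s roughly what threshold gating looks like. The function names and the threshold value are made up by me; Apple hasn’t published the real number.)

```python
# A sketch of threshold gating. Everything here is a placeholder:
# Apple has not published its actual threshold or implementation.
MATCH_THRESHOLD = 10

def count_matches(photo_hashes, database_hashes):
    """Count how many of a user's photo hashes appear in the database."""
    return sum(1 for h in photo_hashes if h in database_hashes)

def should_flag(photo_hashes, database_hashes, threshold=MATCH_THRESHOLD):
    """Below the threshold, nothing is surfaced to anyone; at or above
    it, the account gets flagged for human review."""
    return count_matches(photo_hashes, database_hashes) >= threshold

db = {"h1", "h2", "h3"}
print(should_flag(["h1", "h9"], db))            # False: one hit, under threshold
print(should_flag(["h1", "h2", "h3"] * 4, db))  # True: twelve hits, over threshold
```

The privacy claim rests on that gate: one stray match reveals nothing, only a pile of them does.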
So, I’ve definitely seen some people say “they’re not scanning your photos!” This is a stretch. I’ve seen some say they’re only doing it “on iCloud.”
This is factually untrue. They are doing it on your phone.
Now. Two things initially concerned me about this.
First: it is unclear if you can opt out of this. It is unclear whether this happens even if you’re not using iCloud. This seems pedantic, but it’s not. If they only do this scanning as a part of the upload process, well, then, okay. I guess it’s fine. Apple has, actually — and they deserve some credit here — thought of a pretty cool way to preserve your privacy here. Do the scanning on the phone, only send the hashes, no one gets to look at your photos unless you get a series of positive hits against the database.
BUT: it is unclear if Apple only does this to photos immediately and imminently destined for iCloud. Apple posted a technical description of the CSAM detection process, but this document notably obfuscates when this scanning is happening, and to which photos. Apple uses the phrase “before an image is stored in iCloud Photos,” which would imply that this happens as part of the upload process only — and, thus, is something you can opt out of by not using iCloud. But it never explicitly says this, and it never explicitly says when this happens. It is a very odd omission. It is, quite possibly, an omission they didn’t realize they were making — maybe to them “before an image is stored in iCloud Photos” is very clear and they think it’s clear it means “as part of the upload process, and only for the photos that are going to be uploaded.” But it doesn’t say that.
One thing I found particularly weird is that not a single one of the three academics who posted glowing reports of the service along with Apple’s press announcements addressed this issue. It’s not super surprising: they were, all three of them, computer science academics. No privacy academic issued one of these assessments. Hrm.
This matters. A user can opt out of iCloud. A user cannot opt out of every photo on their phone being scanned. There is a difference. The latter is a significant escalation in tech company surveillance. This would be a completely different beast than every CSAM detection system out there. This would be huge.
Apple needs to provide clarity here.
Apple is doing this scanning on the phone so it can send your photos to iCloud encrypted. They say this maintains privacy, by making a hash before the encryption instead of doing it all on the server, because if the server had unencrypted photos, or the hash, they could do whatever they want to them, right? Never mind that this, in a way, is all theater, because they can still do whatever they want by maintaining a database of hashes beyond what NCMEC provides and just do it all on the phone. But there are additional questions here, which leads to our second concern.
This is obviously going to lead to scope creep. I mean, we know this. Apple’s only launching this in the US, but so what. It will get launched elsewhere. Countries will insist upon it.
I am, of course, thinking of China here, amongst others.
Apple does some $70 billion of revenue in China per year, has over 200 stores there and, obviously, does most of its manufacturing there. Apple is in no position to resist China’s demands.
So. This whole system. It works by comparing hashes — a hash of your image against a hash of an image in the database. It does not care what the image is. It could be a kiddie porn image, it could be an image of a Tibetan or Uyghur flag. It could be an image of a protester blocking a column of tanks in Tiananmen Square. Just because it’s happening on your phone doesn’t mean it can’t be used for other purposes.
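(And to be clear about just how content-agnostic this is: the matching step is a set membership test, nothing more. Placeholder hash strings here, obviously, but the point stands — whoever controls the database controls what gets flagged.)

```python
# The matching step only ever sees opaque fingerprints; it has no idea
# what the underlying images depict. Hash values here are placeholders.
database = {"hash_of_csam_image"}

def is_match(image_hash: str) -> bool:
    """Membership test against the database. That is the whole check."""
    return image_hash in database

# Repurposing the system is one line of code: whoever controls the
# database decides what gets matched.
database.add("hash_of_tank_man_photo")

print(is_match("hash_of_tank_man_photo"))  # True
```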
For a while, I couldn’t get my head around this. Apple knows this. Does Apple think they have the moral fortitude to resist China’s demands? Do they think they’re too ingrained in China for the government to really threaten to kick them out? Has Apple been reading the news? Do they think if they just don’t launch the feature in China, China won’t, you know, insist “hey, you have to launch that nifty new surveillance feature here”? I mean, shit, this feature seems so convenient for China. Instead of, you know, slogging through lots of data, you can just focus on the repeat offenders, conveniently found by Apple’s cool new algorithms. Does Apple think that their statement saying that for now they’re only launching in America is enough? Do the lives of the people of Hong Kong, Tibet and Xinjiang matter less than American kids? I mean, I am not making these calculations but it sure seems like someone is. Is Apple so sure of its interpretation of geopolitical events that they think they can control this? Even considering the news of the last few months? It feels like madness.
But. Then I realized. All of this is so confusing. Unless, of course, their datacenters in China are already compromised. Oh. Wait. From the New York Times earlier this year:
And in its data centers, Apple’s compromises have made it nearly impossible for the company to stop the Chinese government from gaining access to the emails, photos, documents, contacts and locations of millions of Chinese residents, according to the security experts and Apple engineers.
Awesome. So.
I think the most generous, logical explanation of this new feature is: Apple is only scanning photos destined for iCloud. And they’re doing this on the phone to maintain encryption. For Americans. And there’s no scope creep argument, because the only country Apple must bend over and take it from is China, and they’ve already done that at the datacenter level, so China can already get their Orwellian jollies, and every other country isn’t powerful enough to force Apple to do anything.
And if they aren’t already bending over for America in their datacenters, well, problem solved. They can put out this (really very clever) technical doc about maintaining privacy while only scanning on the phone, and whenever the US Gov’t wants them to scan for something, the government can just, you know, FISA warrant them or something into adding a single hash to their database. I mean, I’m not saying it’d be easy for the government to do this; there would be hurdles, outcry, legal objections, etc., but Apple definitely gave them a nice new roadmap of those hurdles. And all those people complaining about the NSA mass surveillance of yesteryear can just calm down, because we’re not really looking.
Cooooool.
And, still, this interpretation isn’t even yet confirmed. It’s still conceivable — from a close reading of Apple’s tech docs — that they will be scanning every photo on your phone, whether you use iCloud or not.
We’ll just have to wait and see about that.
Let’s do a mix! Been working on this one for a while. Modern synth pop. Fun stuff. Dancy, blippy, boopy. There are two Nation of Language songs on here but whatever they are just fantastic and one of this week’s obsessions, along with Karate, Guns and Tanning, which are just fantastic as well. Man people make so much good music what a world to live in with so much good music.
Happy Friday!