Apple opts everyone into having their Photos analyzed by AI
Homomorphic-based Enhanced Visual Search is so privacy-preserving, iPhone giant activated it without asking
Homomorphic encryption, which lets you compute on encrypted data without ever decrypting it, is actually incredibly cool. It's a shame the conversation will instead start with the fact that Apple shipped the feature switched on by default.
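To give a sense of what "computing on data without decrypting it" means, here is a minimal sketch using a toy Paillier cryptosystem, an additively homomorphic scheme. Apple's production system reportedly uses a lattice-based scheme (BFV) rather than Paillier, and the primes, function names, and values below are purely illustrative and nowhere near secure; this is just the shape of the idea.

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# Educational sketch only -- tiny primes, no padding, not secure.
import math
import random

def keygen(p=293, q=433):
    """Generate a toy Paillier key pair from two small primes."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                       # standard simplification for the generator
    mu = pow(lam, -1, n)            # modular inverse of lambda mod n
    return (n, g), (lam, mu, n)     # public key, private key

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    L = (pow(c, lam, n * n) - 1) // n
    return (L * mu) % n

pub, priv = keygen()
a, b = 17, 25
ca, cb = encrypt(pub, a), encrypt(pub, b)

# The "server" multiplies the two ciphertexts. It never sees 17 or 25,
# yet the product decrypts to their sum.
c_sum = (ca * cb) % (pub[0] ** 2)
assert decrypt(priv, c_sum) == a + b   # 42
```

The point is that whoever holds only the ciphertexts can do useful arithmetic on them but cannot read the inputs or the result; only the key holder can.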
And it's right that this is the conversation, because Apple needs to learn that people want to be in control and that these things need to be opt-in. They can build the most sophisticated privacy-preserving system imaginable; if it's sending your stuff to another server, it needs to ask for permission first, full stop.
They (and every other tech company) have been doing this type of thing for nearly 20 years. You might see some whinging about it in some corners of the Internet, like here, but most people don't know or don't give a shit.
It sucks.
It allows processing data without decrypting it, which is great in terms of preventing someone else from snooping on it, but it doesn't change the fact that Apple's servers are still the ones running the analysis on your photo data, which is the actual issue here.
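To make that concrete: in a homomorphic setup the server can score an encrypted photo embedding against its own reference data without ever holding the plaintext or the key. Below is a compact, illustrative sketch of that flow, again using a toy Paillier scheme rather than the BFV scheme Apple describes; the vector values, dimensions, and helper names are made up for the example.

```python
# Toy demo: a server scores an encrypted feature vector against its own
# plaintext reference vector without ever decrypting the client's data.
# Educational only -- tiny primes, no security.
import math
import random

def keygen(p=293, q=433):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    return (n,), (lam, pow(lam, -1, n), n)

def encrypt(pub, m):
    n = pub[0]
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    return (((pow(c, lam, n * n) - 1) // n) * mu) % n

pub, priv = keygen()
n = pub[0]

# Client side: encrypt a (made-up) 3-dimensional photo embedding.
embedding = [3, 1, 4]
enc_embedding = [encrypt(pub, e) for e in embedding]

# Server side: it holds only ciphertexts plus its own plaintext
# reference vector. Since Enc(m)^k = Enc(m*k), multiplying the powered
# ciphertexts together yields an encryption of the dot product.
reference = [2, 7, 1]
enc_score = 1
for c, w in zip(enc_embedding, reference):
    enc_score = (enc_score * pow(c, w, n * n)) % (n * n)

# Only the client (the key holder) can read the score: 3*2 + 1*7 + 4*1 = 17.
assert decrypt(priv, enc_score) == 17
```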
Reading between the lines, I guarantee they're doing the same thing for CSAM protection. I think sex offenders caused this to happen: I believe Apple found out they were using Photos to host that horrid stuff, and Apple can't just ignore it, so I think we have them to thank.