It’s entirely a nonstarter for entire fucking industries. That’s not hyperbole. I work in one of them.
Edit: scratch that - If any infosec team, anywhere, in any industry, at any corporation or organization, doesn’t categorically refuse to certify for use any system that is running MS Recall, they should be summarily fired and blackballed from the industry. It’s that bad. For real: this is how secrets (as in, cryptographic) get leaked. The exposure and liability inherent to this service is comical in the extreme. This may actually kill the product.
E2: to the title’s implication that such trust can be earned: it kinda can’t. That’s basically the point of really good passwords and secrets (private keys, basically): nobody else knows them. To try to dance around that is fundamentally futile. Also: who am I kidding, this shit will sell like hotcakes. Everyone’s on fucking Facebook, and look how horrifically they exploit everyone’s data for goddamn everything. This isn’t much worse than that to the average mostly-tech-illiterate consumer.
Accounting details, sensitive credentials for sysadmin use, HIPAA data, PII, etc.: there's just so much stuff that's understood to be temporarily unlocked, viewed, and then immediately deleted or locked again. Even home users shouldn't turn this thing on. Check your bank balance? Account details are now always available. Use a password manager? Whatever you looked at is likely captured.
Using it may not be legal for videoconferencing in states and countries where recording without notification is illegal.
Also, legalities aside, if any application displays the contents of one's laptop webcam onscreen, Recall turns that into a running log of snapshots of the camera feed (and then OCRs any text the camera can see). I can see potential problems there.
For all the invasive problems this feature causes, what the fuck does it actually do? The ability to ask an ai what website you were on last Thursday? Who needs this garbage
The concept is useful. A well-known early capture of the idea is Vannevar Bush's famous "As We May Think" article from all the way back in 1945, which conceptualized a machine, the "Memex," that would augment human capabilities such as memory and recall. A lot of people need help with this and use devices for it daily: notes, map lookups of where you parked, Find My for devices, analytics for photo libraries, etc.
It doesn’t transmit the data; it supposedly stores it locally. The issue is it’s a huge convenient plaintext trove of information if the system is compromised.
During setup of your new Copilot+ PC, and for each new user, you're informed about Recall and given the option to manage your Recall and snapshots preferences. If selected, Recall settings will open where you can stop saving snapshots, add filters, or further customize your experience before continuing to use Windows 11. If you continue with the default selections, saving snapshots will be turned on.
What this opens the door to is that MICROSOFT will be able to get your database and ask it questions as if it were talking to you: an AI agent of you that they can do what they like with. This is insanely dangerous.
This, as many users in infosec communities on social media immediately pointed out, sounds like a potential security nightmare.
Copilot+ PCs are required to have a fast neural processing unit (NPU) so that processing can be performed locally rather than sending data to the cloud; local snapshots are protected at rest by Windows’ disk encryption technologies, which are generally on by default if you’ve signed into a Microsoft account; neither Microsoft nor other users on the PC are supposed to be able to access any particular user’s Recall snapshots; and users can choose apps or (in most browsers) individual websites to exclude from Recall’s snapshots.
This all sounds good in theory, but some users are beginning to use Recall now that the Windows 11 24H2 update is available in preview form, and the actual implementation has serious problems.
Security researcher Kevin Beaumont, first in a thread on Mastodon and later in a more detailed blog post, has written about some of the potential implementation issues after enabling Recall on an unsupported system (which is currently the only way to try Recall since Copilot+ PCs that officially support the feature won’t ship until later this month).
The short version is this: In its current form, Recall takes screenshots and uses OCR to grab the information on your screen; it then writes the contents of windows plus records of different user interactions in a locally stored SQLite database to track your activity.
Data is stored on a per-app basis, presumably to make it easier for Microsoft’s app-exclusion feature to work.
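To make the risk concrete: SQLite files have no access control of their own, so anything readable with the user's file permissions can be dumped by any process running as that user (which is exactly how infostealer malware operates). Here's a minimal sketch of that failure mode. The database path, table name, and columns below are hypothetical stand-ins, not Recall's actual schema; the point is only how trivial the read is.

```python
# Sketch (hypothetical schema): why a plaintext SQLite activity log is
# dangerous. SQLite enforces no access control, so any code running with
# the user's file permissions can read everything in it.
import os
import sqlite3
import tempfile

db_path = os.path.join(tempfile.mkdtemp(), "activity.db")

# Simulate the kind of data an OCR-based screen recorder would accumulate.
con = sqlite3.connect(db_path)
con.execute("CREATE TABLE captures (ts TEXT, app TEXT, ocr_text TEXT)")
con.execute(
    "INSERT INTO captures VALUES (?, ?, ?)",
    ("2024-06-01T10:00:00", "browser", "Acct 1234-5678 balance $9,021.55"),
)
con.commit()
con.close()

# "Attacker" step: a second process with the same user-level permissions
# needs nothing more than the file path to exfiltrate every capture.
con = sqlite3.connect(db_path)
rows = con.execute("SELECT app, ocr_text FROM captures").fetchall()
con.close()

for app, text in rows:
    print(f"{app}: {text}")
```

Note that disk encryption doesn't help here: BitLocker protects data at rest, but the database is transparently decrypted for anything running in the logged-in user's session.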