A new Microsoft Windows feature planned for Copilot+ PCs, called “Recall,” has been described by cybersecurity researchers and privacy advocates as a security and privacy nightmare.
Copilot Recall, enabled by default, frequently captures screenshots, or “snapshots,” of a user's activity and stores them in a local database specific to the user's account. The potential for personal and sensitive data to be exposed by this new feature has alarmed security and privacy advocates and even sparked a UK investigation into the issue.
Copilot Recall privacy and security claims disputed
In a lengthy Mastodon thread about the new feature, Windows security researcher Kevin Beaumont wrote: “It's no exaggeration to say that this is the dumbest cybersecurity move of the last decade. I hope my parents can keep their PCs safe.”
In a blog post about Recall's security and privacy, Microsoft says that processing and storage occur only locally on the device and are encrypted, but even Microsoft's own explanation raises concerns: “Note that Recall does not perform content moderation, so information such as passwords or financial account numbers will not be hidden. That data may be in snapshots stored on your device, especially if a site does not follow standard internet protocols, such as cloaking password entry.”
Security and privacy advocates dispute the claim that data is stored safely on the local device. If someone obtains a user's password, or if a court orders the data to be turned over for legal or law enforcement purposes, the amount of data exposed with Recall could be much greater than it would be otherwise. It also gives hackers and malware access to much more data than they would have without Recall.
Beaumont said the screenshots are stored in a SQLite database that “can be accessed as the user through a program or whatever – no physical access is needed at all,” meaning the data could be stolen remotely by malware running in the user's session.
He posted a video (reposted below) in which, he said, two Microsoft employees easily accessed the Recall database folder “while the SQLite database was sitting there.”
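Beaumont's point can be illustrated with a short sketch. The file path and table schema below are hypothetical stand-ins (Microsoft has not published Recall's schema); the demo builds a toy per-user snapshot index and shows the core issue: any program running as the logged-in user can open and query such a database with ordinary file access, no elevation or physical access required.

```python
import os
import sqlite3
import tempfile

# HYPOTHETICAL stand-in for a per-user snapshot index; the real Recall
# path and schema are not assumed here. Stored under the user's own
# writable temp directory, just as Recall's data lives in the user profile.
db_path = os.path.join(tempfile.gettempdir(), "toy_recall_index.db")

# Simulate the indexing side: one row of OCR'd screen text per snapshot.
con = sqlite3.connect(db_path)
con.execute(
    "CREATE TABLE IF NOT EXISTS snapshots "
    "(id INTEGER PRIMARY KEY, captured_at TEXT, ocr_text TEXT)"
)
con.execute("DELETE FROM snapshots")
con.execute(
    "INSERT INTO snapshots (captured_at, ocr_text) VALUES (?, ?)",
    ("2024-06-01T12:00:00", "account number 4111-1111-1111-1111"),
)
con.commit()
con.close()

# Simulate a second, unrelated program in the same user session: it
# simply opens the file and searches it -- no special privileges needed.
con = sqlite3.connect(db_path)
rows = con.execute(
    "SELECT ocr_text FROM snapshots WHERE ocr_text LIKE '%account%'"
).fetchall()
con.close()

print(rows[0][0])
```

The demo retrieves the sensitive string in one query, which is exactly why a user-level infostealer would not need admin rights to harvest this kind of index.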
Does Recall have cloud hooks?
Beaumont also questioned Microsoft's claims that all of this happens locally. “The underlying code for Copilot+ Recall includes a large amount of Azure AI backend code, which is built into the Windows OS,” he wrote on Mastodon. “There are also a ton of API hooks for user activity monitoring.
“This creates a wider attack surface. … They are going all in on this and it will have a significant negative impact on the safety of people who use Microsoft Windows.”
Data may not be completely deleted
Additionally, sensitive data that a user has deleted elsewhere will still be preserved in Recall's screenshots.
“There's no ability to delete screenshots of things you've done while using the PC,” Beaumont said, “so you'd need to remember to clear the screenshots Recall takes every few seconds. And if you or a friend use disappearing messages on WhatsApp, Signal or anything like that, it's bound to be recorded.”
One commenter noted that Copilot Recall also appears to raise compliance issues, such as creating additional unnecessary data that could circumvent deletion requests. “This is a complete PCI and GDPR failure, and the SOC 2 control listing isn't very good either,” the commenter said.
Lesley Carhart, Dragos' director of incident response, responded that the “anger and disbelief are understandable.”
A second commenter said: “The concept of GDPR is very simple: data minimisation. Put simply, only store data that has an actual, legitimate and lawful purpose, and only for as long as necessary. And this is precisely where this approach spectacularly fails. Large amounts of data can be stored without a specific purpose, and for periods far longer than any reasonable use of that data.”
It remains to be seen whether Microsoft will make changes to allay these concerns before Recall officially ships. If it doesn't, security and privacy experts may find themselves busier than ever.