Microsoft Recall is an AI-based tool for Copilot+ PCs running Windows 11 24H2 or above that continuously takes and stores screenshots to log user activity for searching past use. The company relaunched the feature in 2025 after delaying it throughout 2024 due to widespread security criticism. Now opt-in only, Recall requires user permission during initial system setup and claims to operate with robust protections. Yet independent security researchers and institutions have already exposed critical flaws that contradict Microsoft’s reassurances.
Key Takeaways
- Recall takes continuous screenshots and stores them in an encrypted database protected by TPM hardware and Virtualization-based Security (VBS) Enclave operations.
- University of Pennsylvania researchers deemed Recall’s security, legality, and privacy challenges “substantial and unacceptable” and blocked it on managed systems.
- The tool’s filter fails to exclude sensitive data on non-common browsers like Vivaldi, remote desktops like AnyDesk, and payment forms in real-world tests.
- Users can access Recall data via guessed or shoulder-surfed PINs, allowing non-technical attackers to view all activity including deleted content.
- Recall logs interactions from Zoom, Teams, WhatsApp, and Signal with transcripts and self-destructing messages, potentially violating others’ privacy rights.
How Microsoft Recall’s Security Architecture Actually Works
Microsoft designed Recall around three core protections: encryption via TPM hardware keys, Virtualization-based Security (VBS) Enclave isolation, and Windows Hello biometric authentication. The company claims these layers prevent unauthorized access and malware exfiltration. In theory, a user must enroll biometric sign-in—facial recognition or fingerprint—to enable Recall, query the database, and view screenshots. All operations run inside a VBS Enclave, a hardware-isolated container that Microsoft argues protects data even if malware compromises the main operating system.
The problem emerges in the fallback mechanism. Microsoft allows users to authenticate with a PIN after initial biometric setup, ostensibly to prevent data loss if facial recognition fails. This PIN can be guessed, shoulder-surfed, or brute-forced. In testing, a DoublePulsar researcher described a real-world scenario: “She guessed my PIN — I used the one I use for my bank card — as (unknown to me) she had shoulder surfed me at a cash point prior. From there, she just searched for Signal and went through the conversations”. Once inside, an attacker sees everything Recall has logged, including deleted messages and private browsing activity that should have been filtered.
The Filter Problem: Why Sensitive Data Still Gets Captured
Recall attempts to exclude sensitive data—private browser windows, payment forms, passwords—from its screenshot logs. Microsoft Recall security risks become apparent when these filters encounter non-standard software. The tool fails reliably on browsers like Vivaldi, remote desktop applications like AnyDesk, and various payment interfaces. University of Pennsylvania’s Office of Information Security found the filtering approach fundamentally flawed, calling it insufficient to address the tool’s “substantial and unacceptable” privacy and security challenges.
Kaspersky’s analysis highlighted an additional concern: Recall meticulously logs interactions with other users, including Zoom and Teams calls with transcripts, WhatsApp and Signal conversations marked as self-destructing, and one-time photos or videos. These captures potentially violate the privacy rights of people who never consented to being logged by someone else’s Recall system. A person on a call with a Recall user has no way to opt out of being recorded and stored indefinitely on that user’s device.
Prompt Injection and AI Exfiltration Vectors
Security researchers including Michael Bargury have identified a second attack surface: the Copilot AI integration. Because Recall feeds screenshot data to Copilot for semantic searching, an attacker could inject malicious prompts into Copilot that trick the AI into exfiltrating sensitive information from the Recall database. A user might ask Copilot an innocent question, not realizing a compromised website or email has injected a hidden prompt that instructs Copilot to extract and transmit all banking credentials, passwords, or personal messages from the Recall logs. Microsoft’s current security model does not isolate Copilot from the Recall data—both operate on the same device, creating a shared attack surface.
Presence detection, rate-limiting, and anti-hammering measures offer some friction against automated malware, but they do not prevent targeted attacks or insider threats. A sophisticated attacker with physical access, credentials, or network position can work around these defenses. The University of Pennsylvania explicitly recommends disabling Recall via Group Policy on managed systems where data sensitivity justifies the precaution.
Why Microsoft’s Denial of Data Risk Rings Hollow
Microsoft insists Recall poses “no data risk” and that the company has built a “robust set of controls against known threats”. Independent testing contradicts this claim. The PIN bypass alone demonstrates that biometric authentication—Microsoft’s primary security selling point—can be circumvented by a person with a guessed or shoulder-surfed code. The filter failures show that sensitive data enters the database despite exclusion rules. The Copilot integration creates a new exfiltration pathway that Microsoft has not adequately addressed in public documentation.
Microsoft Recall security risks are not theoretical. They are demonstrated in controlled tests by respected security researchers and flagged by university information security offices. The company’s response has been to repackage the feature as opt-in and add authentication layers, but the underlying architectural flaws—continuous screenshot logging, unreliable filtering, fallback authentication via guessable PIN, AI-mediated data access—remain unchanged. Opt-in status does not eliminate the risk; it only shifts responsibility to users who may not understand the implications of enabling a tool that logs everything they do on their computer.
Should You Enable Recall on Your Copilot+ PC?
Unless you work in a highly controlled environment with IT oversight, enabling Recall introduces risk that outweighs the convenience of searching past screenshots. The tool’s filtering cannot be trusted to exclude sensitive data, its PIN fallback can be guessed, and its integration with Copilot creates additional attack surfaces. If you do enable Recall, use a strong, unique PIN that differs from your banking codes, and assume that anything you see on your screen—including private messages, payment details, and one-time codes—will be logged and stored indefinitely.
Can you disable Recall if you don’t want it?
Yes. Recall is opt-in during Windows 11 setup, meaning you can decline to enable it at initial configuration. If you have already enabled Recall, you can disable it through Windows settings or via Group Policy on managed systems. Microsoft provides official instructions for disabling Recall in managed environments.
Does Recall capture content from private browsing windows?
Recall is designed to exclude private browser windows from logging, but the filter is unreliable and fails on non-common browsers like Vivaldi. Assume that some private browsing activity may still be captured, especially if you use less mainstream browsers or remote desktop applications.
What happens to Recall data if I forget my PIN or biometric credentials?
Microsoft allows PIN fallback to prevent permanent data loss, but this same fallback creates the security vulnerability described above. If you forget both your biometric enrollment and PIN, you may lose access to Recall data, though Microsoft has not published detailed recovery procedures for end users.
Microsoft Recall’s relaunch in 2025 does not resolve the core tension between convenience and privacy. The company has added opt-in status and authentication layers, but independent security researchers have already demonstrated that these defenses can be bypassed or circumvented. Until Microsoft redesigns Recall to eliminate screenshot logging, implement foolproof filtering, and isolate Copilot from the Recall database, the tool remains a liability for anyone handling sensitive information on their device. For most users, the smarter choice is to leave Recall disabled and rely on traditional search methods.
This article was written with AI assistance and editorially reviewed.
Source: TechRadar


