Morning Overview

Disable this creepy new Microsoft feature right now or regret it

Microsoft is pushing AI deeper into Windows through a feature called Recall, which takes frequent snapshots of on-screen activity on Copilot+ PCs and uses on-device AI to make that activity searchable. The feature has drawn sharp criticism from security researchers and privacy advocates who argue it creates an unprecedented record of everything a person does on their computer. That concern gains weight against a broader pattern of regulatory findings about how Microsoft-related services can fall short on data protection, including a 2024 decision by the European Data Protection Supervisor (EDPS) involving Microsoft 365 use inside the European Commission. If you don’t need Recall, the safest move is to keep it disabled.

What Recall Does and Why It Alarmed Researchers

Recall works by taking periodic snapshots of a user’s screen, storing them locally, and indexing the content so that an AI assistant can retrieve past activity on demand. The pitch is convenience: users can search for something they saw days or weeks ago, even if they forgot where they found it. But the practical effect is that a running visual log of emails, banking sessions, private messages, medical records, and passwords sits on the device, waiting to be queried. Security researchers quickly showed that early preview builds stored the text extracted from snapshots in a plainly readable database on disk, meaning malware running under the user’s account, or anyone with physical access to the device, could harvest that history wholesale.

Microsoft responded to the initial backlash by delaying the rollout and adding encryption and biometric authentication requirements before Recall data can be accessed. The company also shifted Recall to an opt-in model rather than enabling it by default. Still, the core design remains the same: the feature is intended to capture broad on-screen activity unless a user takes steps to exclude specific apps or websites. That design choice places the burden on the individual to anticipate every sensitive context, rather than building privacy protections into the default experience.

Europe Already Flagged Microsoft’s Data Practices

The privacy questions around Recall do not exist in a vacuum. In March 2024, the European Data Protection Supervisor found that the European Commission’s use of Microsoft 365 infringed the data protection rules governing EU institutions and bodies. The EDPS, which serves as the independent data protection authority for EU institutions, concluded that the Commission failed to provide sufficient safeguards when transferring personal data through Microsoft’s cloud services. The ruling specifically cited failures in purpose specification and inadequate risk assessments for how data processed through Microsoft 365 could be accessed or used.

That finding matters beyond Brussels. It suggests that even a sophisticated institutional customer can fall short of strict data-protection requirements when safeguards, purpose limits, and transfer controls aren’t sufficiently specified and assessed. If an organization as resourced as the European Commission faced these compliance findings around Microsoft 365, it’s a reminder that consumers may have even less visibility and leverage over how data is handled across a large software ecosystem. The EDPS decision dealt with Microsoft 365, not Recall specifically, but the underlying issue is the same: Microsoft’s ecosystem collects and processes data in ways that can outpace the safeguards users or administrators put in place.

Why the Opt-In Fix Falls Short

Microsoft’s decision to make Recall opt-in rather than on-by-default was a direct response to public pressure. On the surface, that change addresses the loudest complaint. But opt-in framing can be misleading if device setup screens or prompts make enabling AI features feel like the default choice. Most people click through setup screens quickly, and a feature presented alongside other productivity tools can easily get toggled on without a full understanding of what it does. The difference between opt-in and opt-out shrinks considerably when the user interface nudges people toward saying yes.

Even for users who knowingly enable Recall, the feature creates a data store that did not previously exist on consumer PCs. Traditional browser history or app logs capture fragments of activity. Recall captures a visual record of the entire screen at regular intervals, which means it can inadvertently record content the user never intended to save, including sensitive documents opened briefly, confidential chats visible in a notification, or financial details displayed in a background tab. The sheer breadth of what gets captured makes it qualitatively different from conventional logging, and it introduces a target that attackers or forensic tools could exploit.

How to Turn Recall Off

For anyone who has already enabled Recall or suspects it may be active, the key step is to stop it from saving new snapshots and to clear what it has already stored. On current Copilot+ builds of Windows 11, open Settings, go to Privacy & security, then Recall & snapshots, turn off snapshot saving, and use the delete controls to remove any stored snapshots. The exact labels can shift between Windows updates, so look for the Recall or snapshots entry under the privacy settings if the layout differs.
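For administrators or users comfortable with PowerShell, the Settings toggle can also be enforced through the policy registry key that backs the “Turn off saving snapshots for use with Recall” Group Policy setting. The sketch below assumes a current Windows 11 Copilot+ build; the key path and value name reflect Microsoft’s published policy, but names like these can change between builds, so verify against Microsoft’s current documentation before relying on it.

```shell
# Run in an elevated PowerShell session on Windows 11.
# Creates the policy key behind "Turn off saving snapshots for use
# with Recall" and sets it to disable snapshot saving system-wide.
$key = "HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsAI"

# Ensure the policy key exists (harmless if it already does).
New-Item -Path $key -Force | Out-Null

# DisableAIDataAnalysis = 1 turns off Recall snapshot saving.
Set-ItemProperty -Path $key -Name "DisableAIDataAnalysis" -Value 1 -Type DWord

# Confirm the value was written.
Get-ItemProperty -Path $key -Name "DisableAIDataAnalysis"
```

A policy set this way overrides the per-user toggle in Settings, which makes it the more durable option on shared or managed machines; a reboot or sign-out may be needed before it fully takes effect.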

The broader lesson here is that new AI features on any platform deserve scrutiny before activation, not after. Default settings are designed to maximize engagement, not to protect the most cautious user. Treating every new AI toggle as something to evaluate rather than accept is the most reliable way to avoid handing over data that is difficult to claw back. Recall may eventually mature into a feature with strong enough protections to justify the tradeoff, but based on available evidence, that bar has not been cleared.

A Pattern That Extends Beyond One Feature

The tension between productivity and privacy in Microsoft’s products is not new, but Recall sharpens it. For years, enterprise customers have negotiated data processing agreements, configured compliance settings, and still run into gaps. The EDPS decision on the European Commission’s use of Microsoft 365 is an authoritative example of that gap playing out at an institutional level, with the EDPS concluding that the Commission’s use infringed Regulation (EU) 2018/1725. That regulation governs how EU institutions handle personal data, and a finding that data flowing through Microsoft’s services lacked adequate safeguards carries real weight for anyone assessing the company’s privacy commitments.

Recall takes that same dynamic and brings it to the individual desktop. Where Microsoft 365 data processing happens largely out of sight in cloud infrastructure, Recall’s screen captures are tangible and local. Users can see exactly what the feature stores, which paradoxically makes the privacy risk easier to grasp but also easier to ignore once the initial novelty fades. The most prudent course for anyone who values control over their digital footprint is to leave Recall disabled until independent audits confirm that the stored data cannot be accessed by unauthorized parties, exfiltrated by malware, or compelled through legal process without the user’s knowledge. Until that evidence exists, the feature represents a liability dressed up as a convenience.


*This article was researched with the help of AI, with human editors creating the final content.