Microsoft is postponing the rollout of its AI-driven Windows Recall feature to conduct further testing and harden its security before making it available for public preview on Copilot+ PCs. Recall was originally scheduled to debut on June 18 alongside the new Copilot+ AI PCs, but the company has decided to delay that launch and instead offer the feature first through the Windows Insider Program (WIP). The change was announced in an update to a recent Windows Recall blog post, which cites feedback from the Windows Insider community as the reason for the revised preview strategy.
The delayed release coincides with a critical ProPublica report accusing Microsoft of prioritizing revenue over security, and with Microsoft President Brad Smith's appearance before the US Congress to address the company's recent security shortcomings. Recall captures screenshots of active windows on a user's PC at regular intervals; an Azure AI model running locally on the device then extracts text and other information from those images and stores it in a SQLite database. Users can run text-based searches against the extracted data to easily retrieve what they were looking at earlier.
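To make that pipeline concrete, here is a minimal sketch of a capture-extract-index-search loop of the kind described above. It is purely illustrative and not Microsoft's implementation: the library choices (Pillow for screen capture, pytesseract for OCR), the database path, and the function names are all assumptions made for the example.

```python
# Illustrative sketch only: capture a screenshot, extract text, store it in
# SQLite, and search it. Recall's real components (its on-device Azure AI
# model, storage format, and schema) are not public in this form.
import sqlite3
import time

from PIL import ImageGrab      # screen capture (Pillow); works on Windows/macOS
import pytesseract             # OCR stand-in; assumes a local Tesseract install

DB_PATH = "recall_demo.db"     # hypothetical path, not Recall's actual store


def init_db(path: str = DB_PATH) -> sqlite3.Connection:
    """Create a simple table of timestamped text snapshots."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS snapshots ("
        "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
        "  captured_at REAL NOT NULL,"
        "  extracted_text TEXT NOT NULL)"
    )
    return conn


def capture_and_index(conn: sqlite3.Connection) -> None:
    """Grab the screen, OCR it, and store the text with a timestamp."""
    image = ImageGrab.grab()
    text = pytesseract.image_to_string(image)
    conn.execute(
        "INSERT INTO snapshots (captured_at, extracted_text) VALUES (?, ?)",
        (time.time(), text),
    )
    conn.commit()


def search(conn: sqlite3.Connection, term: str):
    """Simple substring search over everything captured so far."""
    cur = conn.execute(
        "SELECT captured_at, extracted_text FROM snapshots "
        "WHERE extracted_text LIKE ? ORDER BY captured_at DESC",
        (f"%{term}%",),
    )
    return cur.fetchall()


if __name__ == "__main__":
    conn = init_db()
    capture_and_index(conn)      # Recall would run this on a timer
    for ts, text in search("invoice"):
        print(ts, text[:80])
```

The point of the sketch is the shape of the data flow: everything a user sees is reduced to searchable text on disk, which is exactly why the storage and access controls discussed below matter.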
Privacy advocates and cybersecurity experts have raised concerns about the privacy implications of Windows Recall, warning that it could be exploited to compromise users' data. Microsoft initially planned to enable the feature by default on new Copilot+ AI devices and pointed to BitLocker encryption as a safeguard, but experts noted that BitLocker protects data only at rest: once a user is signed in, the decrypted database is potentially exposed to malware or anyone else with access to the account. Microsoft has responded by announcing that Windows Recall will become an opt-in feature and that the database will remain encrypted until the user authenticates with Windows Hello, though the full extent of the additional security measures remains unclear.
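As a rough illustration of the "encrypted until the user authenticates" pattern Microsoft describes, the sketch below keeps a blob encrypted at rest and refuses to decrypt it until an authentication check passes. It is a generic example using the Python cryptography package, not a depiction of how Windows Hello or BitLocker actually gate the Recall database; the key handling and the `authenticated` flag are placeholder assumptions.

```python
# Generic encrypt-at-rest / decrypt-after-authentication pattern.
# Not Windows Hello, not BitLocker: in practice the key would be held by
# hardware (e.g. a TPM) and released only after platform authentication.
from cryptography.fernet import Fernet


def protect_at_rest(plaintext_db: bytes, key: bytes) -> bytes:
    """Encrypt the serialized database so it is unreadable on disk."""
    return Fernet(key).encrypt(plaintext_db)


def unlock_after_auth(ciphertext: bytes, key: bytes, authenticated: bool) -> bytes:
    """Release plaintext only once the user has authenticated."""
    if not authenticated:
        raise PermissionError("user has not authenticated")
    return Fernet(key).decrypt(ciphertext)


if __name__ == "__main__":
    key = Fernet.generate_key()                      # placeholder for a hardware-held key
    blob = protect_at_rest(b"search index contents", key)

    try:
        unlock_after_auth(blob, key, authenticated=False)
    except PermissionError as exc:
        print("blocked:", exc)

    print(unlock_after_auth(blob, key, authenticated=True))
```

The design question critics raised is where that boundary sits: if the data is decrypted for the whole login session, any code running as the user can read it, which is why per-access authentication is a meaningfully stronger guarantee than whole-disk encryption alone.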