Microsoft’s AI journey had been relatively smooth until it introduced a feature that could become a privacy nightmare. We are talking about the Windows 11 Recall tool, demoed by Satya Nadella and his team at the recent Build 2024 conference. The company showcased how Windows 11 and its AI tech can capture screenshots of your activity, letting you trace back through the history of files, websites or apps you have opened.
Microsoft even claimed that Recall was developed with your security and privacy in mind. However, the latest developments reveal a troubling truth about the Windows 11 AI feature, one that could force the company to roll it back or even consider removing it altogether.
Microsoft has said that Recall operates purely on the device and does not send any of the data to the cloud for processing. The AI tool helps people find items, files and even their search activity on the PC itself. The company has also described Recall as giving your PC a photographic memory of your virtual activity in Windows.
But new details claim that the data stored on your device can be easily accessed by attackers, since it is kept in plain text rather than secured behind encryption. The details about the security issue with Recall come via Kevin Beaumont, a security researcher.
He pointed out that the data captured by Recall and stored on a Windows 11 PC is readable once the person signs into their Microsoft account. Microsoft believes AI can optimise search on PCs with Recall, but the compromises seem bigger than the convenience on offer.
Beaumont also highlighted the ease with which hackers could access this confidential data, especially if the PC is stolen or misplaced and ends up in the wrong hands. Microsoft will really need to fix these issues with Recall before it becomes widely available on Copilot+ PCs later this year. The AI hype is already dogged by concerns about privacy and user data security, and the last thing anyone wants is a large-scale data breach caused by a feature we didn’t really need.