Privacy concerns swirl around Microsoft’s new AI-powered tool

Microsoft’s new Recall feature may be capturing sensitive information you’d prefer not to share, writes Cyber Security Technologist James Galbraith.

CoPilot is an AI-powered tool that coordinates content between Microsoft applications. It includes a feature called Recall.

What are the privacy concerns surrounding the ‘Recall’ feature?

Privacy concerns have surrounded the Recall feature since it launched in May.

One of the primary issues is that everything is locally stored on devices using Recall, meaning that if your system is compromised, malicious actors could access the entire database – essentially giving them screenshots of everything you have done for a given date range.

This was a colossal issue initially, considering that upon release the Recall database was stored in plaintext. Not long after the announcement of the Recall feature, security professionals began testing exploits and hunting for potential vulnerabilities in the Recall system.

One security researcher released a demo tool called ‘TotalRecall’, which could automatically extract and display everything that Recall had recorded.

Researchers quickly realised that the entirety of the Recall database was stored in plaintext, completely unencrypted, meaning that all they had to do was run their tool and then filter through the database with search terms like “password” and “credit card” to access sensitive data.
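To illustrate why plaintext storage was so dangerous, here is a minimal sketch of that kind of attack. Note that the database path, table name, and column names below are hypothetical stand-ins, not Recall’s actual schema – the point is only that an unencrypted SQLite file can be queried by anyone who can read it:

```python
import sqlite3


def find_sensitive(db_path, terms=("password", "credit card")):
    """Scan a plaintext SQLite capture database for sensitive search terms.

    The schema assumed here (a 'captures' table with 'timestamp' and
    'text' columns) is purely illustrative -- it is not Recall's real
    layout. Anyone with read access to the file can run queries like
    this; no credentials are needed if the database is unencrypted.
    """
    conn = sqlite3.connect(db_path)
    hits = []
    for term in terms:
        rows = conn.execute(
            "SELECT timestamp, text FROM captures WHERE text LIKE ?",
            (f"%{term}%",),
        ).fetchall()
        hits.extend(rows)
    conn.close()
    return hits
```

Encrypting the database at rest, as Microsoft later did, defeats exactly this style of offline filtering.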

Since then, Microsoft has taken strides to improve Recall’s local storage, ensuring that the database is encrypted and that access controls around the Recall data are more stringent.

The other major issue with the Recall feature is that originally there was no opt-in process.

Previously, the Recall feature would be enabled by default on CoPilot+ PCs – it could be disabled in the settings, but users would need to be aware that it was enabled in the first place. There was the chance for a large number of devices to be compromised through a feature that their owners didn’t even know was enabled.

Although Microsoft has now made it an opt-in feature, these concerns are still prevalent – there is scepticism around whether users completely understand the implications of enabling it.

To conclude, Microsoft has rolled out changes that mitigate the majority of the issues with Recall.

But would you be comfortable knowing that there was a database storing everything you’ve ever seen on your device, which could potentially be exploited by malicious actors? I know how I feel!

More about Microsoft CoPilot

Microsoft CoPilot is an AI-powered tool built on Large Language Models (LLMs) – a type of AI algorithm that uses deep learning and large datasets to understand and generate content. CoPilot uses Generative Pre-trained Transformer (GPT) models and currently integrates with GPT-4.

What is it used for?

Microsoft CoPilot is used to coordinate content in Microsoft Graph, Word, Excel, PowerPoint, Outlook, Teams, and other Microsoft 365 productivity applications. This integration provides various AI-powered tools that link into the 365 applications you use every day, assisting you with your daily tasks.

What is the ‘Recall’ feature?

The Recall feature allows CoPilot to collect information about the user, what the user is doing on the device, and the device itself. Essentially, every five seconds, Recall takes a screenshot of whatever is on your screen(s). These snapshots are compiled into a locally stored database that you can search manually.
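The capture-and-store loop just described can be sketched roughly as follows. The five-second cadence matches the description above, but the function names, table layout, and the stand-in for the screenshot/text-extraction step are all illustrative assumptions – this is not Microsoft’s implementation:

```python
import sqlite3
import time


def store_snapshot(conn, captured_text, timestamp):
    """Insert one snapshot's extracted text into the local database.

    The 'snapshots' table is an illustrative stand-in for whatever
    schema Recall actually uses internally.
    """
    conn.execute(
        "INSERT INTO snapshots (timestamp, text) VALUES (?, ?)",
        (timestamp, captured_text),
    )
    conn.commit()


def capture_loop(conn, grab_screen_text, interval=5, max_snapshots=None):
    """Every `interval` seconds, capture the screen and store the result.

    `grab_screen_text` is a hypothetical callable standing in for the
    screenshot + text-extraction step; `max_snapshots` bounds the loop
    so the sketch can terminate (the real feature runs continuously).
    """
    taken = 0
    while max_snapshots is None or taken < max_snapshots:
        store_snapshot(conn, grab_screen_text(), time.time())
        taken += 1
        time.sleep(interval)
```

Because every snapshot lands in one local database, a single search over that table is enough to surface anything that was ever on screen – which is precisely why the storage and access controls around it matter so much.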