Recall is designed to use local AI models to screenshot everything you see or do on your computer and then give you the ability to search and retrieve anything in seconds. There’s even an explorable timeline you can scroll through. Everything in Recall is designed to remain local and private on-device, so no data is used to train Microsoft’s AI models.
Despite Microsoft’s promises of a secure and encrypted Recall experience, cybersecurity expert Kevin Beaumont has found that the AI-powered feature has some potential security flaws. Beaumont, who briefly worked at Microsoft in 2020, has been testing out Recall over the past week and discovered that the feature stores data in a database in plain text.
Holy cats, this is way worse than we were told.
Microsoft said that Recall stored its zillions of screenshots in an encrypted database hidden in a system folder. Turns out, they're using SQLite, a free (public domain) database, to store unencrypted plain text in the user's home folder. Which is definitely NOT secure.
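To see why this matters: SQLite files are unencrypted on disk by default, so any process running under the same user account can open the database and query it with off-the-shelf tools, no exploit or special privileges required. A minimal sketch, using an invented path and schema (not Recall's actual layout):

```python
import os
import sqlite3
import tempfile

# Stand-in for an app writing captured text to a plain SQLite file.
# Path and table/column names are hypothetical, for illustration only.
db_path = os.path.join(tempfile.mkdtemp(), "capture.db")
con = sqlite3.connect(db_path)
con.execute("CREATE TABLE captured_text (id INTEGER PRIMARY KEY, text TEXT)")
con.execute("INSERT INTO captured_text (text) VALUES (?)", ("password: hunter2",))
con.commit()
con.close()

# Any other process owned by the same user can read it straight back out.
snoop = sqlite3.connect(db_path)
rows = snoop.execute("SELECT text FROM captured_text").fetchall()
print(rows)  # [('password: hunter2',)]
```

The same one-liner works from `sqlite3` on the command line, which is exactly the concern: "encrypted at rest" via disk encryption like BitLocker protects against a stolen laptop, not against malware or another user process already running on the machine.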
Further, Microsoft refers to Recall as an optional experience. But it's turned on by default, and turning it off is a chore. They buried it in a control panel setting.
They say certain URLs and websites can be blacklisted from Recall, but only if you're using Microsoft's Edge browser! But don't worry: DRM-protected films & music will never get recorded. Ho ho ho.
This whole debacle feels like an Onion article but it's not.
Luckily(?) Recall is currently only available on Windows 11, but I fully expect Microsoft to try and shove this terrible thing onto unsuspecting Win10 users via Windows Update.