Attacks on Microsoft’s Copilot AI allow for answers to be manipulated, data extracted, and security protections bypassed, new research shows.
I mean, it's already a phishing system by design.
Submitted 4 months ago by BrikoX@lemmy.zip to technology@lemmy.zip
https://www.wired.com/story/microsoft-copilot-phishing-data-extraction/
Jozzo@lemmy.world 4 months ago
Note that the attacker needs to already have access to your Microsoft 365 account to do any of this. Fuck Copilot and all, but there's nothing here an attacker couldn't already do before.