Microsoft has confirmed that a bug allowed its Copilot AI to summarize customers’ confidential emails for weeks without permission.
The bug, first reported by Bleeping Computer, allowed Copilot Chat to read and outline the contents of emails since January, even when customers had data loss prevention policies in place meant to stop their sensitive information from being ingested into Microsoft’s large language models.
Copilot Chat is the AI-powered chat feature available to paying Microsoft 365 customers across Microsoft’s Office products, including Word, Excel, and PowerPoint.
Microsoft said the bug, which admins can track as issue CW1226324, means that draft and sent email messages “with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat.”
The tech giant said it began rolling out a fix for the bug earlier in February. A spokesperson for Microsoft did not respond to a request for comment, including a question about how many customers were affected by the bug.
Earlier this week, the European Parliament’s IT department told lawmakers that it had blocked the built-in AI features on their work-issued devices, citing concerns that the AI tools could upload potentially confidential correspondence to the cloud.
