DLP Policy for Copilot Ignored By Software Glitch
Updated: 19 February 2026
An embarrassing security glitch surfaced in Microsoft 365 when the DLP policy designed to stop Copilot Chat from processing email and files stamped with specific sensitivity labels failed to do so, allowing confidential material to appear in Copilot responses. According to service health advisory CW1226324 (3 February 2026), the root cause is a “code issue” that allows “items in the Sent Items and Drafts folders to be picked up by Copilot even though confidential labels are set in place.” Items in other folders don’t appear to be affected.
Customers first reported the issue on 21 January 2026, so the flaw was active for almost two weeks before Microsoft acknowledged that the problem was real.
How the DLP Policy for Copilot Works
Figure 1 shows a DLP policy rule of the type affected by the problem. The rule mandates that any email or document (Office files or PDFs) stamped with the Confidential label be excluded from processing by Copilot for Microsoft 365.
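The intended effect of such a rule can be sketched in a few lines of Python. This is purely an illustrative model, not Microsoft's implementation: the names and data structures are invented for clarity. The point is that a correct check depends only on the sensitivity label, never on the folder holding the item, which is exactly the invariant the code issue broke for Sent Items and Drafts.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative model only: names and logic are assumptions, not Microsoft's code.
EXCLUDED_LABELS = {"Confidential"}


@dataclass
class Item:
    subject: str
    folder: str               # e.g. "Inbox", "Sent Items", "Drafts"
    label: Optional[str]      # sensitivity label, if any


def copilot_may_process(item: Item) -> bool:
    """A correct DLP check: exclusion depends only on the label, never the folder."""
    return item.label not in EXCLUDED_LABELS


items = [
    Item("Project Bunny update", "Sent Items", "Confidential"),
    Item("Lunch plans", "Inbox", None),
]
for item in items:
    print(item.subject, "->", "processed" if copilot_may_process(item) else "blocked")
```

Under this model, the reported bug amounts to the check being skipped (or folder-gated) for Sent Items and Drafts, so labeled items in those folders slipped through while the same items in other folders were blocked.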

With the DLP policy for Copilot in place, I created and sent a message stamped with the Confidential label. The message refers to a fictitious Project Bunny. When I asked Copilot Chat to search for information about Project Bunny, Copilot responded that it could find an internal reference (the email) but could not disclose the content because it is marked as sensitive internal correspondence (i.e., the sensitivity label).

Problem Fix Rolling Out
According to CW1226324, my tenant was affected. I didn’t notice the problem until it was drawn to my attention by a comment posted to Office365ITPros.com. By the time I tested, Microsoft had fixed the problem, and the fix had reached my tenant. According to an update posted on 10 February 2026, the remediation for the issue is rolling out. Or, in Microsoft terms, “saturates across the affected environments,” which is how they describe a software update reaching all the servers that need to be patched.
Microsoft says that users have reported that the deployed fix has resolved the issue. They continue to monitor the progress of the fix and will provide a follow-up update on 24 February 2026. Microsoft says that they expect “full remediation” by the time of their next update.
If, as it seems, the problem is fixed, Microsoft should generate a post-incident report (PIR) to explain what happened. It would be interesting to know if the code issue has always existed or was introduced as the result of some change.
Software Problems Happen
Everyone working in IT realizes the potential for software issues to occur. But problems like this raise the question of how Microsoft tests software before release. Confidential messages do end up in the Sent Items folder, so on the surface, checking email in that folder seems like an easy test case for the DLP policy for Copilot.
It can be argued that Copilot Chat exposing messages in the Drafts folder matters less because these items are personal to the signed-in user and haven’t been shared with anyone else. Nevertheless, a policy is a policy, and Copilot should not violate the DLP policy.
A Good Reminder
On the upside, the incident serves as a useful reminder to the Microsoft 365 tenants that support the 15 million paid seats for Microsoft 365 Copilot that confidential information should be protected from AI tools. The DLP policy for Copilot should be used in all of these tenants, as should Restricted Content Discovery (RCD) for SharePoint Online. RCD is a SharePoint Advanced Management feature available to every tenant with Microsoft 365 Copilot licenses; it removes sites containing sensitive or confidential information from Copilot’s view.
For more information about how the DLP policy for Copilot works, see the Microsoft documentation.
Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365. Only humans contribute to our work!