Administrators Can Easily View Copilot Diagnostic Logs through Microsoft 365 Admin Center
An interesting and important request posted to the Microsoft 365 Copilot feedback forum raises the issue of administrator access to Copilot prompts. The Copilot section of the Microsoft 365 admin center contains a setting to send diagnostic logs to Microsoft on behalf of individual users (Figure 1).

Microsoft says that the feature “helps Microsoft receive comprehensive diagnostic data to aid in debugging, especially in cases where users may not be able to provide feedback themselves. By providing feedback on behalf of your users, you can help enhance the overall experience of Copilot for your organization by improving the quality and relevance of its responses.”
What’s in the Copilot Diagnostic Logs
Administrator submission of Copilot diagnostic logs involves the collection of diagnostic information generated for one or more users as they interact with Microsoft 365 Copilot through a selected application. For example, selecting Microsoft 365 (Office) captures details of interactions through Copilot Chat (BizChat). You can select only a single application to report, probably because an issue is likely to be restricted to how Copilot works through one interface.
The logs can capture up to 30 interactions selected from a date range going back a maximum of 30 days. The logs are generated in JSON format. The basic problem is that user prompts are revealed for all to see, including the administrator who generates the logs. Clicking on the link for the log file opens the file in a browser tab. Figure 2 shows the diagnostic information captured when I prompted BizChat to explain why the content of Copilot diagnostic logs is not obfuscated.

The prompt submitted by the user and the responses generated by Copilot are very clear. On the one hand, it’s easy to understand why this should be so. If the content was obfuscated in some manner, it would be harder for support engineers to interpret. On the other hand, the ease of access to what could be highly confidential and sensitive information is troubling.
Keeping AI Interactions Private
Many people have become accustomed to using AI tools to explore concepts, ideas, and feelings. Those interactions might turn into strategic initiatives or address personal questions. They might examine options to solve HR problems or improve business tactics. In short, the Copilot diagnostic logs capture information that people probably don’t want to share with their tenant administrator.
People might reluctantly agree to share the information with Microsoft to debug a problem with Copilot, but the thought that someone in the organization can access Copilot prompts and responses without any oversight is not something that most users will be comfortable with.
The current situation creates problems at many levels, including how tenants make sure that user privacy is respected. Protecting confidential information is a strong point in the argument to use Microsoft 365 Copilot over competitor solutions like the connectors to Microsoft 365 from OpenAI and Anthropic.
Administrators can use other methods to investigate Copilot interactions, but exporting the Copilot diagnostics log is easier and faster. In addition, as far as I can tell, the Microsoft 365 admin center does not capture an audit record when an administrator exports the Copilot diagnostic log. It’s not good when such an obvious gap in data privacy is revealed.
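Tenants can verify the apparent audit gap for themselves by exporting audit records from a Purview audit search and filtering for anything Copilot-related. The sketch below is a minimal example, assuming records in the standard unified audit log schema (Operation, Workload, UserId properties); because Microsoft documents no operation name for the diagnostic log export, the filter deliberately casts a wide net rather than matching a specific event.

```python
from typing import Iterable

def find_copilot_audit_events(records: Iterable[dict]) -> list[dict]:
    """Return audit records whose Operation or Workload mentions Copilot.

    Expects records shaped like unified audit log entries. If exporting
    Copilot diagnostic logs generated an audit event, it should surface here.
    """
    return [
        r for r in records
        if "copilot" in str(r.get("Operation", "")).lower()
        or str(r.get("Workload", "")) == "Copilot"
    ]

# Example with mock records (real exports contain many more properties):
sample = [
    {"Operation": "CopilotInteraction", "Workload": "Copilot",
     "UserId": "user@contoso.com"},
    {"Operation": "FileAccessed", "Workload": "OneDrive",
     "UserId": "user@contoso.com"},
]
matches = find_copilot_audit_events(sample)
```

Running the same filter over records captured around the time of a diagnostic log export shows whether any trace of the administrator action exists; in my testing, none did.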
Protecting User Privacy is Important
Microsoft provides tenants with an option to obfuscate usage report data to stop administrators from seeing usage data for individual users by masking user-referenceable data like display names and email addresses. Knowing that someone has sent so many messages or created 43 documents in the last month is not an earth-shattering breach of personal information. Even so, Microsoft responded to customer requests to deliver the option to conceal usage data in reports, for both the reports shown in the Microsoft 365 admin center and those generated by the Microsoft Graph usage reports API (including the Copilot usage report API).
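The obfuscation setting can be toggled programmatically through the Graph adminReportSettings resource. Here is a minimal sketch, assuming a valid access token holding the ReportSettings.ReadWrite.All permission; only the payload builder is specific to the setting, the rest is plain HTTP plumbing.

```python
import json
import urllib.request

REPORT_SETTINGS_URL = "https://graph.microsoft.com/v1.0/admin/reportSettings"

def build_conceal_payload(conceal: bool) -> dict:
    # displayConcealedNames = True masks display names, email addresses,
    # and other user-identifiable fields in usage report output.
    return {"displayConcealedNames": conceal}

def set_concealed_names(access_token: str, conceal: bool = True) -> int:
    """PATCH the tenant-wide report settings via Microsoft Graph."""
    request = urllib.request.Request(
        REPORT_SETTINGS_URL,
        data=json.dumps(build_conceal_payload(conceal)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```

A matching GET against the same URL returns the current value, so a script can confirm the change took effect. No equivalent setting exists today for the Copilot diagnostic logs, which is the heart of the problem.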
Revealing the full gamut of someone’s conversation with Microsoft 365 Copilot to administrator eyes is a much more serious matter. Microsoft needs to fix this loophole fast by blocking administrator access to user prompts and responses. Adding auditing whenever an administrator generates Copilot diagnostic logs wouldn’t go amiss either. Please vote for the feedback item to show your support.
Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365. Only humans contribute to our work!