Microsoft has enhanced the DLP policy for Copilot to cover Office files held in any storage location instead of only Microsoft 365 locations like SharePoint Online and OneDrive for Business. The change is made in the Office augmentation loop, a little-known internal component that coordinates use of connected experiences by apps. Extending the DLP policy to cover all locations makes perfect sense.
Microsoft would very much like Microsoft 365 tenants to use Copilot instead of ChatGPT. A recent comparison between Copilot and ChatGPT outlines some areas that Microsoft thinks are important when deciding which AI tool to use. Microsoft has a point because Copilot is embedded into Microsoft 365 whereas ChatGPT is more of an add-on. The competition for hearts and minds is very intense in the AI space.
A code error allowed Copilot Chat to expose confidential email. Microsoft is fixing the problem, but it’s a reminder of how AI can expose information if Microsoft 365 tenants don’t use the available features to restrict AI access. Those features need to be configured and deployed, but that doesn’t take much effort. It’s better than users complaining when Copilot exposes their most secret thoughts.
A LinkedIn post explained how the UK Revenue and Customs authority trained 30,000 people to use Microsoft 365 Copilot effectively. It’s a reminder that introducing complex software to a user community takes careful planning and support, including the provision of well-planned training to help people exploit the new software as quickly as possible. Otherwise, some of those expensive licenses might be wasted.
After the fuss around the initial introduction of the Anthropic models into Microsoft 365 in September, we learn that Microsoft will enable access for all in January 2026. It would have been so much better had Microsoft said that they were working on the data protection arrangements with Anthropic, but that didn’t happen. Is all well now? We’ll see in January…
A new DLP policy for Copilot prompts monitors for sensitive information types, like credit card numbers, and blocks their use in Copilot prompts. The new policy can’t be combined with the existing DLP policy for Copilot, which checks for files with specific sensitivity labels to prevent Copilot from using their content in its responses. But that’s OK because the two policies do very different work.
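Sensitive information type detection for card numbers typically pairs a pattern match with a Luhn checksum to cut down false positives. Here’s a minimal Python sketch of that idea (an illustration of the technique, not Purview’s actual implementation):

```python
import re

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    if len(digits) < 13:          # too short to be a payment card number
        return False
    checksum = 0
    # Double every second digit from the right, subtracting 9 if the result > 9
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d = d * 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def prompt_contains_card(prompt: str) -> bool:
    """True if the prompt contains a digit run that looks like a card number."""
    candidates = re.findall(r"\b(?:\d[ -]?){13,19}\b", prompt)
    return any(luhn_valid(c) for c in candidates)
```

A prompt like “charge it to 4111 1111 1111 1111” trips the check, while ordinary numbers (phone numbers, order IDs that fail the checksum) pass through.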
A temporary chat with Microsoft 365 Copilot is one that forgets everything discussed in the conversation once the chat is over. The idea is that by leaving no trace, Copilot won’t recycle the ideas discussed in the chat later. Copilot absolutely discards the chat thread, but those pesky compliance records remain behind, ready for eDiscovery and other compliance investigations.
The question was asked whether it’s possible to identify use of the Claude LLM by the Copilot Researcher agent. Audit records often help, so they’re the natural place to check. As it turns out, some information is captured when the Researcher agent is used, but figuring out if the agent uses the default OpenAI GPT-5 model or a Claude LLM is a matter of intuition (or guesswork).
Agenda auto-draft is a new feature for OWA and the new Outlook to help meeting organizers create a draft meeting agenda using AI. The Copilot-generated draft agenda contains an introduction and some bullet points created from the meeting subject. It’s not a make or break feature for Microsoft 365 Copilot. Some will like it, if they discover how to use agenda auto-draft.
The Copilot usage report Graph API is now generally available. Like the report APIs for the other workloads, the Copilot usage API helps to understand usage of some very expensive licenses. Even better, the usage data can be combined with data from other Microsoft 365 sources to produce interesting and valuable insights. All it takes is some PowerShell to knit everything together.
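One obvious use of the usage data is spotting license holders who haven’t touched Copilot in a while. The articles do this with PowerShell against the `getMicrosoft365CopilotUsageUserDetail` report endpoint; here’s the same logic as a self-contained Python sketch with sample rows standing in for the API output:

```python
from datetime import date, timedelta

# Sample rows shaped like the Copilot usage report output; in a real tenant
# these come from the getMicrosoft365CopilotUsageUserDetail Graph API call.
usage = [
    {"userPrincipalName": "kim@contoso.com", "lastActivityDate": "2025-11-20"},
    {"userPrincipalName": "lee@contoso.com", "lastActivityDate": "2025-08-01"},
]

def inactive_users(rows, as_of: date, days: int = 60):
    """Return UPNs with no Copilot activity in the last `days` days."""
    cutoff = as_of - timedelta(days=days)
    return [r["userPrincipalName"] for r in rows
            if date.fromisoformat(r["lastActivityDate"]) < cutoff]

print(inactive_users(usage, date(2025, 12, 1)))  # ['lee@contoso.com']
```

The 60-day threshold is arbitrary; pick whatever makes an expensive license look wasted in your organization.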
Microsoft 365 Copilot Search can be extended by ingesting information from external sources through a Microsoft 365 Copilot Connector. In this article, we show how to configure the Enterprise websites prebuilt connector to ingest articles from the Office365ITPros.com and Practical365.com sites, and how Copilot Search presents that information in its results and summaries. It’s quick, easy, and seamless – so really pretty good!
On September 24, Microsoft announced that Anthropic LLMs could be used with the Copilot Researcher agent and to build agents with Copilot Studio. Although it’s great to enable choice so that customers can choose the AI model they prefer, questions about data security, lack of support for compliance solutions, and adherence to standards like the EU data boundary will concern Microsoft 365 tenants.
The rollout of the Copilot Chat integration with the Microsoft 365 apps has started, with the intention of making it easier to use AI in people’s work. Nice as the integration is, the news that an Open in Word action button is coming (soon) to allow content generated by Copilot to be edited in Word is even better. And we round out the week with a note about a change to the domain used by Teams.
Microsoft 365 Copilot now has some SharePoint skills to deploy in the SharePoint admin center. The problem is that the skills aren’t very good and don’t do much to help hard-pressed SharePoint Online administrators cope with the vast explosion of sites that exist in many tenants today. The problem is data. If Copilot doesn’t have the information to reason over, it can’t answer questions or give advice.
A new SharePoint Site content and policy comparison report is available to tenants with Microsoft 365 Copilot or SharePoint advanced management licenses. The idea is that you choose some reference sites to compare other sites against to detect deviations from the reference site. It seems like a good idea if you’re trying to impose standards to control Copilot. Unhappily, attempts at running the report turned up zero results.
Microsoft announced a new Copilot license check diagnostic for the Exchange Connectivity Analyzer. Sounds good, but the test is very simple, and its results don’t tell you anything more than a few lines of PowerShell can deliver. To prove the point, we wrote a quick script to show how to perform a Copilot license check with the Microsoft Graph PowerShell SDK.
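The check itself is just a scan of a user’s license details for a provisioned Copilot service plan (in PowerShell, `Get-MgUserLicenseDetail` returns this data). Here’s the core logic as a Python sketch over sample data shaped like the Graph `licenseDetails` resource; the plan name shown is illustrative:

```python
# Sample of what the licenseDetails Graph API (Get-MgUserLicenseDetail in the
# Microsoft Graph PowerShell SDK) returns; plan names here are illustrative.
user_license_details = [
    {"skuPartNumber": "Microsoft_365_Copilot",
     "servicePlans": [
         {"servicePlanName": "M365_COPILOT_TEAMS",
          "provisioningStatus": "Success"},
     ]},
    {"skuPartNumber": "ENTERPRISEPACK", "servicePlans": []},
]

def has_copilot(details) -> bool:
    """True if any assigned SKU carries a provisioned Copilot service plan."""
    return any("COPILOT" in p["servicePlanName"].upper()
               and p["provisioningStatus"] == "Success"
               for d in details for p in d["servicePlans"])

print(has_copilot(user_license_details))  # True
```

Which is the point of the article: a few lines like these tell you as much as the diagnostic does.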
Copilot memory is a term that refers to different things, including Copilot communication memory, a method to use the Graph to personalize responses for users. The idea is to use all the sources of information available through the Graph as Copilot responds to user prompts in Microsoft 365 apps instead of limiting sources to whatever the app works with. It’s a good idea, providing the Graph sources are accurate.
In late August, Microsoft plans to release the Copilot summarize email thread feature in Outlook clients without the need for a Microsoft 365 Copilot license. This news might seem surprising, but it’s simply a matter of business. If Microsoft doesn’t make basic AI features available in Outlook, ISVs (including OpenAI) will fill the gaps with add-ons. And that might make it harder to sell Microsoft 365 Copilot licenses.
After a report to the MSRC about some missing file data from Copilot audit records, Microsoft fixed the problem and audit records now contain details about the SharePoint Online files reviewed by Copilot to construct answers to user prompts. Having solid audit and compliance data is a good thing, unless you’re a lawyer charged with defending an eDiscovery action who might be asked to produce the files.
A July 14 post announces Copilot Memory, a method to personalize how Copilot responds to user prompts. Controls are available to disable Copilot memory on a per-user and tenant basis. Manipulation of the tenant controls is done through Graph resources. This article explains how Copilot memory works and how to update the tenant controls with PowerShell.
Security researchers documented a prompt injection vulnerability in an agent created with Copilot Studio that allowed the exfiltration of customer data. Microsoft has fixed the problem, but the researchers figure that natural language prompts and the way that AI responds means that other ways will be found to cause agents to do silly things. Microsoft 365 tenants need to think about the deployment and management of agents.
Microsoft 365 Copilot Search is the second iteration of Copilot Search. It borrows heavily from the older Microsoft Search in Bing feature in terms of how it presents different types of results. Copilot Search is unmatched when it comes to searching Exchange, SharePoint, and Teams, but its ability to search the web is hindered by the dependency on Bing and the preference given to Microsoft.com sources.
Microsoft 365 Copilot users can generate audio overviews from Word and PDF files and Teams meeting recordings stored in OneDrive for Business. Copilot creates a transcript from the file and uses the Azure Audio Stack to generate an audio stream (that can be saved to an MP3 file). Sounds good, and the feature works well. At least, until it meets the DLP policy for Microsoft 365 Copilot.
Among the blizzard of Copilot changes is one where Outlook can summarize attachments. That sounds small, but the feature is pretty useful if you receive lots of messages with “classic” (file) attachments. Being able to see a quick summary of long documents is a real time saver, and it’s an example of a small change that helps users exploit AI. Naturally, it doesn’t work with Outlook classic.
Copilot Studio Agents can use files as knowledge sources to reason over when they respond to user prompts. We explain how to use the monthly PDFs issued for the Office 365 for IT Pros and Automating Microsoft 365 with PowerShell eBooks as knowledge sources. If you’ve got Microsoft 365 Copilot licenses, this is an interesting way to interact with the books.
Microsoft will launch the aiInteractionHistory Graph API (aka the Copilot Interaction Export API) in June. The API enables third-party access to Copilot data for analysis and investigative purposes, but any ISV who wants to use the API needs to do some work to interpret the records returned by the API to determine what Copilot really did in its interactions with users.
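Part of that interpretation work is stitching the raw records back into conversations. The field names below follow the published aiInteractionHistory schema as I understand it, but the record values are invented for illustration:

```python
from itertools import groupby

# Records shaped like aiInteractionHistory output; field names follow the
# published schema, values invented for illustration.
records = [
    {"sessionId": "s1", "interactionType": "userPrompt",
     "createdDateTime": "2025-06-02T09:00:00Z",
     "body": {"content": "Summarize Q1 sales"}},
    {"sessionId": "s1", "interactionType": "aiResponse",
     "createdDateTime": "2025-06-02T09:00:05Z",
     "body": {"content": "Q1 sales rose 8%..."}},
]

def pair_interactions(recs):
    """Group records by session and pair each prompt with the response after it."""
    pairs = []
    recs = sorted(recs, key=lambda r: (r["sessionId"], r["createdDateTime"]))
    for _, session in groupby(recs, key=lambda r: r["sessionId"]):
        prompt = None
        for r in session:
            if r["interactionType"] == "userPrompt":
                prompt = r["body"]["content"]
            elif prompt is not None:
                pairs.append((prompt, r["body"]["content"]))
                prompt = None
    return pairs
```

Even with the pairing done, an ISV still has to decide what a given prompt/response pair actually tells them about user behavior.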
A set of 80 mysterious SharePoint Embedded containers turned up because Microsoft pre-provisioned storage for files used as knowledge sources by Copilot agents. Details of the pre-provisioning are in message center notification MC1058260, but who has the time to read and analyze everything posted to the message center? And anyway, the mysterious containers have now disappeared…
Some sites picked up the Microsoft 365 Copilot penetration test that allegedly proved how Copilot can extract sensitive data from SharePoint Online. When you look at the test, it depends on three major assumptions: a compromised tenant, poor tenant management, and a failure to deploy available tools. Other issues, like users uploading SharePoint and OneDrive files to process on ChatGPT, are more of a priority for tenant administrators.
Two new service plans have been added to the Microsoft 365 Copilot license to give users access to Viva Insights. The new service plans enable the Copilot dashboard in Viva Insights. It’s nice to get new functionality, but sometimes you don’t want people to use a feature, which brings up the topic of disabling a Copilot service plan using GUIs or a PowerShell script.
An article by a company specializing in penetration tests raised some questions about how attackers might use Copilot for Microsoft 365 to retrieve data. The article is an interesting read and reveals how Copilot can reveal data in password-protected Excel worksheets. However, many of the issues raised can be mitigated by applying available controls, and the biggest worry is how the account being used to run Copilot came to be compromised!
Copilot usage data can be pretty sparse, but it’s easy to enhance the data to gain extra insight into how Microsoft 365 Copilot is used within a tenant. In this case, an administrator wanted to have department and job title information available for each Copilot license holder, so we combined the Copilot usage data with details of Entra ID user accounts with Copilot licenses to create the desired report.
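The enrichment is a straightforward join on user principal name. The article does it with PowerShell; here’s the same join expressed as a Python sketch, with sample rows standing in for the usage report and the Entra ID user properties:

```python
# Sample data: usage report rows plus Entra ID user properties
# (department and jobTitle come from the users Graph API / Get-MgUser).
usage = [{"userPrincipalName": "kim@contoso.com",
          "lastActivityDate": "2025-11-20"}]
users = [{"userPrincipalName": "kim@contoso.com",
          "department": "Finance", "jobTitle": "Analyst"}]

def enrich(usage_rows, user_rows):
    """Join usage rows to user properties on userPrincipalName."""
    lookup = {u["userPrincipalName"]: u for u in user_rows}
    out = []
    for row in usage_rows:
        u = lookup.get(row["userPrincipalName"], {})
        out.append({**row,
                    "department": u.get("department", "Unknown"),
                    "jobTitle": u.get("jobTitle", "Unknown")})
    return out
```

Defaulting missing properties to “Unknown” keeps the report honest when an account lacks a department or job title, which happens more often than it should.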
At Ignite 2024, Microsoft said that Copilot for Microsoft 365 tenants would benefit from SharePoint Advanced Management (SAM). What does that mean? Well, it doesn’t mean that Copilot tenants get SAM licenses, which is what many expected. It does mean that SAM checks for Copilot before it lets tenants use some, but not all, of its features. Read on…
First introduced in March 2025 to block BizChat access to sensitive documents, the DLP policy for Copilot has been extended by Microsoft to cover the web and desktop versions of the Office apps (Word, Excel, and PowerPoint). The implementation works but could confuse users. It might be better if Microsoft simply removed all traces of Copilot when working with files subject to the DLP policy.
Microsoft 365 Copilot will soon introduce a feature to fix spelling and grammar errors with one click. At least, that’s the promise when Microsoft delivers the new feature in late April 2025. It seems like a good idea to do everything with a single pass to generate error-free text that the user can accept or reject. Quite how well this works in practice remains to be seen.
Restricted Content Discovery (RCD) is a solution to prevent AI tools like Microsoft 365 Copilot and agents from accessing files stored in specific sites. RCD works by setting a flag in the index to stop Copilot attempting to use files. RCD is available to all tenants with Microsoft 365 Copilot, and it’s an excellent method to stop Copilot finding and reusing confidential or sensitive information.
Microsoft has given the Copilot for Outlook UI a revamp to make the UI easier to use. The new UI is certainly better and reveals the option to rewrite as a poem. Not that sending poetic emails will make much difference to anyone, but the revamp proves once again that good design makes a difference. Overall, the new UI is a sign that Copilot is maturing after its hectic start.
The DLP policy for Microsoft 365 Copilot blocks access to sensitive files by checking for the presence of a sensitivity label. If a predesignated label is found on a file, Copilot Chat is blocked from using the file content in its responses. The nicest thing is that the DLP policy also stops users from discovering sensitive information by searching a file’s metadata.
Some people get great results from AI tools like Microsoft 365 Copilot. Others struggle to make Copilot useful. As an article by a Microsoft product manager points out, the reason might be the way we use Copilot. If you don’t give Copilot the right data to work with and don’t ask the right questions through well-structured prompts, there’s no prospect of good answers.
The Microsoft 365 Copilot Chat app is the free to use chat app available to commercial Microsoft 365 customers. The free chat app now supports Copilot agents, including agents that are grounded against Graph data (on a pay-as-you-go metered basis). The free chat app is highly functional, and Microsoft hopes that it will convince customers to buy the full-fledged Copilot.
Microsoft loves branding exercises. At least, that can be the only reason why the Microsoft 365 Copilot rename is happening. I can think of no other reason why Microsoft would seek to confuse its customers by applying the Microsoft 365 moniker to an app that can’t access Microsoft 365 data, unless of course people pay to use Copilot agents. It’s all very confusing.