A new action for the DLP Policy for Copilot allows Microsoft 365 tenants to block Copilot from performing (Bing) web searches if a prompt contains sensitive information types. The new action lets Copilot continue to process prompts using Microsoft 365 content (if the user has a Microsoft 365 Copilot license) while stopping potentially sensitive data from being sent to Bing.
The Microsoft 365 admin center includes an option for administrators to send Copilot diagnostic logs on behalf of users to Microsoft for investigation. Sounds good, but the diagnostic logs are in plain text (JSON format) and the prompts and responses for Copilot user interactions can be viewed by administrators. That doesn’t seem like a good way to preserve anyone’s privacy. Vote for the feedback item to close this loophole.
Microsoft plans to introduce flex routing to handle situations when demand exceeds capacity for Copilot processing in Europe. This could be an issue for Microsoft 365 tenants in the European Union or European Free Trade Association who want to be sure that their data is processed in Microsoft local datacenters and not sent to the U.S. or Australia when available capacity cannot meet the demand for “large language model inferencing”.
The Planner agent is now available to users with Microsoft 365 Copilot licenses and any Planner license. The purpose of the Planner agent is to help people understand the steps necessary to accomplish assigned tasks. Like any AI tool, the quality of the agent’s output depends on the precision and completeness of the instructions given for a task. But if you’re prepared to iterate, the agent might work for you.
Microsoft 365 E7 bundles Microsoft 365 E5, Microsoft 365 Copilot, the Entra Suite, and the new Agent 365 into a $99/user/month SKU. The big question is whether investing in Microsoft 365 E7 licenses makes sense for tenants. Buying a big batch of licenses simply throws money away unless those licenses can be used. Paul Robichaux debates the issues and offers some advice about how to assess the need for E7.
In December 2024, Microsoft introduced a control to block responses to sentiment-related prompts in Teams meeting chat. Now that block extends to every Teams user following a consolidation of the Teams meeting implementation of chat with Microsoft 365 Copilot. Basically, the block stops Copilot from responding to prompts that look for opinions about the emotions, judgements, or conduct of other meeting participants. It’s a good thing.
Microsoft celebrated the 25th anniversary of SharePoint with a batch of announcements, including AI in SharePoint, intended to help administrators manage all aspects of SharePoint Online through natural language. Other interesting announcements included department-level payments for Microsoft 365 Backup and the renaming of the Connections app in Teams to the SharePoint app. Well, the last wasn’t that interesting…
Microsoft would very much like Microsoft 365 tenants to use Copilot instead of ChatGPT. A recent comparison between Copilot and ChatGPT outlines some areas that Microsoft thinks are important when deciding which AI tool to use. Microsoft has a point because Copilot is embedded into Microsoft 365 whereas ChatGPT is more of an add-on. The competition for hearts and minds is very intense in the AI space.
A code error allowed Copilot Chat to expose confidential email. Microsoft is fixing the problem, but it’s a reminder of how AI can expose information if Microsoft 365 tenants don’t use the available features to restrict AI access. Those features need to be configured and deployed, but that doesn’t take much effort. It’s better than users complaining when Copilot exposes their most secret thoughts.
Microsoft FY26 Q2 results included a new figure for Microsoft 365 commercial paid seats: “over 450 million.” Seats are growing at a consistent 6% year-over-year rate, and the June 2026 increases will mean an extra $10 billion or so in revenue. In other news, we learned that Microsoft 365 Copilot has 15 million paid seats, or roughly 3.33% of the Microsoft 365 installed base.
Restricted Content Discovery (RCD) is a feature that blocks access by Microsoft 365 Copilot and agents to the files stored in a SharePoint Online site. Instead of relying on tenant administrators, site administrators can now enable or disable RCD. It’s a natural evolution of what is an essential feature to keep sensitive and confidential information from being leaked inadvertently by AI.
Chat and meetings have their agents, and now the Teams channel agent is available to help members understand what happens inside channels. Like any AI agent given limited sets of data to reason over, the channel agent does a good job of finding nuggets hidden in conversations. The issue is that the channel agent doesn’t currently work for channels that have external members, like guest accounts. That’s a big downside.
A LinkedIn post explained how the UK Revenue and Customs authority trained 30,000 people to use Microsoft 365 Copilot effectively. It’s a reminder that introducing complex software to a user community takes careful planning and support, including the provision of well-planned training to help people exploit the new software as quickly as possible. Otherwise, some of those expensive licenses might be wasted.
After the fuss around the initial introduction of the Anthropic models into Microsoft 365 in September, we learn that Microsoft will enable access for all in January 2026. It would have been so much better had Microsoft said that they were working on the data protection arrangements with Anthropic, but that didn’t happen. Is all well now? We’ll see in January…
A new DLP policy for Copilot prompts monitors blocked sensitive information types like credit card numbers to stop their use in Copilot prompts. The new policy can’t be combined with the existing DLP policy for Copilot, which checks for files with specific sensitivity labels to prevent Copilot from using their content in its responses. But that’s OK because the two policies do very different work.
The Ignite 2025 keynote was a marathon 150-minute event, but some interesting Microsoft 365 announcements emerged, mostly centered on AI. Microsoft is obviously focused on making AI and agents a very real part of tenant activities, so there’s new agent management and a repository among other things that will roll out in the year ahead.
A temporary chat with Microsoft 365 Copilot is one that forgets everything discussed in the conversation once the chat is over. The idea is that by leaving no trace, Copilot won’t recycle the ideas discussed in the chat later. Copilot absolutely discards the chat thread, but those pesky compliance records remain behind, ready for eDiscovery and other compliance investigations.
The question was asked whether it’s possible to identify use of the Claude LLM by the Copilot Researcher agent. Audit records often help, so that’s the natural location to check. As it turns out, some information is captured when the Researcher agent is used, but figuring out if the agent uses the default GPT-5 or Claude LLMs is a matter of intuition (or guesswork).
The site attestation policy is designed to require site owners to make a positive statement that the settings of their site, including its current membership, are accurate. The idea is that requiring site owners to attest that their site is still needed will force people to decide whether sites are still in active use and should be kept online. If not, the policy can move the sites into Microsoft 365 Archive.
OpenAI has launched a ChatGPT enterprise SharePoint Connector that allows organizations to synchronize files from SharePoint Online to ChatGPT. I could never understand why Microsoft 365 tenants allowed users to upload individual files from SharePoint or OneDrive to ChatGPT for processing. Using a connector to synchronize entire sites to ChatGPT makes even less sense, especially from a compliance perspective. I must be missing something!
The Copilot usage report Graph API is now generally available. Like the report APIs for the other workloads, the Copilot usage API helps to understand usage of some very expensive licenses. Even better, the usage data can be combined with data from other Microsoft 365 sources to produce interesting and valuable insights. All it takes is some PowerShell to knit everything together.
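To give a sense of the knitting-together involved, here’s a minimal Python sketch that parses the kind of CSV payload the Graph report APIs return and extracts the users with recorded Copilot activity. The column names and sample rows are illustrative assumptions for this sketch, not the API’s guaranteed schema:

```python
import csv
import io

# Illustrative sample of a usage report CSV payload. The column
# names are assumptions based on how other Graph report APIs look.
SAMPLE_REPORT = """User Principal Name,Last Activity Date,Copilot Chat Last Activity Date
ava@contoso.com,2026-01-28,2026-01-27
ben@contoso.com,,
cho@contoso.com,2026-01-15,2026-01-15
"""

def active_copilot_users(report_csv: str) -> list[str]:
    """Return the UPNs of users with any recorded Copilot activity."""
    reader = csv.DictReader(io.StringIO(report_csv))
    return [row["User Principal Name"] for row in reader
            if row["Last Activity Date"]]

print(active_copilot_users(SAMPLE_REPORT))
# → ['ava@contoso.com', 'cho@contoso.com']
```

From here, the same parsing approach works whether the glue script is Python or PowerShell; the report is just rows keyed by user principal name.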
On September 24, Microsoft announced that Anthropic LLMs could be used with the Copilot Researcher agent and to build agents with Copilot Studio. Although it’s great to give customers a choice of the AI model they prefer, questions about data security, lack of support for compliance solutions, and adherence to standards like the EU data boundary will concern Microsoft 365 tenants.
With not a little hype, Microsoft launched the SharePoint Knowledge Agent on September 18. Getting some AI help to organize sites sounds good, but only if the assistance delivered by the artificial intelligence does something useful. In this case, the agent generated some moderately interesting results without ever reaching the level of AI magic anticipated (and reported) by some.
The rollout of the Copilot Chat integration with the Microsoft 365 apps has started, with the intention of making it easier to use AI in people’s work. Nice as the integration is, the news that an Open in Word action button is coming (soon) to allow content generated by Copilot to be edited in Word is even better. And we round out the week with a note about a change to the domain used by Teams.
Microsoft 365 Copilot now has some SharePoint skills to deploy in the SharePoint admin center. The problem is that the skills aren’t very good and don’t do much to help hard-pressed SharePoint Online administrators cope with the vast explosion of sites that exist in many tenants today. The problem is data. If Copilot doesn’t have the information to reason over, it can’t answer questions or give advice.
Microsoft plans to deploy an update to change how transcription behaves for Teams meetings where Copilot is enabled. New meetings will not generate a transcript unless the meeting organizer explicitly enables transcription or the Microsoft 365 tenant deploys custom meeting policies that enable transcription with Copilot. The AI features work even without a transcript. But no transcript means no searchable artifact, and that’s what some want.
A new SharePoint Site content and policy comparison report is available to tenants with Microsoft 365 Copilot or SharePoint advanced management licenses. The idea is that you choose some reference sites to compare other sites against to detect deviations from the reference site. It seems like a good idea if you’re trying to impose standards to control Copilot. Unhappily, attempts at running the report turned up zero results.
Copilot memory is a term that refers to different things, including Copilot communication memory, a method to use the Graph to personalize responses for users. The idea is to use all the sources of information available through the Graph as Copilot responds to user prompts in Microsoft 365 apps instead of limiting sources to whatever the app works with. It’s a good idea, providing the Graph sources are accurate.
In late August, Microsoft plans to release the Copilot summarize email thread feature in Outlook clients without the need for a Microsoft 365 Copilot license. This news might seem surprising, but it’s simply a matter of business. If Microsoft doesn’t make basic AI features available in Outlook, ISVs (including OpenAI) will fill the gaps with add-ons. And that might make it harder to sell Microsoft 365 Copilot licenses.
After a report to the MSRC about some missing file data from Copilot audit records, Microsoft fixed the problem and audit records now contain details about the SharePoint Online files reviewed by Copilot to construct answers to user prompts. Having solid audit and compliance data is a good thing, unless you’re a lawyer charged with defending an eDiscovery action who might be asked to produce the files.
Microsoft 365 Copilot users can generate audio overviews from Word and PDF files and Teams meeting recordings stored in OneDrive for Business. Copilot creates a transcript from the file and uses the Azure Audio Stack to generate an audio stream (that can be saved to an MP3 file). Sounds good, and the feature works well. At least, until it meets the DLP policy for Microsoft 365 Copilot.
Agent governance is the framework that allows tenants to deploy agents safely, securely, and under control. A new ISV offering from Rencore helps to fill some gaps in Copilot agent governance that currently exist in what’s available in Microsoft 365. It’s good to see ISV action in this space because the last thing that anyone wants is the prospect of Copilot agents running amok inside Microsoft 365 tenants.
Among the blizzard of Copilot changes is one where Outlook can summarize attachments. That sounds small, but the feature is pretty useful if you receive lots of messages with “classic” (file) attachments. Being able to see a quick summary of long documents is a real time saver, and it’s an example of a small change that helps users exploit AI. Naturally, it doesn’t work with Outlook classic.
Microsoft will launch the aiInteractionHistory Graph API (aka, the Copilot Interaction Export API) in June. The API enables third-party access to Copilot data for analysis and investigative purposes, but any ISV who wants to use the API needs to do some work to interpret the records returned by the API to determine what Copilot really did in its interactions with users.
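To illustrate the interpretation work an ISV faces, here’s a hedged Python sketch that pairs user prompts with the AI responses in a set of interaction records. The record shape used here (sessionId, interactionType, body) is a simplified assumption for the sketch, not the API’s verbatim schema:

```python
# Simplified records of the kind the Copilot Interaction Export API
# might return; the field names here are assumptions for this sketch.
records = [
    {"sessionId": "s1", "interactionType": "userPrompt",
     "body": "Summarize the Q3 sales deck"},
    {"sessionId": "s1", "interactionType": "aiResponse",
     "body": "The deck covers three regions..."},
    {"sessionId": "s2", "interactionType": "userPrompt",
     "body": "Draft a reply to Marie"},
]

def pair_interactions(records):
    """Group records by session and pair each prompt with its response.
    Assumes one prompt per session; response is None when Copilot's
    answer was not captured."""
    sessions = {}
    for rec in records:
        entry = sessions.setdefault(rec["sessionId"],
                                    {"prompt": None, "response": None})
        key = "prompt" if rec["interactionType"] == "userPrompt" else "response"
        entry[key] = rec["body"]
    return sessions

paired = pair_interactions(records)
print(paired["s2"])
# → {'prompt': 'Draft a reply to Marie', 'response': None}
```

Even this toy version shows the issue: a raw stream of records says little by itself, and the ISV must reassemble sessions before any analysis of what Copilot actually did.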
Some sites picked up the Microsoft 365 Copilot penetration test that allegedly proved how Copilot can extract sensitive data from SharePoint Online. When you look at the test, it depends on three major assumptions: a compromised tenant, poor tenant management, and a failure to deploy available tools. Other issues, like users uploading SharePoint and OneDrive files to process on ChatGPT, are more of a priority for tenant administrators.
Copilot usage data can be pretty sparse, but it’s easy to enhance the data to gain extra insight into how Microsoft 365 Copilot is used within a tenant. In this case, an administrator wanted to have department and job title information available for each Copilot license holder, so we combined the Copilot usage data with details of Entra ID user accounts with Copilot licenses to create the desired report.
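Assuming both datasets can be keyed on user principal name, the enrichment is a simple join. A minimal Python sketch with made-up data (in practice the inputs would come from the Copilot usage report and Entra ID account lookups):

```python
# Hypothetical Copilot usage rows keyed by UPN.
usage = [
    {"upn": "ava@contoso.com", "lastActivity": "2026-01-28"},
    {"upn": "ben@contoso.com", "lastActivity": None},
]

# Hypothetical Entra ID account details for the same license holders.
accounts = {
    "ava@contoso.com": {"department": "Finance", "jobTitle": "Analyst"},
    "ben@contoso.com": {"department": "Legal", "jobTitle": "Counsel"},
}

def enrich_usage(usage, accounts):
    """Attach department and job title to each usage row, leaving
    the fields as None for any UPN without a matching account."""
    blank = {"department": None, "jobTitle": None}
    return [{**row, **accounts.get(row["upn"], blank)} for row in usage]

report = enrich_usage(usage, accounts)
print(report[0]["department"])
# → Finance
```

The join is the easy part; the value comes from choosing which Entra ID attributes (department, job title, office, manager) make the usage report actionable for license reviews.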
At Ignite 2024, Microsoft said that Copilot for Microsoft 365 tenants would benefit from SharePoint Advanced Management (SAM). What does that mean? Well, it doesn’t mean that Copilot tenants get SAM licenses, which is what many expected. It does mean that SAM checks for Copilot before it lets tenants use some, but not all, of its features. Read on…
Microsoft 365 Copilot will soon introduce a feature to fix spelling and grammar errors with one click. At least, that’s the promise when Microsoft delivers the new feature in late April 2025. It seems like a good idea to do everything with a single pass to generate error-free text that the user can accept or reject. Quite how well this works in practice remains to be seen.
Restricted Content Discovery (RCD) is a solution to prevent AI tools like Microsoft 365 Copilot and agents from accessing files stored in specific sites. RCD works by setting a flag in the index to stop Copilot from attempting to use files. RCD is available to all tenants with Microsoft 365 Copilot, and it’s an excellent method to stop Copilot finding and reusing confidential or sensitive information.
Microsoft has given the Copilot for Outlook UI a revamp to make it easier to use. The new UI is certainly better and reveals the option to rewrite as a poem. Not that sending poetic emails will make much difference to anyone, but the revamp proves once again that good design makes a difference. Overall, the new UI is a sign that Copilot is maturing after its hectic start.