Office 365 for IT Pros

Why Copilot Access to “Restricted” Passwords Isn’t as Big an Issue as Uploading Files to ChatGPT

Microsoft 365 Copilot and penetration tests

Unless You Consider Excel Passwords to be Real Passwords

I see that some web sites have picked up the penetration test story about using Microsoft 365 Copilot to extract sensitive information from SharePoint. The May 14 Forbes.com story is an example. The headline of “New Warning — Microsoft Copilot AI Can Access Restricted Passwords” is highly misleading.

Unfortunately, tech journalists and others can rush to comment without thinking an issue through, and that’s what I fear has happened in many of the remarks I see in places like LinkedIn discussions. People assume that a much greater problem exists when, if they only thought things through, they’d see the holes in the case being presented.

Understanding the Assumptions made by the Penetration Test

As I pointed out in a May 12 article, the penetration test was interesting (and it did demonstrate just how weak Excel passwords are). However, the story depends on three major assumptions:

- An attacker can compromise one or more user accounts in the target tenant.
- The compromised accounts have access to badly managed SharePoint Online sites that hold sensitive information.
- The compromised accounts are licensed for Microsoft 365 Copilot, which the attacker can then use to find that information.

I’m sure any attacker would love to find an easily-compromised tenant where they can gain control over accounts that have access to both badly managed SharePoint Online sites that hold sensitive information and Microsoft 365 Copilot to help the attackers find that information. Badly-managed and easily-compromised Microsoft 365 tenants do exist, but it is my earnest hope that companies who invest in Microsoft 365 Copilot have the common sense to manage their tenants properly.

Uploading SharePoint and OneDrive Files to ChatGPT

Personally speaking, I’m much more concerned about users uploading sensitive or confidential information to OpenAI for ChatGPT to process. The latest guidance from OpenAI explains how the process works for their Deep Research product. Users might like this feature because they can have their documents processed by AI. However, tenant administrators and anyone concerned with security or compliance might have a different perspective.

I covered the topic of uploading SharePoint and OneDrive files to ChatGPT on March 26 and explained that the process depends on an enterprise Entra ID app (with app id e0476654-c1d5-430b-ab80-70cbd947616a) to gain access to user files. Deep Research is different and its connector for SharePoint and OneDrive is in preview, but the basic principle is the same: a Graph-based app uploads files for ChatGPT to process. If that app is blocked (see my article to find out how) or denied access to the Graph permission needed to access files, the upload process doesn’t work.
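For anyone who wants to see what blocking the app involves, here’s a sketch using the Microsoft Graph PowerShell SDK. The cmdlets are real, but treat this as an outline to adapt to your tenant rather than a tested script (see my March 26 article for the full discussion):

```powershell
# Connect with a scope that allows management of service principals
Connect-MgGraph -Scopes "Application.ReadWrite.All"

# The enterprise app ChatGPT uses to access SharePoint and OneDrive files
$AppId = "e0476654-c1d5-430b-ab80-70cbd947616a"
$SP = Get-MgServicePrincipal -Filter "appId eq '$AppId'"

if ($SP) {
    # Disabling the service principal stops the app from signing in,
    # which breaks the upload process for every user in the tenant
    Update-MgServicePrincipal -ServicePrincipalId $SP.Id -AccountEnabled:$false
    Write-Host ("Blocked service principal {0} ({1})" -f $SP.DisplayName, $SP.Id)
} else {
    Write-Host "Service principal not found. No one has consented to the app in this tenant yet."
}
```

If the service principal isn’t present in the tenant, no user has consented to the app yet, so there is nothing to disable; the alternative approach of removing or withholding the Graph permission the app needs to read files achieves the same end.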

Set Your Priorities

I suggest that it’s more important to block uploading of files from a tenant to a third-party AI service where you don’t know how the files are managed or retained. It certainly seems like a more pressing need than worrying about the potential of an attacker using Microsoft 365 Copilot to run riot over SharePoint, even if a penetration test company says that this can happen (purely as a public service, and not at all to publicize their company).

At least, that’s assuming user accounts are protected with strong multifactor authentication…

