Generative AI solutions and platforms are taking over the business world, offering huge opportunities. However, the security risks involved are significant, and without careful consideration you can easily expose sensitive or competitive information to these models. As a Microsoft Copilot consultancy partner, this is one of the questions our clients ask us most often.

In this article we explain the risks and how to navigate them for Microsoft Copilot.

The risks of data leaks in Microsoft Copilot

Unless explicitly stated otherwise, inputs into Copilot may be used to train the model, which means that in theory the information could later be generated as part of an answer for another user. In other words, you can accidentally leak anything from confidential information and company IP to customer data simply by adding it to Copilot.

Level 1 — Public Free Copilot

This is the most basic level of Copilot where you access it via the browser and it’s not connected to your work account.

A general-purpose AI assistant with no connection to your organisation’s systems. It is not appropriate for any business or sensitive use. There are no enterprise data protections, data may be used to improve Microsoft’s AI models, and there are no compliance commitments. Staff should be advised never to use this for work-related tasks.

Level 2 — Copilot Chat (Free, included with M365 commercial subscriptions)

The next level of Copilot is the Copilot Chat offering that is included with Microsoft 365 subscriptions. It is usually accessed while signed in with your M365 account and is available across multiple Office 365 solutions, but it is still limited to a chat interface.

This is a meaningful step up from consumer Copilot. When a user is signed in with their organisational Entra ID account, Enterprise Data Protection (EDP) applies automatically. Prompts and responses are not used to train AI models, data stays within the M365 service boundary, and Microsoft’s GDPR and compliance commitments apply.

Critically, this does not access your organisation’s SharePoint, emails, or files automatically. It is grounded in web data only, unless a user manually uploads or pastes content into the chat.

Security implication: No organisational data is accessed or exposed beyond what the user deliberately provides. However, when users copy-paste content into the chat, they are creating an ungoverned copy of that data: the text loses its sensitivity labels and permission context, even though it remains within the M365 boundary. No user in the organisation can access another user’s chat session. The risk is data extraction from its governed container, not unauthorised access by others.

Level 3 — Microsoft 365 Copilot (Licensed add-on)

This is the fully licensed, integrated solution, where Copilot is embedded across Office 365, including Word, Teams, PowerPoint, and SharePoint.

This is the most capable and, perhaps counterintuitively, the most governable experience. Copilot accesses your organisational data through Microsoft Graph but does so entirely within your existing security framework.

Critically:

  • Copilot only surfaces content a user already has permission to access. It respects all existing SharePoint permissions, access controls, and sensitivity labels
  • No user can access data through Copilot that they couldn’t already access directly. There is no privilege escalation whatsoever
  • Sensitivity labels on source documents are inherited by Copilot-generated responses
  • All interactions are audit-logged for compliance and eDiscovery
  • Data never moves. Copilot queries content in place and returns a response; it does not copy or relocate your files

Security implication: Your existing access controls are the security boundary, and Copilot operates entirely within them. The main governance consideration is ensuring SharePoint permissions are correctly configured beforehand; if sites are overshared, Copilot will surface content users technically have access to. This isn’t a Copilot risk; it’s a permissions hygiene issue that Copilot makes visible.
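The permission-trimming behaviour described above can be illustrated with a small sketch. This is not Microsoft’s implementation; it is a simplified model (the `docs` structure and `copilot_visible_docs` function are hypothetical) showing why Copilot cannot escalate privileges, yet will surface anything a user technically has access to, including overshared content:

```python
# Illustrative model of permission-trimmed retrieval (hypothetical names,
# not Microsoft's implementation): Copilot can only surface documents the
# querying user can already open directly.

def copilot_visible_docs(user, docs):
    """Return the documents Copilot could surface for this user:
    exactly those the user already has permission to read."""
    return [d["name"] for d in docs if user in d["allowed_users"]]

docs = [
    {"name": "Public handbook", "allowed_users": {"alice", "bob"}},
    {"name": "Board pack",      "allowed_users": {"alice"}},
    # A site shared too broadly: technically readable by all staff, so
    # Copilot will surface it — a permissions hygiene issue, not a leak.
    {"name": "Overshared salaries sheet", "allowed_users": {"alice", "bob"}},
]

print(copilot_visible_docs("bob", docs))
# Bob sees the handbook and the overshared sheet, but never the board pack.
```

The point of the sketch: fixing the `allowed_users` sets (your SharePoint permissions) is the governance work; Copilot itself adds no new access paths.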

Copilot Data Security Layers

|                                   | Consumer Copilot | Copilot Chat (Free) | M365 Copilot (Licensed)        |
|-----------------------------------|------------------|---------------------|--------------------------------|
| Requires work account sign-in     | No               | Yes                 | Yes                            |
| Enterprise Data Protection        | No               | Yes                 | Yes                            |
| Data used for model training      | Potentially      | No                  | No                             |
| Accesses org data automatically   | No               | No                  | Yes, within permissions only   |
| Respects existing access controls | N/A              | N/A                 | Yes, always                    |
| Risk of privilege escalation      | N/A              | None                | None                           |
| Suitable for business use         | No               | With policy         | Yes, with governance in place  |

Copilot Readiness Services

So, the free versions of Copilot are great tools for asking questions and generating content and answers, but they should be treated with extreme caution to ensure data security. The licensed version offers the best governance and security. To build a secure Copilot environment in your business, access our Copilot consultancy services or Copilot readiness assessments for more information.