There has been public outcry over blatant data breaches and misinformation linked to the use of AI. However, the productivity gains unlocked by AI tools such as Copilot make its adoption all but inevitable. This blog dives into key considerations to prepare your organization for Copilot before deployment.
Introducing Microsoft 365 Copilot
The introduction of Copilot for Microsoft 365 is set to transform how we work. It promises to boost productivity and efficiency within M365 applications such as Word, PowerPoint, Teams, and Outlook by leveraging your company’s information.
Although the temptation to deploy Copilot immediately is understandable, it is crucial to first evaluate your organization's readiness and work through the preparatory steps, approaching the deployment with a plan that minimizes the risk of data leakage.
Beyond hard technical requirements and strategic and operational considerations, such as building a center of excellence or developing an adoption strategy, a critical step in preparing for Copilot is data readiness. The first step is to determine which data Copilot will have access to (and the associated risks), based on the permissions in your M365 tenant. Then, outdated and irrelevant content must be purged; this reduces security risks and improves the quality of Copilot's output. Finally, and arguably most importantly, robust access management and data governance practices are indispensable. Before rolling out Copilot, it is essential that you have a clear understanding of your data security posture.
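The stale-content step above can be sketched as a simple inventory filter. Everything here is a hypothetical illustration: the file records and the two-year cutoff are assumptions, and a real review would pull metadata from your M365 tenant rather than a hard-coded list.

```python
# Hypothetical sketch: identifying stale content to review or purge
# before a Copilot rollout. The inventory format and the two-year
# cutoff are illustrative assumptions, not a real M365 export.
from datetime import date, timedelta

def find_stale(files, today, max_age=timedelta(days=2 * 365)):
    """Return files not modified within max_age of today."""
    return [f for f in files if today - f["modified"] > max_age]

# Example inventory: each record carries a path and last-modified date.
files = [
    {"path": "policies/travel-2019.docx", "modified": date(2019, 3, 1)},
    {"path": "reports/q4-results.xlsx",   "modified": date(2024, 1, 15)},
]

for f in find_stale(files, today=date(2024, 6, 1)):
    print("Review before rollout:", f["path"])
```

In practice the cutoff would follow your retention policy rather than a fixed two years.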
Copilot’s Data Governance and Protection Features and the Challenge of Permissions
Two reassuring things to note concerning Copilot’s data governance and protection are that your data will always remain within your tenant and the large language model (LLM) used by Copilot won’t be trained on your enterprise data. Also, Copilot's access to sensitive data is contingent on existing permissions and policies. This means Copilot can only access data for which the user has at least view permissions. While generally beneficial, this can become a liability if permissions are too lax. Copilot could surface potentially confidential documents that users have access to but were previously unaware of, such as documents stored in overly permissive OneDrive folders.
Overly broad permissions are a widespread problem: according to a 2022 Varonis report, 10% of a company's cloud data is exposed to every single employee, and Microsoft's 2023 State of Cloud Permissions report found that less than 1% of permissions granted to identities are actually used.
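To make the exposure problem concrete, here is a minimal sketch of the kind of analysis behind such figures, assuming a hypothetical inventory where each document lists the groups with view access. The group names and records are illustrative, not a real M365 export.

```python
# Hypothetical sketch: estimating what share of an inventory is exposed
# to a company-wide group. The data structures and group names
# ("Everyone", "All Employees") are illustrative assumptions.

def exposed_share(documents, broad_groups=frozenset({"Everyone", "All Employees"})):
    """Return the fraction of documents readable by a company-wide group."""
    if not documents:
        return 0.0
    exposed = sum(
        1 for doc in documents
        if broad_groups & set(doc["readers"])
    )
    return exposed / len(documents)

# Example inventory: each entry lists the groups/users with view access.
inventory = [
    {"name": "budget.xlsx",   "readers": ["Finance"]},
    {"name": "handbook.docx", "readers": ["Everyone"]},
    {"name": "salaries.xlsx", "readers": ["HR", "Everyone"]},
    {"name": "roadmap.pptx",  "readers": ["Product"]},
]

print(f"{exposed_share(inventory):.0%} of documents are exposed to everyone")
```

Running the same pass against real permission exports would give you your own version of the Varonis figure before Copilot ever sees the data.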
Key Considerations Before Deployment
To ensure robust access management and a secure data environment, here are a few key considerations and steps to take into account before deploying Copilot:
- Data identification, classification, and governance: it is key to identify, understand, and classify your data. Proper data classification is the cornerstone of sound governance, ensuring that Copilot processes data in compliance with your organizational policies and legal requirements.
- Data labeling: apply appropriate sensitivity labels to your classified data so that you can develop data loss prevention policies against oversharing. While content generated by Copilot usually inherits the highest-priority label from its source files automatically, you should familiarize yourself with the exceptions. Appropriate labeling will guide Copilot's data interactions and align them with your company's policies.
- Access management: make sure to establish, review, and maintain robust permission policies. Navigating content permissions can be complex as there are many ways to give a user access to data. To move towards the optimal state of least privilege, you need to conduct regular access reviews and tighten permissions on overexposed assets. Effective access management is critical to prevent data leakage, as Copilot can access all sensitive data that a user has permission to view.
- Geographical restrictions: be aware of and adhere to any regulatory geographical restrictions that apply to your data. If you are already using Microsoft 365, no changes should be necessary, as Copilot data processing stays within your existing data boundary. Still, maintaining your data boundaries, such as the EU Data Boundary, is central to ensuring regulatory compliance.
- Regulations: new regulations, such as the EU Artificial Intelligence Act, are moving fast. Even though Microsoft states that it will continue to adjust its products in accordance with new regulations, keep track of which regulations your company must follow and of any compliance gaps in Copilot.
- AI impact assessment (risk and privacy assessment): familiarize yourself with the potential risks that could arise from using your data with Copilot. Conducting a comprehensive risk assessment, and potentially a Privacy Impact Assessment, is therefore a key step to ensure the deployment aligns with your organization's risk appetite.
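The access-management and labeling points above can be illustrated with a small sketch that flags overexposed assets and orders them by sensitivity label. The label names, priority values, audience-size threshold, and asset records are all assumptions for illustration; in practice this data would come from your tenant's permission reports.

```python
# Hypothetical sketch of an access-review pass: flag assets whose
# audience size exceeds a threshold, then order them so the most
# sensitive labels surface first. Labels, priorities, and records
# are illustrative assumptions, not a real M365 schema.

LABEL_PRIORITY = {"Public": 0, "Internal": 1, "Confidential": 2, "Secret": 3}

def flag_overexposed(assets, max_audience=50):
    """Return assets readable by more than max_audience users, most sensitive first."""
    flagged = [a for a in assets if a["audience_size"] > max_audience]
    return sorted(flagged,
                  key=lambda a: LABEL_PRIORITY[a["label"]],
                  reverse=True)

assets = [
    {"name": "press-release.docx", "label": "Public",       "audience_size": 5000},
    {"name": "m&a-notes.docx",     "label": "Secret",       "audience_size": 120},
    {"name": "org-chart.pptx",     "label": "Internal",     "audience_size": 30},
    {"name": "client-list.xlsx",   "label": "Confidential", "audience_size": 800},
]

for asset in flag_overexposed(assets):
    print(f"Review access to {asset['name']} ({asset['label']})")
```

Sorting by label priority mirrors the idea that remediation should start with the most sensitive, most widely exposed content.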
Conclusion and Discussion
In conclusion, deploying Microsoft Copilot within M365 apps can significantly advance AI-driven workplace efficiency. However, it requires a methodical approach centered on robust data classification and stringent access management, in accordance with your company's policies and regulatory obligations. Although complex, these steps are not only crucial for the seamless integration of Copilot but also enhance your overall security posture.
What are your biggest concerns about the use of Copilot for M365 and where do you see the biggest gaps you need to address first? Wherever you are in your AI journey, KPMG can help you with the next step.