Microsoft 365 Copilot - Is your company ready for AI?
- Microsoft 365 Copilot is an AI assistant that is integrated directly into the Microsoft Office applications, SharePoint and Exchange.
- The system supports employees with everyday tasks and thereby increases efficiency across departments.
- The introduction of Copilot has significant implications for companies’ data protection and therefore requires comprehensive coordination and guidance.
Assessment of Copilot Readiness
The licensing of Copilot for Microsoft 365 represents a significant change to a company’s IT security architecture compared with the use of ChatGPT, Gemini or other AI assistants based on large language models (LLMs). Unlike those assistants, Copilot does not access only predefined data: the Microsoft tool also retrieves information from the Internet and, even more importantly, from the company’s own data. Copilot uses data from SharePoint, for example, and can also access emails, chats and documents via Microsoft Graph. As a result, information that was previously visible only locally to individual employees and groups may also surface in Copilot’s responses and generated content.
Oliver Teich (Strategic Consultant)
Check and/or implement authorization models
Microsoft itself advises in the Copilot documentation: “It is important that you use the permission models available in Microsoft 365 services such as SharePoint to ensure that the right users or groups have the right access to the right content in your organization.”
It is not enough to check the permissions of users and groups. Other access paths such as guest access, local SharePoint permissions, share links and external and public access should also be carefully reviewed.
Note: People outside your company can also have access to data via shared Teams channels.
Note: Copilot does not carry over labels assigned via Microsoft Purview Information Protection (MPIP) into its responses. Although the system ensures that only data the respective user is permitted to see is used for AI-generated content, the response itself does not receive an MPIP label.
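Such a review can be partly automated. The sketch below is a minimal illustration: the record layout is an assumption modeled loosely on Microsoft Graph permission objects, not a documented export format, so adapt the field names to whatever your tooling actually produces. It flags anonymous links, organization-wide links and grants to accounts outside the company domain:

```python
# Sketch: flag risky sharing entries in an exported permissions list.
# The record layout ("link", "grantedTo", "email") is an assumption
# modeled loosely on Microsoft Graph permission objects; adapt it to
# your actual export format.

def find_risky_permissions(permissions, internal_domain):
    """Return (id, reason) pairs for entries granting broad or external access."""
    risky = []
    for perm in permissions:
        link = perm.get("link", {})
        # Anonymous ("anyone") links require no sign-in at all.
        if link.get("scope") == "anonymous":
            risky.append((perm["id"], "anonymous link"))
        # Organization-wide links may overshare within the tenant.
        elif link.get("scope") == "organization":
            risky.append((perm["id"], "organization-wide link"))
        # Grants to accounts outside the company domain (guests, externals).
        for grantee in perm.get("grantedTo", []):
            email = grantee.get("email", "")
            if email and not email.endswith("@" + internal_domain):
                risky.append((perm["id"], f"external grantee {email}"))
    return risky

sample = [
    {"id": "1", "link": {"scope": "anonymous"}},
    {"id": "2", "grantedTo": [{"email": "partner@other.com"}]},
    {"id": "3", "grantedTo": [{"email": "alice@contoso.com"}]},
]
print(find_risky_permissions(sample, "contoso.com"))
```

A report like this is only a starting point for the manual review: every flagged entry still needs a human decision on whether the access is justified.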
Overall, a strict need-to-know policy should therefore be implemented in the company. With Copilot, it is more important than ever that employees only have access to the data that is relevant to their respective tasks. It is advisable to implement a zero-trust architecture based on the principle of least privilege or, if that is not feasible, at least a strict review of all access permissions.
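One way to operationalize such a need-to-know review is to compare each employee's actual access against what their role requires and report the surplus. The sketch below uses purely illustrative role and resource names, not any real tenant layout:

```python
# Sketch of a need-to-know review: compare each employee's actual
# access against the resources their role requires and report the
# excess. Roles and resource names are illustrative assumptions.

ROLE_NEEDS = {
    "hr": {"hr-site", "payroll-share"},
    "sales": {"crm-site", "offers-share"},
}

def excess_access(employees):
    """Return a report of users whose access exceeds their role's needs."""
    report = {}
    for user, (role, actual) in employees.items():
        surplus = set(actual) - ROLE_NEEDS.get(role, set())
        if surplus:
            report[user] = sorted(surplus)
    return report

employees = {
    "alice": ("hr", {"hr-site", "payroll-share", "crm-site"}),
    "bob": ("sales", {"crm-site"}),
}
print(excess_access(employees))  # alice still holds crm-site access
```

In a least-privilege model, every entry in such a report is either revoked or explicitly justified and documented.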
Checking the data protection policy
Microsoft claims that both Microsoft 365 and Copilot comply with the General Data Protection Regulation. The company promises on its website: “Microsoft Copilot for Microsoft 365 complies with our existing privacy, security and compliance obligations to Microsoft 365 commercial customers, including the General Data Protection Regulation (GDPR) and the European Union (EU) Data Boundary.”
Oliver Teich (Strategic Consultant)
Evaluation of additional agreements
However, the German Federal and State Data Protection Conference (DSK) and other supervisory bodies, such as ENISA, take the view that the Data Protection Addendum (DPA) offered by Microsoft does not adequately meet the requirements of European data protection law. They recommend that companies conclude an additional data processing agreement with Microsoft, or at least review the existing agreements carefully. The State Commissioner for Data Protection and Freedom of Information of North Rhine-Westphalia describes in a handout which considerations are important here. Essentially, the experts recommend: “A supplementary agreement to the DPA to be concluded between the controller and Microsoft should make it clear that this supplementary agreement takes precedence over all conflicting contractual texts included by Microsoft in the event of a conflict.” This supplementary agreement should regulate the following points, among others:
- Microsoft’s own responsibility in the context of data processing for business activities that are triggered by the provision of products and services to the customer,
- Obligation to follow instructions, disclosure of processed data, fulfillment of legal regulations
- Implementation of technical and organizational measures in accordance with Art. 32 GDPR
- Deletion of personal data and
- Information about sub-processors.
Even if such agreements are already in place or have been evaluated, they should at least be subjected to a new data protection impact assessment as part of the Copilot roll-out.
Data might leave the boundaries of the Microsoft 365 service
In general, Microsoft promises that all data in the Microsoft 365 system will be stored and processed within the EU. In the context of Copilot, however, the company points out two exceptions to this principle:
- A graph-grounded chat, for example, can be linked to web content. In this case, the Bing query required for this can also contain internal company data, which then ends up at Microsoft. To be on the safe side, all Bing functions of Copilot should therefore be deactivated.
- Plugins can also be installed for Copilot. Here, Microsoft explicitly recommends: “Review the privacy policy and terms of use of the plug-in to determine how it handles your organization’s data.” Companies that use Copilot should therefore generally not allow plug-ins in the system or require a separate data protection and risk assessment for each plug-in used.
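A simple governance pattern for this per-plug-in assessment requirement is an allowlist that records whether the data protection and risk assessment has been completed. A minimal sketch, with invented plug-in names and an assumed record format:

```python
# Sketch: gate Copilot plug-ins behind an allowlist that records
# whether a data protection and risk assessment (DPIA) has been
# completed. Plug-in names and the record format are illustrative.

APPROVED = {
    "translator-plugin": {"dpia_done": True},
    "crm-connector": {"dpia_done": False},  # assessment still pending
}

def plugin_allowed(name):
    """Allow a plug-in only if it is listed and its assessment is complete."""
    entry = APPROVED.get(name)
    return bool(entry and entry["dpia_done"])

for name in ["translator-plugin", "crm-connector", "unknown-plugin"]:
    print(name, "->", "allow" if plugin_allowed(name) else "block")
```

The important property is the default: anything not explicitly listed and assessed is blocked, mirroring the "no plug-ins without a separate assessment" policy above.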
Review IT security strategy
In a study on the use of AI language models in companies, the German Federal Office for Information Security (BSI) comes to the conclusion that, in addition to many advantages, these systems can also harbor new IT security risks or increase the threat potential of known IT threats.
The BSI therefore advises: “In response to these potential threats, companies or authorities should carry out a risk analysis for the use of large AI language models in their specific application before integrating them into their work processes. They should also evaluate misuse scenarios to determine whether they pose a threat to their workflows. Based on this, existing security measures can be adapted and, if necessary, new measures can be taken and users can be informed about the potential dangers.”
Before introducing the Copilot system, companies should therefore urgently gain an overview of the current status of their IT security architecture. To this end, not only Microsoft 365, but also all other programs, apps, services and plugins used should be checked. Microsoft itself recommends the introduction of a zero-trust model for Copilot.
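The risk analysis recommended by the BSI can start from a simple likelihood-times-impact matrix over concrete misuse scenarios. The sketch below scores a few example scenarios; the scenarios and scores are illustrative assumptions, not BSI guidance:

```python
# Minimal risk-matrix sketch: score each misuse scenario by
# likelihood x impact (1-5 each) and sort so the highest risks are
# addressed first. Scenarios and scores are illustrative examples.

scenarios = [
    ("prompt injection via shared document", 4, 4),
    ("oversharing through broad SharePoint permissions", 5, 5),
    ("hallucinated facts reused in customer communication", 3, 4),
]

def rank_risks(scenarios):
    """Return (name, score) pairs sorted by descending risk score."""
    scored = [(name, likelihood * impact)
              for name, likelihood, impact in scenarios]
    return sorted(scored, key=lambda item: item[1], reverse=True)

for name, score in rank_risks(scenarios):
    print(f"{score:2d}  {name}")
```

The ranked list then drives the next step the BSI describes: adapting existing security measures and adding new ones where the score is unacceptably high.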
Works council may be required to approve AI deployment
The move into the AI future cannot be decided by management or the IT department alone. Because a system such as Copilot has a significant impact on workflows and processes, an existing works council must be involved in planning the introduction, even for a pilot project.
As such AI systems can monitor the performance and behavior of employees, the works council has a right of co-determination and may even demand that a works agreement on the use of AI be concluded.
Employee training
Probably the most important step in the introduction of the Copilot system in Microsoft 365 is the training of employees. The following points should be communicated clearly and comprehensibly to all those who will later work with Copilot:
- The AI’s results should never be accepted without verification. Microsoft itself admits: “The answers generated by generative AI are not guaranteed to be 100% reliable.” Behind this cautious wording lies the fact that the AI sometimes invents information. Before relying on data provided by Copilot, employees should therefore always verify it independently of the Copilot system. Microsoft provides Copilot’s output only on a best-effort basis and assumes no liability for the accuracy of the system’s statements.
- The use of Copilot means that a so-called semantic index is created for each user. It is used to generate content that sounds authentic and matches the user’s style; to do this, the AI analyzes the characteristics and habits of its users over several weeks.
- All requests to the AI are initially saved and can later be viewed by the user (and senior administrators) at any time in the Copilot interaction history. This applies not only to entries in applications such as Word, PowerPoint or Excel, but also to Teams meetings in which Copilot’s automatic transcription function has been activated.
Oliver Teich (Strategic Consultant)
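If such interaction records are exported for a compliance review, a small script can summarize them per user. The entry layout below (“RecordType”, “UserId”) is an assumption modeled on generic Microsoft 365 audit exports, not a documented schema; check your tenant’s actual export format:

```python
# Sketch: count Copilot-related entries per user in an exported
# audit log. The field names are assumptions modeled on generic
# Microsoft 365 audit exports; adapt them to your real export.

from collections import Counter

def copilot_interactions(entries):
    """Count Copilot-related audit entries per user."""
    counts = Counter()
    for entry in entries:
        if entry.get("RecordType") == "CopilotInteraction":
            counts[entry.get("UserId", "unknown")] += 1
    return counts

entries = [
    {"RecordType": "CopilotInteraction", "UserId": "alice@contoso.com"},
    {"RecordType": "FileAccessed", "UserId": "alice@contoso.com"},
    {"RecordType": "CopilotInteraction", "UserId": "bob@contoso.com"},
]
print(copilot_interactions(entries))
```

Note that any such per-user evaluation touches employee monitoring and should itself be coordinated with the works council and the data protection officer.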
Ready for the AI revolution with Copilot
Copilot offers great possibilities: it simplifies everyday work, automatically creates meeting records, designs presentations and prepares data in an easy-to-read format. However, these powerful capabilities also represent a far-reaching intervention in a company’s data protection structure.
The introduction of the Copilot system must therefore be organized, supported and managed at many levels. Only if a company is fully prepared for the AI assistant can it take full advantage of the system’s possibilities and opportunities. If mistakes are made during implementation, on the other hand, there is a risk of real data protection breaches in the office architecture as well as regulatory problems.