AI from Microsoft: Is your company Copilot Ready?

Microsoft's Copilot AI system for Office and Exchange promises rapid productivity gains. Whether emails, presentations, analyses or statistics: the GPT-based system is designed to make everything faster and easier. But before companies enter the new world of AI, important questions about data security, compliance, regulation and access controls need to be answered. Is your company Copilot Ready? Read here what you should pay attention to.
Reading time: 6 minutes
April 03, 2024

Microsoft 365 Copilot - Is your company ready for AI?

  • Microsoft 365 Copilot is an artificial intelligence that is directly integrated into Microsoft Office programs, SharePoint and Exchange Server.
  • The system supports employees in everyday tasks and thus increases the efficiency of different departments.
  • The introduction of Copilot has significant implications for companies’ data protection and therefore requires comprehensive coordination and guidance.

Assessment of Copilot Readiness

The licensing of Copilot for Microsoft 365 represents a significant change to a company’s IT security architecture compared to the use of ChatGPT, Gemini or other AI assistants based on Large Language Models (LLMs). Unlike ChatGPT & Co., Copilot is not limited to its model’s training data. The Microsoft tool retrieves additional information from the Internet and – even more importantly – from the company’s own data. Copilot uses data from the SharePoint server, for example, and can also access emails, chats and documents via Microsoft Graph. This means that information that was previously only visible locally to individual employees and groups may also surface in Copilot’s responses and generated content.

``The implementation of Copilot in an organization may have an impact on existing GDPR compliance, depending on how Copilot is used and what data is being processed. It is therefore advisable to re-check compliance after the introduction of Copilot to ensure that no breaches or risks arise.``

Oliver Teich (Strategic Consultant)

Check and/or implement authorization models

Microsoft itself advises in the Copilot documentation: “It is important that you use the permission models available in Microsoft 365 services such as SharePoint to ensure that the right users or groups have the right access to the right content in your organization.”

It is not enough to check the permissions of users and groups. Other access paths such as guest access, local SharePoint permissions, share links and external and public access should also be carefully reviewed.

Note: People who do not belong to your company can also have access to data via shared team channels.

Note: Copilot does not carry over labels assigned via Microsoft Purview Information Protection (MPIP) into its responses. Although the system ensures that only data the respective user is authorized to access is used for AI-generated content, the response itself does not receive an MPIP label.

Overall, a strict need-to-know policy should therefore be implemented in the company. With Copilot, it is more important than ever that employees only have access to the data that is relevant to their respective tasks. It is advisable to implement a zero-trust architecture based on the principle of least privilege, or at least a strict review of all access permissions if this is not possible.
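The access review described above can be partially automated. As a hedged illustration only (the record shape, field names and values below are assumptions for the sketch, not an actual Microsoft Graph schema), a small Python helper could flag permission entries that violate a need-to-know policy, such as guest users or broadly scoped share links:

```python
# Sketch: flag risky permission entries exported from a collaboration
# platform (e.g. via Microsoft Graph). Field names and values here are
# illustrative assumptions, not a real API schema.

# Link scopes considered broader than need-to-know.
RISKY_LINK_SCOPES = {"anonymous", "organization"}

def flag_risky_permissions(permissions):
    """Return (resource, reason) pairs for guest, external, or broad-link access."""
    findings = []
    for perm in permissions:
        if perm.get("user_type") == "guest":
            findings.append((perm["resource"], "guest access"))
        elif perm.get("link_scope") in RISKY_LINK_SCOPES:
            findings.append((perm["resource"], f"share link ({perm['link_scope']})"))
        elif perm.get("external", False):
            findings.append((perm["resource"], "external access"))
    return findings

# Example input: one guest grant, one anonymous share link, one normal member.
sample = [
    {"resource": "Q3-report.xlsx", "user_type": "guest"},
    {"resource": "handbook.docx", "link_scope": "anonymous"},
    {"resource": "notes.txt", "user_type": "member"},
]
for resource, reason in flag_risky_permissions(sample):
    print(resource, "->", reason)
```

A report like this does not replace a proper zero-trust review, but it helps surface the guest accounts, share links and external grants that a Copilot rollout would otherwise quietly expose.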

Checking the data protection policy

Microsoft claims that both Microsoft 365 and Copilot comply with the General Data Protection Regulation. The company promises on its website: “Microsoft Copilot for Microsoft 365 complies with our existing privacy, security and compliance obligations to Microsoft 365 commercial customers, including the General Data Protection Regulation (GDPR) and the European Union (EU) Data Boundary.”

``Check whether you need to carry out a data protection impact assessment (DPIA) for the use of Copilot. A DPIA is a systematic analysis of the impact of data processing on the protection of personal data.``

Oliver Teich (Strategic Consultant)

Evaluation of additional agreements

However, the German Conference of Federal and State Data Protection Authorities (DSK) and other bodies, such as ENISA, take the view that the Data Protection Addendum (DPA) offered by Microsoft does not adequately meet the requirements of European data protection law. They recommend that companies conclude an additional data processing agreement with Microsoft, or at least review the DPA carefully. The State Commissioner for Data Protection and Freedom of Information of North Rhine-Westphalia describes the key considerations in a handout. In essence, the experts recommend: “A supplementary agreement to the DPA to be concluded between the controller and Microsoft should make it clear that this supplementary agreement takes precedence over all conflicting contractual texts included by Microsoft and prevails in the event of a conflict.” This supplementary agreement should regulate the following points, among others:

  • Microsoft’s own responsibility in the context of data processing for business activities triggered by the provision of products and services to the customer
  • obligation to follow instructions, disclosure of processed data, and fulfillment of legal requirements
  • implementation of technical and organizational measures in accordance with Art. 32 GDPR
  • deletion of personal data
  • information about sub-processors

If such agreements have already been concluded or evaluated, they should at least be re-assessed as part of a new data protection impact assessment during the Copilot roll-out.

Data might leave the boundaries of the Microsoft 365 service

In general, Microsoft promises that all data in the Microsoft 365 system will be stored and processed within the EU. In the context of Copilot, however, the company points out two exceptions to this principle:

  • For example, a Graph-grounded chat can be enriched with web content. In this case, the Bing query required for this can also contain internal company data – which then ends up at Microsoft. To be on the safe side, Copilot’s Bing-based web features should therefore be deactivated.
  • Plug-ins can also be installed for Copilot. Here, Microsoft explicitly recommends: “Review the privacy policy and terms of use of the plug-in to determine how it handles your organization’s data.” Companies that use Copilot should therefore either not allow plug-ins at all or require a separate data protection and risk assessment for each plug-in used.
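Such a plug-in policy can be expressed as a simple allow-list that couples enablement to a completed assessment. The sketch below is purely illustrative: the plug-in names and record fields are assumptions, not part of any Copilot admin API.

```python
# Sketch: an allow-list that only permits a Copilot plug-in if it is
# both registered AND has a completed data protection assessment.
# Plug-in names and the record shape are illustrative assumptions.

APPROVED_PLUGINS = {
    "internal-crm-connector": {"dpia_completed": True},
    "web-translator": {"dpia_completed": False},  # registered but not yet assessed
}

def plugin_allowed(name):
    """A plug-in may be enabled only if it is listed and assessed."""
    entry = APPROVED_PLUGINS.get(name)
    return bool(entry and entry.get("dpia_completed"))

print(plugin_allowed("internal-crm-connector"))  # True: listed and assessed
print(plugin_allowed("web-translator"))          # False: assessment pending
print(plugin_allowed("unknown-plugin"))          # False: not on the allow-list
```

The design choice here is deny-by-default: anything not explicitly registered and assessed stays blocked, which mirrors the need-to-know principle recommended elsewhere in this article.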

Review IT security strategy

In a study on the use of AI language models in companies, the German Federal Office for Information Security (BSI) comes to the conclusion that, in addition to many advantages, these systems can also harbor new IT security risks or increase the threat potential of known IT threats.

The BSI therefore advises: “In response to these potential threats, companies or authorities should carry out a risk analysis for the use of large AI language models in their specific application before integrating them into their work processes. They should also evaluate misuse scenarios to determine whether they pose a threat to their workflows. Based on this, existing security measures can be adapted and, if necessary, new measures can be taken and users can be informed about the potential dangers.”

Before introducing the Copilot system, companies should therefore urgently gain an overview of the current status of their IT security architecture. To this end, not only Microsoft 365, but also all other programs, apps, services and plugins used should be checked. Microsoft itself recommends the introduction of a zero-trust model for Copilot.

Works council may be required to approve AI deployment

The move into the AI future cannot be decided by management or the IT department alone. Because a system such as Copilot has a significant impact on workflows and processes, an existing works council must be involved in planning the introduction, and even in any pilot project.

Because AI systems such as Copilot can monitor the performance and behavior of employees, the works council has a right of co-determination and can even demand the conclusion of a works agreement on the use of AI.

Employee training

Probably the most important step in the introduction of the Copilot system in Microsoft 365 is the training of employees. The following points should be communicated clearly and comprehensibly to all those who will later work with Copilot:

  • The AI’s results should never be accepted without verification. Microsoft itself admits: “The answers generated by generative AI are not guaranteed to be 100% reliable.” Behind this cautious wording lies the fact that the AI sometimes invents information. Before relying on data provided by Copilot, employees should therefore verify it independently of the Copilot system. Microsoft provides Copilot output only on a best-effort basis and assumes no liability for the accuracy of the system’s statements.
  • The use of Copilot means that a so-called semantic index is created for each user. This is used to create content in future that sounds authentic and corresponds to the user’s style. To do this, the AI analyzes the characteristics and habits of its users over several weeks.
  • All requests to the AI are initially saved and can later be viewed by the user (and senior administrators) at any time in the Copilot interaction history. This applies not only to entries in applications such as Word, PowerPoint or Excel, but also to team meetings in which Copilot’s automatic transcription function has been activated.

``The creation of individual language profiles for individual users can be compatible with EU data protection law if a number of factors are taken into account and complied with. Copilot offers various options for controlling and managing the creation of these profiles, for example by selecting the data sources, setting the data protection level, and ensuring that the data can be deleted, accessed and corrected by the user.``

Oliver Teich (Strategic Consultant)

Ready for the AI revolution with Copilot

Copilot offers great possibilities: it simplifies everyday work, automatically creates meeting transcripts, designs presentations and prepares data in an easy-to-read format. However, these powerful capabilities also mean a far-reaching intervention in a company’s data protection structure.

The introduction of the Copilot system must therefore be organized, supported and managed at many levels. Only if a company is fully prepared for the AI assistant can it take full advantage of the system’s possibilities and opportunities. If, on the other hand, mistakes are made during implementation, there is a risk of actual data protection leaks in the office architecture as well as regulatory problems.
