Commit ad8da03

Merge pull request #26825 from MicrosoftDocs/eavena-patch-1
Update microsoft-365-copilot-ai-security.md
2 parents 957871f + 23a5c17 commit ad8da03

2 files changed: 12 additions, 12 deletions

copilot/TOC.yml (2 additions, 2 deletions)

@@ -19,8 +19,6 @@
     href: microsoft-365-copilot-enable-users.md
 - name: Manage
   items:
-  - name: AI security for Microsoft 365 Copilot
-    href: microsoft-365-copilot-ai-security.md
   - name: Manage Microsoft 365 Copilot settings
     href: microsoft-365-copilot-page.md
   - name: Viva Insights Copilot Dashboard
@@ -43,6 +41,8 @@
     href: microsoft-365-copilot-privacy.md
   - name: Data, privacy, and security for web queries in Microsoft 365 Copilot and Microsoft Copilot
     href: manage-public-web-access.md
+  - name: AI security for Microsoft 365 Copilot
+    href: microsoft-365-copilot-ai-security.md
   - name: Apply principles of Zero Trust to Microsoft 365 Copilot
     href: /security/zero-trust/zero-trust-microsoft-365-copilot?toc=%2Fcopilot%2Fmicrosoft-365%2Ftoc.json&bc=%2Fcopilot%2Fmicrosoft-365%2Fbreadcrumb%2Ftoc.json
   - name: Enterprise data protection

copilot/microsoft-365-copilot-ai-security.md (10 additions, 10 deletions)

@@ -4,7 +4,7 @@ f1.keywords: NOCSH
 ms.author: adparmar
 author: adparmar
 manager: pauloliveria
-ms.date: 10/23/2024
+ms.date: 10/24/2024
 audience: Admin
 ms.topic: article
 ms.service: microsoft-365-copilot
@@ -19,7 +19,7 @@ AI has revolutionized many sectors, providing unprecedented capabilities and eff
 
 Customers are keen to explore these opportunities, and they’re thoughtfully considering the important aspects of security that come with them. Based on our interactions with customers who are on their AI transformation journey, we understand that topics such as data security, privacy, model robustness, and cyberattacks are top of mind.
 
-Microsoft understands how critical these considerations are, which is why we employ robust defense-in-depth strategy to make sure productivity tools like Microsoft 365 Copilot are protected against security risks. This multi-layered approach involves a combination of advanced threat intelligence, rigorous security practices, and proactive safeguards. For example, in addition to our own red-teaming exercises to test Microsoft 365 Copilot, we engaged Casaba Security to test nine Copilot implementations across the Microsoft 365 product suite. We promptly addressed and resolved the findings of [their testing](https://servicetrust.microsoft.com/DocumentPage/67d59873-b315-4768-a057-8583cd84680a), which focused on identifying Open Worldwide Application Security Project's (OWASP) top 10 for LLM as well as traditional security vulnerabilities in supporting application infrastructure.
+Microsoft understands how critical these considerations are, which is why we employ a robust defense-in-depth strategy to help protect productivity tools like Microsoft 365 Copilot against security risks. This multi-layered approach involves a combination of advanced threat intelligence, rigorous security practices, and proactive safeguards. For example, in addition to our own red-teaming exercises to test Microsoft 365 Copilot, we engaged Casaba Security to test nine Copilot implementations across the Microsoft 365 product suite. We promptly addressed and resolved the findings of [their testing](https://servicetrust.microsoft.com/DocumentPage/67d59873-b315-4768-a057-8583cd84680a), which focused on identifying Open Worldwide Application Security Project's (OWASP) top 10 for LLM as well as traditional security vulnerabilities in supporting application infrastructure.
 
 Microsoft takes extensive steps to ensure that Microsoft 365 Copilot is compliant with our existing privacy, security, and compliance commitments to our customers. And as AI technologies and use cases continue to evolve, our work is never done: Microsoft is committed to continuously advancing protections for Copilot, learning from our own monitoring and testing of our systems, as well as working with customers, partners, and the broader security industry.
 
@@ -36,23 +36,23 @@ Our comprehensive security posture for AI has the following pillars:
 - **Security development lifecycle (SDL)**: Our rigorous SDL integrates security considerations throughout the entire AI development process. This proactive approach ensures vulnerabilities are identified and mitigated from the very beginning.
 - **Threat research, detection, and mitigation**: We actively invest in strategies to detect and mitigate threats to our AI models. This includes ongoing vulnerability monitoring and developing countermeasures against potential attacks. Microsoft Threat Intelligence, our global network of researchers, also monitors the [threat landscape](https://www.microsoft.com/security/blog/threat-intelligence/ai-threats/) for threat actors and cyberattacks that might take advantage of AI applications.
 
-Microsoft safeguards privacy, security, and reliability for Microsoft 365 Copilot’s AI features, from the user input stage through the system output stage. Microsoft 365 Copilot is compliant with our existing [privacy, security, and compliance commitments](microsoft-365-copilot-privacy.md), including the General Data Protection Regulation (GDPR) and European Union (EU) Data Boundary. In keeping with these commitments, the information in any prompts entered using Copilot, the retrieved data and generated responses remain within the Microsoft 365 service boundary.
+Microsoft safeguards privacy, security, and reliability for Microsoft 365 Copilot’s AI features, from the user input stage through the system output stage. Microsoft 365 Copilot is compliant with our existing [privacy, security, and compliance commitments](microsoft-365-copilot-privacy.md), including the General Data Protection Regulation (GDPR) and European Union (EU) Data Boundary. In keeping with these commitments, Microsoft handles the information in any prompts entered using Copilot, and the retrieved data and generated responses remain secured as Customer Data and subject to our contractual data handling requirements.
 
 The following sections cover how Microsoft addresses various aspects of privacy, security, and compliance that are important customer considerations for adopting Microsoft 365 Copilot.
 
 ### Access control and permissions management
 
 Microsoft 365 Copilot accesses resources on behalf of the user, so it can only access resources the user already has permission to access. If the user doesn’t have access to a document for example, then Microsoft 365 Copilot working on the user’s behalf will also not have access either.
 
-The data that it used to generate responses is processed within the Microsoft 365 service boundary and is also encrypted in transit, helping safeguard privacy and prevent data leakage. In addition, Microsoft 365 data, including data from Microsoft Graph and SharePoint, adheres to access control and auditing mechanisms.
+The data that it uses to generate responses is processed by Microsoft pursuant to contractual data handling requirements, including being encrypted in transit, helping safeguard privacy and prevent data leakage. In addition, Microsoft 365 data, including data from Microsoft Graph and SharePoint, adheres to access control and auditing mechanisms.
 
 Microsoft 365 Copilot respects Microsoft 365, Microsoft Entra, and Microsoft Purview policies that further limit user access and permission, such as information barriers, Conditional Access, and sensitivity labels.
 
 Microsoft 365 Copilot inherits data loss prevention (DLP) policies to prevent data exfiltration of Copilot-generated responses. Additionally, it enhances data security by applying sensitivity labels to these responses.
 
 ### Protecting data during model training
 
-Microsoft 365 Copilot uses pretrained LLM models hosted by Microsoft; it doesn’t use customer data to train these models. In addition, prompt and grounding data isn’t used to train AI models and is never shared with OpenAI or other third parties.
+Microsoft 365 Copilot uses pretrained LLM models hosted by Microsoft; it doesn’t use Customer Data to train these models. In addition, prompt and grounding data isn’t used to train AI models and is never shared with OpenAI or other third parties.
 
 ### Honoring data residency requirements
 
@@ -104,13 +104,13 @@ Microsoft 365 Copilot meets regulatory requirements for eDiscovery, audit loggin
 
 While Microsoft safeguards provide strong threat mitigation against misinformation and compromise, as with any AI application, Microsoft 365 Copilot’s responses might not always be accurate. You should still apply human judgment to check these responses.
 
-### Does Microsoft have access to my prompts and responses?
+### How does Microsoft treat my prompts and responses?
 
-As with other Microsoft 365 content like email, documents, and chats, Microsoft has no eyes-on access to prompts or responses in Microsoft 365 Copilot.
+Microsoft treats prompts and responses as we treat other more traditional forms of content like emails, documents, and chats, and our contractual commitments are the same.
 
 ### Does Microsoft 365 Copilot use my data to train AI models?
 
-Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Microsoft 365 Copilot. Product improvements are driven through customer-reported incidents and synthetic prompt generation.
+Prompts, responses, and Customer Data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Microsoft 365 Copilot. Product improvements are driven through techniques such as customer-reported incidents and synthetic prompt generation.
 
 ### What should I do if I see unexpected or offensive content?
 
@@ -137,7 +137,7 @@ The following steps can help administrators control user access and therefore li
 - [Restrict SharePoint site access](/sharepoint/restricted-access-control) and [OneDrive content access](/sharepoint/onedrive-site-access-restriction) to specific groups, even after content has been overshared.
 - [Use Restricted SharePoint Search](/sharepoint/restricted-sharepoint-search) to limit the websites from which Microsoft 365 Copilot is permitted to reference content.
 - [Use Microsoft SharePoint Premium - SharePoint Advanced Management](/sharepoint/advanced-management), which offers reports and tools to analyze and manage overly permissive access-control lists and sharing links across the environment.
-- [Review information protection considerations](/purview/ai-microsoft-purview-considerations#information-protection-considerations-for-copilot) for Copilot. Microsoft 365 Copilot honors EXTRACT permissions, inherit labels from referenced files, and automatically labels Copilot-generated content using the [Microsoft Endpoint Data Loss Prevention (DLP)](/office365/servicedescriptions/microsoft-365-service-descriptions/microsoft-365-tenantlevel-services-licensing-guidance/microsoft-purview-service-description#microsoft-data-loss-prevention-endpoint-data-loss-protection-dlp).
+- [Review information protection considerations](/purview/ai-microsoft-purview-considerations#information-protection-considerations-for-copilot) for Copilot. Microsoft 365 Copilot honors EXTRACT permissions and automatically [inherits sensitivity labels](/office365/servicedescriptions/microsoft-365-service-descriptions/microsoft-365-tenantlevel-services-licensing-guidance/microsoft-purview-service-description#microsoft-purview-information-protection-sensitivity-labeling) from referenced content to Copilot-generated responses and files.
 - [Apply sensitivity labels](https://support.microsoft.com/office/apply-sensitivity-labels-to-your-files-and-email-2f96e7cd-d5a4-403b-8bd7-4cc636bae0f9) to your Microsoft 365 files and email. For Microsoft Purview customers, administrators can [create and configure sensitivity labels](/purview/create-sensitivity-labels) that they want to make available for apps and other services.
 - [Use Microsoft Purview AI Hub](/purview/ai-microsoft-purview) (currently in preview) to discover sensitive data shared with Copilot, see files referenced in Copilot responses, and discover unlabeled files referenced by Copilot and associated SharePoint sites, thereby letting you identify and protect files at risk of overexposure.
 - Set up policies that remove old and unused data and limit data sprawl due to data oversharing with [Microsoft Purview Data Lifecycle Management](/purview/data-lifecycle-management).
@@ -158,4 +158,4 @@ For example, we recently introduced new Microsoft Defender and Purview capabilit
 
 ### Where should I report vulnerabilities in Microsoft 365 Copilot and other AI applications?
 
-If you discover new vulnerabilities in any AI platform, we encourage you to follow responsible disclosure practices for the platform owner. Microsoft’s own procedure (for Copilot) is explained in this page: [Microsoft AI Bounty Program](https://www.microsoft.com/msrc/bounty-ai).
+If you discover new vulnerabilities in any AI platform, we encourage you to follow responsible disclosure practices for the platform owner. Microsoft’s own procedure (for Copilot) is explained in this page: [Microsoft AI Bounty Program](https://www.microsoft.com/msrc/bounty-ai).
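
The admin guidance in this commit's diff points to Restricted SharePoint Search as a way to limit which sites Microsoft 365 Copilot can reference. As a rough sketch of what applying that step looks like in the SharePoint Online Management Shell (cmdlet names from the Restricted SharePoint Search feature; the tenant and site URLs below are hypothetical placeholders, and availability may depend on your tenant and module version):

```powershell
# Sketch only: enabling Restricted SharePoint Search for a tenant.
# Requires the SharePoint Online Management Shell and admin rights;
# "contoso" URLs are hypothetical placeholders.
Connect-SPOService -Url https://contoso-admin.sharepoint.com

# Turn on restricted search mode tenant-wide
Set-SPOTenantRestrictedSearchMode -Mode Enabled

# Add curated sites to the allowed list so they remain searchable
# (and therefore referenceable by Copilot)
Add-SPOTenantRestrictedSearchAllowedList -SitesList @(
    "https://contoso.sharepoint.com/sites/HR",
    "https://contoso.sharepoint.com/sites/Policies"
)

# Verify the current mode and the allowed list
Get-SPOTenantRestrictedSearchMode
Get-SPOTenantRestrictedSearchAllowedList
```

This is a tenant-level switch, not a per-user control; see the linked /sharepoint/restricted-sharepoint-search article in the diff for the authoritative procedure and limits on the allowed list.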
