Data security and privacy remain top priorities for any business exploring AI. The blog, "Data, Privacy, and Security for Microsoft 365 Copilot," outlines how Microsoft safeguards customer information across Microsoft 365 with robust compliance frameworks, encryption standards, and access controls. Read the blog to understand Microsoft's security-by-design approach and contact ContentMX to discuss how your business can confidently adopt Copilot.
How does Microsoft 365 Copilot use organizational data?
Microsoft 365 Copilot connects large language models (LLMs) to your organizational data by accessing content through Microsoft Graph. It grounds its responses in the documents, emails, calendar events, chats, and meetings that the requesting user already has permission to access, and this combination of content and context helps produce accurate, relevant responses. Importantly, prompts and responses are not used to train the foundation LLMs.
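To make that permission model concrete, here is a minimal Python sketch of the kind of permission-trimmed Microsoft Graph call described above. The /me/messages endpoint is a real Graph v1.0 route, but the function name and the assumption that you already hold a delegated access token are illustrative; this is not Copilot's actual retrieval pipeline.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def fetch_recent_messages(access_token: str, top: int = 5) -> list[dict]:
    """Fetch the signed-in user's most recent emails via Microsoft Graph.

    Graph trims results server-side: a delegated token can only read
    content the underlying user is already authorized to see, which is
    the same guarantee Copilot relies on when grounding a response.
    """
    resp = requests.get(
        f"{GRAPH}/me/messages",
        headers={"Authorization": f"Bearer {access_token}"},
        params={"$top": top, "$select": "subject,receivedDateTime"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["value"]
```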
What measures are in place to protect organizational data?
Microsoft 365 Copilot employs a permissions model that ensures only authorized users can access specific data. It layers multiple protections, including encryption of data at rest and in transit, and adheres to privacy regulations such as the GDPR. Additionally, it logically isolates customer content and honors the usage rights granted to users, so sensitive information remains secure.
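The core principle, filtering grounding data by the caller's permissions before any of it reaches the model, can be sketched in a few lines. The Document class, its allowed_readers ACL, and trim_for_user below are all hypothetical; Microsoft's actual enforcement happens inside Microsoft Graph, not in application code like this.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    body: str
    # User IDs allowed to read this document (an illustrative ACL).
    allowed_readers: set[str] = field(default_factory=set)

def trim_for_user(user_id: str, candidates: list[Document]) -> list[Document]:
    """Keep only the documents the requesting user may read.

    Permission trimming happens *before* any content is handed to the
    LLM, so a prompt can never surface data its author couldn't open.
    """
    return [doc for doc in candidates if user_id in doc.allowed_readers]

docs = [
    Document("Q3 roadmap", "...", {"alice", "bob"}),
    Document("Salary review", "...", {"hr-team"}),
]
print([d.title for d in trim_for_user("alice", docs)])  # ['Q3 roadmap']
```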
What data is stored from user interactions with Copilot?
When users interact with Microsoft 365 Copilot, their prompts and Copilot's responses are stored as part of the user's Copilot activity history. This data is encrypted and processed in line with organizational commitments. Admins can manage the stored data with tools like Microsoft Purview, and users can delete their own activity history through the My Account portal.
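That lifecycle, encrypted storage plus user-initiated deletion, can be illustrated with a small sketch. The ActivityHistory class is hypothetical and is not how Microsoft stores this data; it simply uses the cryptography package's Fernet cipher to stand in for encryption at rest.

```python
from cryptography.fernet import Fernet

class ActivityHistory:
    """Illustrative store for Copilot prompts and responses, encrypted at rest."""

    def __init__(self) -> None:
        self._fernet = Fernet(Fernet.generate_key())
        self._entries: dict[str, list[bytes]] = {}

    def record(self, user_id: str, prompt: str, response: str) -> None:
        # Encrypt each prompt/response pair before it touches storage.
        blob = self._fernet.encrypt(f"{prompt}\n---\n{response}".encode())
        self._entries.setdefault(user_id, []).append(blob)

    def delete_history(self, user_id: str) -> int:
        # User-initiated deletion, analogous to the My Account portal option.
        return len(self._entries.pop(user_id, []))

history = ActivityHistory()
history.record("alice", "Summarize the Q3 roadmap", "Here is a summary...")
print(history.delete_history("alice"))  # 1 entry removed
```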