- Understand how to secure AI data with Microsoft Purview
After completing this module, you'll be able to:
- Explain how Microsoft Purview DSPM for AI supports AI data risk assessment
- Identify which AI tools are in use and how they interact with sensitive content
- Use DSPM for AI to assess potential data exposure and apply recommended controls
- Use Microsoft Purview Audit to review prompts and responses from Microsoft 365 Copilot
AI tools like Microsoft 365 Copilot create new ways to interact with sensitive data, but they also introduce new risks. Learn how Microsoft Purview helps you apply security and compliance controls that protect data, manage AI activity, and support responsible use at scale.
In this module you learn how to:
- Control Copilot's access to sensitive content with sensitivity labels
- Prevent data exposure in Copilot interactions using data loss prevention policies
- Retain and audit Copilot prompts and responses with retention policies and Audit
- Investigate Copilot activity using eDiscovery
- Detect risky or inappropriate Copilot usage with Communication Compliance
- Assess AI compliance using Compliance Manager
- Secure enterprise and browser-based AI apps with Microsoft Purview
In this module you learn how to:
- Discover and assess AI usage with DSPM for AI and Compliance Manager
- Apply data loss prevention to restrict risky actions in browsers and endpoints
- Detect policy violations in prompts and responses with Communication Compliance
- Use Insider Risk Management and Adaptive Protection to apply risk-based controls
- Retain AI-generated content with Microsoft Purview retention policies
Microsoft Purview provides tools to secure developer AI environments by discovering apps, assessing data access, and applying appropriate protections. This includes detecting generative AI usage, assigning user risk levels, and applying dynamic enforcement based on user behavior and data sensitivity.
In this module you learn how to:
- Discover developer AI apps and assess their access to sensitive data
- Enforce protections for Azure AI services and Entra-registered apps
- Govern AI agents built in Copilot Studio
- Retain and classify prompt and response content
- Investigate risky AI usage with Insider Risk Management, Communication Compliance, and Audit
- Apply dynamic protections using Adaptive Protection and risk-based policies
Syllabus
- Understand how to secure AI data with Microsoft Purview
- Introduction
- Understand AI data security risks
- Understand how Microsoft Purview secures AI data
- Evaluate compliance risks for AI usage
- Identify AI-related data exposure risks
- Understand how Microsoft Purview controls AI data access
- Detect and respond to risky AI activity
- Retain and search Copilot prompts and responses
- Module assessment
- Summary
- Secure Microsoft 365 Copilot interactions with Microsoft Purview
- Introduction
- Understand how Microsoft 365 Copilot changes data protection needs
- Assess Copilot regulatory compliance with Compliance Manager
- Audit Copilot interactions with Microsoft Purview
- Analyze Copilot interactions with Communication Compliance
- Classify and protect Copilot content with sensitivity labels
- Apply DLP policies to Microsoft 365 Copilot
- Apply retention policies to Copilot prompts and responses
- Investigate and delete Copilot activity with eDiscovery
- Module assessment
- Summary
- Secure enterprise and browser-based AI apps with Microsoft Purview
- Introduction
- Understand risks from enterprise and non-Microsoft AI tools
- Assess AI usage for security and compliance
- Identify policy violations with Communication Compliance
- Detect risky AI usage with Insider Risk Management
- Protect sensitive data in AI apps with Microsoft Purview DLP
- Case study: Use Adaptive Protection to respond to AI-related risk
- Apply retention policies to AI app prompts and responses
- Module assessment
- Summary
- Secure developer AI environments with Microsoft Purview
- Introduction
- Understand risks and responsibilities in AI development environments
- Discover and assess AI apps with DSPM for AI
- Classify, restrict, and retain AI prompt data
- Enforce protections in Microsoft Foundry and Foundry Tools
- Apply controls for Microsoft Entra-registered custom AI apps
- Secure AI agents built in Copilot Studio
- Manage data risks in Copilot in Fabric
- Investigate and respond to risky AI activity
- Module assessment
- Summary