DP-200: Implementing an Azure Data Solution

This exam measures your ability to accomplish the following technical tasks: implement data storage solutions; manage and develop data processing; and monitor and optimize data solutions.

Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet the data requirements to implement data solutions that use Azure data services.

Azure data engineers are responsible for data-related implementation tasks that include provisioning data storage services, ingesting streaming and batch data, transforming data, implementing security requirements, implementing data retention policies, identifying performance bottlenecks, and accessing external data sources.

Candidates for this exam must be able to implement data solutions that use the following Azure services: Azure Cosmos DB, Azure Synapse Analytics (formerly Azure SQL DW), Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.

Microsoft Implementing an Azure Data Solution Exam Summary:


Exam Name: Microsoft Certified - Azure Data Engineer Associate
Exam Code: DP-200
Exam Price: $165 (USD)
Exam Duration: 120 mins
Number of Questions: 40-60
Passing Score: 700 / 1000
Books / Training: DP-200T01-A: Implementing an Azure Data Solution
Sample Questions: Microsoft Implementing an Azure Data Solution Sample Questions
Practice Exam: Microsoft DP-200 Certification Practice Exam

Microsoft DP-200 Exam Syllabus Topics:

Implement Data Storage Solutions (40-45%)

Implement non-relational data stores
- implement a solution that uses Cosmos DB, Data Lake Storage Gen2, or Blob storage
- implement data distribution and partitions
- implement a consistency model in Cosmos DB (see the sketch after this list)
- provision a non-relational data store
- provide access to data to meet security requirements
- implement for high availability, disaster recovery, and global distribution
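For the Cosmos DB items above, here is a minimal sketch of provisioning a partitioned container with a chosen consistency level, using the azure-cosmos Python SDK. The account endpoint, key, and the database/container/partition-key names are illustrative placeholders, not values the exam mandates.

```python
# A minimal sketch: provision a Cosmos DB container with a partition key and
# a session consistency level via the azure-cosmos SDK. Endpoint, key, and
# all resource names below are placeholders.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(
    url="https://<your-account>.documents.azure.com:443/",  # placeholder endpoint
    credential="<your-account-key>",                        # placeholder key
    consistency_level="Session",                            # per-client consistency
)

# Create (or fetch) a database and a container partitioned on /customerId.
database = client.create_database_if_not_exists(id="retaildb")
container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customerId"),  # drives data distribution
    offer_throughput=400,                            # provisioned RU/s
)

# Writes and point reads are routed by the partition key value.
container.upsert_item({"id": "1", "customerId": "c-42", "total": 18.50})
item = container.read_item(item="1", partition_key="c-42")
```

A client can request a consistency level weaker than the account default (here Session), which is the kind of trade-off the consistency-model objective tests.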
Implement relational data stores
- provide access to data to meet security requirements
- implement for high availability and disaster recovery
- implement data distribution and partitions for Azure Synapse Analytics
- implement PolyBase (see the sketch after this list)
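For the PolyBase item, a sketch of the standard external-table-plus-CTAS load into Azure Synapse Analytics, driven from Python with pyodbc. The server, database, login, and storage paths are placeholders, and it assumes the dedicated SQL pool can already reach the storage account (for example via managed identity).

```python
# A sketch of a PolyBase load into Azure Synapse Analytics, driven from
# Python via pyodbc. Connection details and storage paths are placeholders;
# access to the storage account is assumed to be configured already.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-dw>;Uid=<user>;Pwd=<password>;Encrypt=yes;",
    autocommit=True,
)
cursor = conn.cursor()

# External data source and file format describe where and how the files live.
cursor.execute("""
CREATE EXTERNAL DATA SOURCE lake_source
WITH (TYPE = HADOOP, LOCATION = 'abfss://data@<account>.dfs.core.windows.net');
""")
cursor.execute("""
CREATE EXTERNAL FILE FORMAT csv_format
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2));
""")

# The external table projects a schema over the files; CTAS then loads them
# into a hash-distributed internal table in parallel.
cursor.execute("""
CREATE EXTERNAL TABLE dbo.Sales_ext (SaleId INT, Amount DECIMAL(10,2))
WITH (LOCATION = '/sales/', DATA_SOURCE = lake_source, FILE_FORMAT = csv_format);
""")
cursor.execute("""
CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = HASH(SaleId))
AS SELECT * FROM dbo.Sales_ext;
""")
```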
Manage data security
- implement data masking (see the sketch after this list)
- encrypt data at rest and in motion
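For the data masking item, a short sketch applying dynamic data masking with T-SQL over pyodbc. The table and column names are hypothetical; the masking functions (email(), partial()) are the built-in ones.

```python
# A sketch of dynamic data masking on an Azure SQL table, applied with T-SQL
# through pyodbc. dbo.Customers and its columns are hypothetical examples.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-db>;Uid=<admin>;Pwd=<password>;Encrypt=yes;",
    autocommit=True,
)
cursor = conn.cursor()

# Mask the email column for non-privileged users; privileged logins still
# see the real data.
cursor.execute(
    "ALTER TABLE dbo.Customers ALTER COLUMN Email "
    "ADD MASKED WITH (FUNCTION = 'email()');"
)
# Expose only the last four digits of the phone number.
cursor.execute(
    "ALTER TABLE dbo.Customers ALTER COLUMN Phone "
    "ADD MASKED WITH (FUNCTION = 'partial(0, \"XXX-XXX-\", 4)');"
)
```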
Manage and Develop Data Processing (25-30%)

Develop batch processing solutions
- develop batch processing solutions by using Data Factory and Azure Databricks
- ingest data by using PolyBase
- implement the integration runtime for Data Factory
- create linked services and datasets
- create pipelines and activities (a Data Factory sketch follows this list)
- create and schedule triggers
- implement Azure Databricks clusters, notebooks, jobs, and autoscaling
- ingest data into Azure Databricks
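As referenced above, a condensed sketch of a copy pipeline built with the azure-mgmt-datafactory SDK, loosely following the pattern of Microsoft's Python quickstart. All resource names are placeholders, and model constructors vary slightly between SDK versions.

```python
# A condensed sketch of a Data Factory copy pipeline: linked service,
# datasets, one copy activity, one on-demand run. Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureStorageLinkedService, LinkedServiceResource, LinkedServiceReference,
    AzureBlobDataset, DatasetResource, DatasetReference,
    CopyActivity, BlobSource, BlobSink, PipelineResource, SecureString,
)

rg, factory = "my-rg", "my-adf"  # placeholder resource group and factory
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Linked service: how the factory reaches the storage account.
ls = LinkedServiceResource(properties=AzureStorageLinkedService(
    connection_string=SecureString(value="<storage-connection-string>")))
adf.linked_services.create_or_update(rg, factory, "blob_ls", ls)

# Input and output datasets pointing at blob folders.
ls_ref = LinkedServiceReference(type="LinkedServiceReference",
                                reference_name="blob_ls")
for name, path in [("ds_in", "raw/input"), ("ds_out", "curated/output")]:
    ds = DatasetResource(properties=AzureBlobDataset(
        linked_service_name=ls_ref, folder_path=path))
    adf.datasets.create_or_update(rg, factory, name, ds)

# A single copy activity wired into a pipeline, then triggered on demand.
copy = CopyActivity(
    name="CopyRawToCurated",
    inputs=[DatasetReference(type="DatasetReference", reference_name="ds_in")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="ds_out")],
    source=BlobSource(), sink=BlobSink(),
)
adf.pipelines.create_or_update(
    rg, factory, "copy_pipeline", PipelineResource(activities=[copy]))
run = adf.pipelines.create_run(rg, factory, "copy_pipeline", parameters={})
print("Started run:", run.run_id)
```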
Develop streaming solutions
- configure input and output (an input-side sketch follows this list)
- select the appropriate built-in functions
- implement event processing by using Stream Analytics
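For the streaming items, a sketch of the ingestion side: JSON events pushed to an Event Hub with the azure-eventhub SDK, which a Stream Analytics job would consume as its input. The connection string and hub name are placeholders, and the windowed query shown in the comment is only a typical example of job logic.

```python
# A sketch of feeding a Stream Analytics input: JSON events are sent to an
# Event Hub the job reads from. A typical job query over this input might be:
#   SELECT deviceId, AVG(temperature) AS avgTemp
#   FROM input TIMESTAMP BY eventTime
#   GROUP BY deviceId, TumblingWindow(second, 30)
import json
import time
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="<event-hub-namespace-connection-string>",  # placeholder
    eventhub_name="telemetry",                           # placeholder
)

with producer:
    batch = producer.create_batch()
    for i in range(10):
        event = {"deviceId": f"dev-{i % 3}", "temperature": 20 + i,
                 "eventTime": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())}
        batch.add(EventData(json.dumps(event)))  # one JSON event per message
    producer.send_batch(batch)  # the Stream Analytics input picks these up
```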
Monitor and Optimize Data Solutions (30-35%)

Monitor data storage
- monitor relational and non-relational data stores
- implement Blob storage monitoring
- implement Data Lake Storage Gen2 monitoring
- implement Azure Synapse Analytics monitoring
- implement Cosmos DB monitoring
- configure Azure Monitor alerts
- implement auditing by using Azure Log Analytics (a query sketch follows this list)
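For the Log Analytics auditing item, a sketch that queries a workspace with the azure-monitor-query SDK. The workspace ID is a placeholder, and the KQL assumes Cosmos DB diagnostic logs have already been routed to the AzureDiagnostics table.

```python
# A sketch of querying a Log Analytics workspace for storage diagnostics.
# The workspace ID is a placeholder; the KQL assumes diagnostics are routed
# to the AzureDiagnostics table.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

kql = """
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.DOCUMENTDB"
| summarize requests = count() by OperationName, bin(TimeGenerated, 1h)
| order by TimeGenerated desc
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query=kql,
    timespan=timedelta(days=1),  # look back over the last 24 hours
)
for table in response.tables:
    for row in table.rows:
        print(row)
```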
Monitor data processing
- monitor Data Factory pipelines (a run-monitoring sketch follows this list)
- monitor Azure Databricks
- monitor Stream Analytics
- configure Azure Monitor alerts
- implement auditing by using Azure Log Analytics
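For pipeline-run monitoring, a sketch that lists recent Data Factory runs through the same azure-mgmt-datafactory SDK used above; the resource names are again placeholders.

```python
# A sketch of checking recent Data Factory pipeline runs programmatically.
# Subscription, resource group, and factory names are placeholders.
from datetime import datetime, timedelta
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Query all runs updated in the last 24 hours and report their status.
filters = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(days=1),
    last_updated_before=datetime.utcnow(),
)
runs = adf.pipeline_runs.query_by_factory("my-rg", "my-adf", filters)
for run in runs.value:
    print(run.pipeline_name, run.status, run.duration_in_ms)
```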
Optimize Azure data solutions
- troubleshoot data partitioning bottlenecks
- optimize Data Lake Storage Gen2
- optimize Stream Analytics
- optimize Azure Synapse Analytics
- manage the data lifecycle (a lifecycle-policy sketch follows this list)
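For the data lifecycle item, a sketch of a Blob storage lifecycle management policy set with azure-mgmt-storage: blobs under an assumed raw/ prefix are tiered to cool after 30 days and deleted after a year. The account, group, and rule names are illustrative.

```python
# A sketch of a Blob storage lifecycle management policy: tier blobs under
# raw/ to cool after 30 days and delete them after 365. Names below are
# placeholders; the policy resource itself is always named "default".
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

storage = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

policy = {
    "policy": {
        "rules": [{
            "name": "age-out-raw-data",
            "enabled": True,
            "type": "Lifecycle",
            "definition": {
                "filters": {"blob_types": ["blockBlob"],
                            "prefix_match": ["raw/"]},
                "actions": {"base_blob": {
                    "tier_to_cool": {"days_after_modification_greater_than": 30},
                    "delete": {"days_after_modification_greater_than": 365},
                }},
            },
        }]
    }
}
storage.management_policies.create_or_update(
    "my-rg", "mystorageaccount", "default", policy)
```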
