DP-201: Designing an Azure Data Solution

This exam measures your ability to accomplish the following technical tasks: design Azure data storage solutions; design data processing solutions; and design for data security and compliance.

Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet data requirements, and who design data solutions that use Azure data services.

Azure data engineers are responsible for data-related design tasks that include designing Azure data storage solutions that use relational and non-relational data stores, batch and real-time data processing solutions, and data security and compliance solutions.

Candidates for this exam must design data solutions that use the following Azure services: Azure Cosmos DB, Azure Synapse Analytics, Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.

Microsoft Designing an Azure Data Solution Exam Summary:


Exam Name: Microsoft Certified - Azure Data Engineer Associate
Exam Code: DP-201
Exam Price: $165 (USD)
Exam Duration: 120 minutes
Number of Questions: 40-60
Passing Score: 700 / 1000
Books / Training: DP-201T01-A: Designing an Azure Data Solution
Sample Questions: Microsoft Designing an Azure Data Solution Sample Questions
Practice Exam: Microsoft DP-201 Certification Practice Exam

Microsoft DP-201 Exam Syllabus Topics:

Design Azure Data Storage Solutions (40-45%)

Recommend an Azure data storage solution based on requirements
- choose the correct data storage solution to meet the technical and business requirements
- choose the partition distribution type

Design non-relational cloud data stores
- design data distribution and partitions
- design for scale (including multi-region, latency, and throughput)
- design a solution that uses Cosmos DB, Data Lake Storage Gen2, or Blob storage
- select the appropriate Cosmos DB API
- design a disaster recovery strategy
- design for high availability

Design relational cloud data stores
- design data distribution and partitions
- design for scale (including latency and throughput)
- design a solution that uses Azure Synapse Analytics
- design a disaster recovery strategy
- design for high availability

(A minimal Cosmos DB partitioning sketch follows this section.)
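For the non-relational objectives above, the sketch below shows what "design data distribution and partitions" can look like in practice with the azure-cosmos Python SDK: creating a container with an explicit partition key and provisioned throughput. The endpoint, key, database and container names, and the /customerId partition key path are illustrative assumptions, not values from the exam outline.

```python
# Illustrative only: the endpoint, key, and names below are placeholders, and
# /customerId is an assumed partition key, not a recommendation from the exam.
from azure.cosmos import CosmosClient, PartitionKey

ENDPOINT = "https://<your-account>.documents.azure.com:443/"
KEY = "<your-account-key>"

client = CosmosClient(ENDPOINT, credential=KEY)

# Create (or reuse) a database and a container whose items are spread across
# logical partitions by customerId; throughput is provisioned on the container.
database = client.create_database_if_not_exists(id="retail")
container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customerId"),
    offer_throughput=400,
)

# Items that share a partition key value land in the same logical partition,
# which keeps point reads and transactional batches cheap.
container.upsert_item({
    "id": "order-1001",
    "customerId": "cust-42",
    "total": 129.95,
})
```

As a rule of thumb, a partition key with high cardinality and an even request distribution (such as a customer or device identifier) helps avoid hot partitions as the container scales.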
Design Data Processing Solutions (25-30%)

Design batch processing solutions
- design batch processing solutions that use Data Factory and Azure Databricks
- identify the optimal data ingestion method for a batch processing solution
- identify where processing should take place, such as at the source, at the destination, or in transit

Design streaming solutions
- design for real-time processing by using Stream Analytics and Azure Databricks
- design and provision compute resources

(A short PySpark batch-processing sketch follows this section.)
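As a rough illustration of the batch processing objectives above, the following PySpark sketch (for example, on Azure Databricks) reads a raw CSV drop from Data Lake Storage Gen2, aggregates it, and writes the result back as Parquet. The storage account, container, and folder names are placeholders, and the cluster is assumed to already be authorized against the storage account.

```python
# Sketch of a batch transform in PySpark. Paths are placeholders; access to the
# Data Lake Storage Gen2 account is assumed to be configured on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-sales-batch").getOrCreate()

raw_path = "abfss://raw@<storageaccount>.dfs.core.windows.net/sales/2021/01/"
curated_path = "abfss://curated@<storageaccount>.dfs.core.windows.net/sales_daily/"

# Ingest the raw CSV drop from the lake.
sales = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

# Aggregate to one row per store per day.
daily_totals = (
    sales.groupBy("storeId", F.to_date("orderTimestamp").alias("orderDate"))
    .agg(F.sum("amount").alias("dailyTotal"))
)

# Write the curated result back to the lake in a columnar format that
# downstream Synapse Analytics queries can consume efficiently.
daily_totals.write.mode("overwrite").partitionBy("orderDate").parquet(curated_path)
```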
Design for Data Security and Compliance (25-30%)

Design security for source data access
- plan for secure endpoints (private/public)
- choose the appropriate authentication mechanism, such as access keys, shared access signatures (SAS), and Azure Active Directory (Azure AD)

Design security for data policies and standards
- design data encryption for data at rest and in transit
- design for data auditing and data masking
- design for data privacy and data classification
- design a data retention policy
- plan an archiving strategy
- plan to purge data based on business requirements

(A short shared access signature example follows this section.)
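To make the "shared access signatures (SAS)" option above concrete, here is a minimal sketch using the azure-storage-blob SDK that issues a read-only, one-hour SAS for a single blob. The account name, account key, container, and blob path are placeholders; production code would pull the key (or use an Azure AD-backed user delegation key) from a secure store such as Azure Key Vault.

```python
# Sketch of issuing a short-lived, read-only SAS for one blob. The names and
# key below are placeholders; do not embed real account keys in code.
from datetime import datetime, timedelta

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

ACCOUNT_NAME = "<storageaccount>"
ACCOUNT_KEY = "<account-key>"

sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="exports",
    blob_name="reports/2021/summary.csv",
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)

# The caller appends the token to the blob URL; access is read-only and
# expires automatically after one hour.
blob_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/exports/"
    f"reports/2021/summary.csv?{sas_token}"
)
print(blob_url)
```

Compared with handing out account keys, a SAS limits both the scope (a single blob or container, specific permissions) and the lifetime of the access it grants.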
