As a candidate for this exam, you should have subject matter expertise in designing, creating, and deploying enterprise-scale data analytics solutions.
Your responsibilities for this role include performing advanced data analytics at scale, such as:
- Cleaning and transforming data.
- Designing and building enterprise data models.
- Incorporating advanced analytics capabilities.
- Integrating with IT infrastructure.
- Applying development lifecycle practices.
As a professional in this role, you:
- Help collect enterprise-level requirements for data analytics solutions that include Azure and Power BI.
- Advise on data governance and configuration settings for Power BI administration.
- Monitor data usage.
- Optimize performance of the data analytics solutions.
As an Azure enterprise data analyst, you collaborate with other roles, such as:
- Solution architects
- Data engineers
- Data scientists
- AI engineers
- Database administrators
- Power BI data analysts
As a candidate for this exam, you should have advanced Power BI skills, including managing data repositories and data processing in the cloud and on-premises, along with using Power Query and Data Analysis Expressions (DAX). You should also be proficient in consuming data from Azure Synapse Analytics and should have experience querying relational databases, analyzing data by using Transact-SQL (T-SQL), and visualizing data.
Microsoft Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI Exam Summary:
Exam Name | Microsoft Certified - Azure Enterprise Data Analyst Associate
Exam Code | DP-500
Exam Price | $165 (USD)
Exam Duration | 150 mins
Number of Questions | 40-60
Passing Score | 700 / 1000
Books / Training | Course DP-500T00: Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI
Sample Questions | Microsoft Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI Sample Questions
Practice Exam | Microsoft DP-500 Certification Practice Exam
Microsoft DP-500 Exam Syllabus Topics:
Implement and manage a data analytics environment (25-30%)
Govern and administer a data analytics environment:
- Manage Power BI assets by using Microsoft Purview
- Identify data sources in Azure by using Microsoft Purview
- Recommend settings in the Power BI admin portal
- Recommend a monitoring and auditing solution for a data analytics environment, including Power BI REST API and PowerShell cmdlets
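For the monitoring and auditing objective above, the audit trail can be pulled programmatically. Below is a minimal Python sketch that reads one day of activity events from the Power BI admin REST API; the same data is exposed to PowerShell through the Get-PowerBIActivityEvent cmdlet. It assumes you already hold an Azure AD access token with Power BI admin permissions; the token and dates are placeholders.

```python
# Minimal sketch: pull one day of Power BI audit events via the admin REST API.
# ACCESS_TOKEN is a placeholder for an Azure AD token (e.g. acquired with MSAL);
# token acquisition is omitted here.
import requests

ACCESS_TOKEN = "<azure-ad-token>"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# The API expects quoted UTC datetimes that fall within a single day.
url = (
    "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
    "?startDateTime='2024-03-01T00:00:00Z'"
    "&endDateTime='2024-03-01T23:59:59Z'"
)

events = []
while url:
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()
    body = resp.json()
    events.extend(body.get("activityEventEntities", []))
    url = body.get("continuationUri")  # None once the last page has been returned

print(f"{len(events)} activity events retrieved")
```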
Integrate an analytics platform into an existing IT infrastructure:
- Identify requirements for a solution, including features, performance, and licensing strategy
- Configure and manage Power BI capacity
- Recommend and configure an on-premises gateway type for Power BI
- Recommend and configure a Power BI tenant or workspace to integrate with Azure Data Lake Storage Gen2
- Integrate an existing Power BI workspace into Azure Synapse Analytics
Manage the analytics development lifecycle:
- Commit Azure Synapse Analytics code and artifacts to a source control repository
- Recommend a deployment strategy for Power BI assets
- Recommend a source control strategy for Power BI assets
- Implement and manage deployment pipelines in Power BI
- Perform impact analysis of downstream dependencies from dataflows and datasets
- Recommend automation solutions for the analytics development lifecycle, including Power BI REST API and PowerShell cmdlets
- Deploy and manage datasets by using the XMLA endpoint
- Create reusable assets, including Power BI template (.pbit) files, Power BI data source (.pbids) files, and shared datasets
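To illustrate the automation objective, here is a minimal Python sketch that triggers a "Deploy All" operation on a Power BI deployment pipeline through the REST API. The pipeline ID and token are placeholders, and the request body fields follow the public Deploy All reference as described in Microsoft's documentation; verify them against the current API docs before relying on them.

```python
# Minimal sketch: promote all content from one deployment pipeline stage to the
# next via the Power BI REST API ("Deploy All"). Placeholders throughout.
import requests

ACCESS_TOKEN = "<azure-ad-token>"
pipeline_id = "<deployment-pipeline-id>"

url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/deployAll"
payload = {
    "sourceStageOrder": 0,                # 0 = Development, 1 = Test (per the API reference)
    "options": {
        "allowCreateArtifact": True,      # create items missing in the target stage
        "allowOverwriteArtifact": True,   # overwrite items that already exist there
    },
}

resp = requests.post(
    url, json=payload, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}
)
resp.raise_for_status()
print("Deployment requested:", resp.status_code)  # the deployment itself runs asynchronously
```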
Query and transform data (20-25%)
Query data by using Azure Synapse Analytics:
- Identify an appropriate Azure Synapse pool when analyzing data
- Recommend appropriate file types for querying serverless SQL pools
- Query relational data sources in dedicated or serverless SQL pools, including querying partitioned data sources
- Use a machine learning PREDICT function in a query
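As a quick illustration of querying a serverless SQL pool, the Python sketch below runs a T-SQL OPENROWSET query over Parquet files in a data lake through pyodbc. The workspace name, storage path, column selection, and authentication settings are illustrative assumptions, not prescribed by the exam.

```python
# Minimal sketch: query Parquet files through a Synapse serverless SQL pool.
# Server name, storage path, and credentials below are assumed placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myworkspace-ondemand.sql.azuresynapse.net;"  # serverless (on-demand) endpoint
    "Database=master;"
    "Authentication=ActiveDirectoryInteractive;"         # requires ODBC Driver 17.4 or later
    "UID=user@contoso.com;"
)

query = """
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://mydatalake.dfs.core.windows.net/raw/sales/year=2024/*.parquet',
    FORMAT = 'PARQUET'
) AS sales;
"""

for row in conn.cursor().execute(query):
    print(row)
```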
Ingest and transform data by using Power BI:
- Identify data loading performance bottlenecks in Power Query or data sources
- Implement performance improvements in Power Query and data sources
- Create and manage scalable Power BI dataflows
- Identify and manage privacy settings on data sources
- Create queries, functions, and parameters by using the Power Query Advanced Editor
- Query advanced data sources, including JSON, Parquet, APIs, and Azure Machine Learning models
Implement and manage data models (25-30%)
Design and build tabular models:
- Choose when to use DirectQuery for Power BI datasets
- Choose when to use external tools, including DAX Studio and Tabular Editor 2
- Create calculation groups
- Write calculations that use DAX variables and functions, for example handling blanks or errors, creating virtual relationships, and working with iterators
- Design and build a large format dataset
- Design and build composite models, including aggregations
- Design and implement enterprise-scale row-level security and object-level security
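One way to experiment with DAX variables and blank-safe calculations against a published dataset is the executeQueries REST endpoint, as in the Python sketch below. The dataset ID and the Sales/Date table and column names are illustrative assumptions; only the endpoint and the use of VAR and DIVIDE follow documented behavior.

```python
# Minimal sketch: run a DAX query (variables plus blank-safe DIVIDE) against a
# published dataset via the executeQueries REST endpoint. Names are placeholders.
import requests

ACCESS_TOKEN = "<azure-ad-token>"
dataset_id = "<dataset-id>"

dax = """
DEFINE
    MEASURE Sales[Margin %] =
        VAR TotalSales = SUM ( Sales[SalesAmount] )
        VAR TotalCost  = SUM ( Sales[TotalCost] )
        RETURN DIVIDE ( TotalSales - TotalCost, TotalSales )  -- BLANK instead of an error on /0
EVALUATE
    SUMMARIZECOLUMNS ( 'Date'[Year], "Margin %", [Margin %] )
"""

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/executeQueries",
    json={"queries": [{"query": dax}], "serializerSettings": {"includeNulls": True}},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
print(resp.json()["results"][0]["tables"][0]["rows"][:5])
```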
Optimize enterprise-scale data models:
- Identify and implement performance improvements in queries and report visuals
- Troubleshoot DAX performance by using DAX Studio
- Optimize a data model by using Tabular Editor 2
- Analyze data model efficiency by using VertiPaq Analyzer
- Optimize query performance by using DAX Studio
- Implement incremental refresh (including the use of query folding)
- Optimize a data model by using denormalization
Explore and visualize data (20-25%)
Explore data by using Azure Synapse Analytics:
- Explore data by using native visuals in Spark notebooks
- Explore and visualize data by using the Azure Synapse SQL results pane
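A minimal example of notebook-based exploration: in a Synapse Spark notebook, where the spark session is predefined, you can read a lake folder and call display() to get the built-in table and chart views. The storage path and column names below are assumptions.

```python
# Minimal sketch of exploration in a Synapse Spark notebook (PySpark).
# `spark` is provided by the notebook runtime; the path and columns are illustrative.
df = spark.read.parquet(
    "abfss://raw@mydatalake.dfs.core.windows.net/sales/year=2024/"
)

df.printSchema()
display(df.groupBy("Region").sum("SalesAmount"))  # switch to the Chart view for a native visual
```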
Visualize data by using Power BI:
- Create and import a custom report theme
- Create R or Python visuals in Power BI
- Connect to and query datasets by using the XMLA endpoint
- Create and distribute paginated reports in Power BI Report Builder
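For the R/Python visuals objective, the sketch below shows what a Python visual script in Power BI Desktop can look like. Power BI supplies the fields added to the visual as a pandas DataFrame named dataset and renders the matplotlib figure; the Region and SalesAmount columns used here are illustrative.

```python
# Minimal sketch of a Python visual script in Power BI Desktop.
# `dataset` is the pandas DataFrame Power BI builds from the visual's fields.
import matplotlib.pyplot as plt

summary = dataset.groupby("Region", as_index=False)["SalesAmount"].sum()

plt.figure(figsize=(8, 4))
plt.bar(summary["Region"], summary["SalesAmount"])
plt.title("Sales by Region")
plt.tight_layout()
plt.show()
```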