
Thursday, 30 November 2023

Democratizing FinOps: Transform your practice with FOCUS and Microsoft Fabric

Cloud computing has revolutionized the way you build, deploy, and scale applications and services. While you have unprecedented flexibility, agility, and scalability, you also face greater challenges in managing cost, security, and compliance. While IT security and compliance are often managed by central teams, cost is a shared responsibility across executive, finance, product, and engineering teams, which is what makes managing cloud cost such a challenge. Having the right tools to enable cross-group collaboration and make data-driven decisions is critical.

Fortunately, you have everything you need in the Microsoft Cloud to implement a streamlined FinOps practice that brings people together and connects them to the data they need to make business decisions. And with new developments like Copilot in Microsoft Cost Management and Microsoft Fabric, there couldn’t be a better time to take a fresh look at how you manage cost within your organization and how you can leverage the FinOps Framework and the FinOps Open Cost and Usage Specification (FOCUS) to accelerate your FinOps efforts.

There’s a lot to cover in this space, so I’ll split it across a series of blog posts. In this first post, I’ll introduce the core elements of Cost Management and Fabric that you’ll need to lay the foundation for the rest of the series, including how to export data, how FOCUS can help, and a few quick options anyone can use to set up reports and alerts in Fabric with just a few clicks.

No-code extensibility with Cost Management exports


As your FinOps team grows to cover new services, endpoints, and datasets, you may find they spend more time integrating disparate APIs and schemas than driving business goals. This complexity also keeps simple reports and alerts just out of reach of executive, finance, and product teams. And when your stakeholders can’t get the answers they need, they push more work onto engineering teams to fill those gaps, which, again, takes away from driving business goals.

We envision a future where FinOps teams can empower all stakeholders to stay informed and get the answers they need through turn-key integration and AI-assisted tooling on top of structured guidance and open specifications. And this all starts with Cost Management exports—a no-code extensibility feature that brings data to you.

As of today, you can sign up for a limited preview of Cost Management exports, where you can export five new datasets directly into your storage account without a single line of code. In addition to the actual and amortized cost and usage details you get today, you’ll also see:

◉ Cost and usage details aligned to FOCUS
◉ Price sheets
◉ Reservation details
◉ Reservation recommendations
◉ Reservation transactions

Of note, the FOCUS dataset includes both actual and amortized costs in a single dataset, which can drive additional efficiencies in your data ingestion process. You’ll benefit from reduced data processing times and more timely reporting on top of reduced storage and compute costs due to fewer rows and less duplication of data.
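
Because a FOCUS-aligned dataset carries both the billed (actual) and effective (amortized) cost for each charge, a single pass over the export yields both views. Here’s a minimal sketch of that idea using a few hand-made rows in place of a real export; the column names follow FOCUS, but the values are invented:

```python
# Illustrative only: a few hand-made rows shaped like FOCUS cost records.
# In a real export these would come from the files written to your storage account.
focus_rows = [
    {"ServiceCategory": "Compute", "BilledCost": 120.0, "EffectiveCost": 95.0},
    {"ServiceCategory": "Storage", "BilledCost": 30.0,  "EffectiveCost": 30.0},
    {"ServiceCategory": "Compute", "BilledCost": 0.0,   "EffectiveCost": 25.0},  # usage covered by a reservation
]

# One dataset, two views of cost: actual (billed) and amortized (effective).
total_billed = sum(r["BilledCost"] for r in focus_rows)
total_effective = sum(r["EffectiveCost"] for r in focus_rows)

print(total_billed)     # 150.0
print(total_effective)  # 150.0
```

Note how amortization reallocates the up-front reservation charge across the usage it covers without changing the overall total, which is why one dataset can serve both reporting needs.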

Beyond the new datasets, you’ll also discover optimizations that deliver large datasets more efficiently, reduced storage costs from updating rather than creating new files each day, and more. All exports are scheduled at the same time to ensure scheduled refreshes of your reports stay in sync with the latest data. Coupled with file partitioning, which is already available and recommended today, and data compression, which you’ll see in the coming months, the exports preview removes the need to write complex code to extract, transfer, and load large datasets reliably via APIs. This better enables all FinOps stakeholders to build custom reports and get the answers they need without having to learn a single API or write a single line of code.

FOCUS democratizes cloud cost analytics


In case you’re not familiar, FOCUS is a groundbreaking initiative to establish a common provider and service-agnostic format for billing data that empowers organizations to better understand cost and usage patterns and optimize spending and performance across multiple cloud, software as a service (SaaS), and even on-premises service offerings. FOCUS provides a consistent, clear, and accessible view of cost data, explicitly designed for FinOps needs. As the new “language” of FinOps, FOCUS enables practitioners to collaborate more efficiently and effectively with peers throughout the organization and even maximize transferability and onboarding for new team members, getting people up and running quicker.

FOCUS 0.5 was originally announced in June 2023, and we’re excited to be leading the industry with our announcement of native support for the FOCUS 1.0 preview as part of Cost Management exports on November 13, 2023. We believe FOCUS is an important step forward for our industry, and we look forward to our industry partners joining us and collaboratively evolving the specification alongside FinOps practitioners from our collective customers and partners.

The FOCUS 1.0 preview adds new columns for pricing, discounts, resources, and usage, along with prescribed behaviors around how discounts are applied. Soon, you’ll also have a powerful new use case library, which offers a rich set of problems and prebuilt queries to help you get the answers you need without the guesswork. Armed with FOCUS and the FinOps Framework, you have a veritable playbook for understanding and extracting answers from your data, enabling you to empower FinOps stakeholders, regardless of their knowledge or experience, to get the answers they need to maximize business value with the Microsoft Cloud.


Microsoft Fabric and Copilot enable self-service analytics


So far, I’ve talked about how you can leverage Cost Management exports as a turn-key solution to extract critical details about your costs, prices, and reservations using FOCUS as a consistent, open billing data format with its use case library that is a veritable treasure map for finding answers to your FinOps questions. While these are all amazing tools that will accelerate your FinOps efforts, the true power of democratizing FinOps lies at the intersection of Cost Management and FOCUS with a platform that enables you to provide your stakeholders with self-serve analytics and alerts. And this is exactly what Microsoft Fabric brings to the table.

Microsoft Fabric is an all-in-one analytics solution that encompasses data ingestion, normalization, cleansing, analysis, reporting, alerting, and more. I could write a separate blog post about how to implement each FinOps capability in Microsoft Fabric, but to get you acclimated, let me introduce the basics.

Your first step to leveraging Microsoft Fabric starts in Cost Management, which has done much of the work for you by exporting details about your prices, reservations, and cost and usage data aligned to FOCUS.

Once exported, you’ll ingest your data into a Fabric lakehouse, SQL, or KQL database table and create a semantic model to bring data together for any reports and alerts you’ll want to create. The database option you use will depend on how much data you have and your reporting needs. Below is an example using a KQL database, which uses Azure Data Explorer under the covers, to take advantage of the performance and scale benefits as well as the powerful query language.

Fabric offers several ways to quickly explore data from a semantic model. You can explore data by simply selecting the columns you want to see, but I recommend trying the auto-create report option, which takes that one step further by generating a quick summary based on the columns you select. As an example, here’s an auto-generated summary of the FOCUS EffectiveCost broken down by ChargePeriodStart, ServiceCategory, SubAccountName, Region, PricingCategory, and CommitmentDiscountType. You can apply quick tweaks to any visual or switch to the full edit experience to take it even further.
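
To make the breakdown concrete, here’s a rough sketch of the same kind of aggregation over a few hand-made rows shaped like the FOCUS columns named above (the values are illustrative, not real export data):

```python
from collections import defaultdict

# Invented rows using FOCUS column names from the summary described above.
rows = [
    {"ChargePeriodStart": "2023-11-01", "ServiceCategory": "Compute", "EffectiveCost": 40.0},
    {"ChargePeriodStart": "2023-11-01", "ServiceCategory": "Storage", "EffectiveCost": 12.5},
    {"ChargePeriodStart": "2023-11-02", "ServiceCategory": "Compute", "EffectiveCost": 38.0},
]

# The same breakdown the auto-created report produces:
# EffectiveCost summed by ServiceCategory.
by_category = defaultdict(float)
for r in rows:
    by_category[r["ServiceCategory"]] += r["EffectiveCost"]

print(dict(by_category))  # {'Compute': 78.0, 'Storage': 12.5}
```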

Those with a keen eye may notice the Copilot button at the top right. If we switch to edit mode, we can take full advantage of Copilot and even ask it to create the same summary:

Copilot starts to get a little fancier with the visuals and offers summarized numbers and a helpful filter. I can also go further with more specific questions about commitment-based discounts:

Of course, this is barely scratching the surface. With a richer semantic model including relationships and additional details, Copilot can go even further and save you time by giving you the answers you need and building reports with less time and hassle.

In addition to having unparalleled flexibility in reporting on the data in the way you want, you can also create fine-grained alerts in a more flexible way than ever before with very little effort. Simply select the visual you want to measure and specify when and how you want to be alerted:

This gets even more powerful when you add custom visuals, measures, and materialized views that offer deeper insights.

This is just a glimpse of what you can do with Cost Management and Microsoft Fabric together. I haven’t even touched on the data flows, machine learning capabilities, and the potential of ingesting data from multiple cloud providers or SaaS vendors also using FOCUS to give you a full, single pane of glass for your FinOps efforts. You can imagine the possibilities of how Copilot and Fabric can impact every FinOps capability, especially when paired with rich collaboration and automation tools like Microsoft Teams, Power Automate, and Power Apps that can help every stakeholder accomplish more together.

Source: microsoft.com

Wednesday, 6 September 2023

The Future of Web Development: Azure App Service Unleashed

In the ever-evolving landscape of web development, staying ahead of the curve is essential. Technology is advancing at breakneck speed, and one of the most influential players in this arena is Azure App Service. In this article, we delve into the future of web development, with a specific focus on how Azure App Service is poised to revolutionize the way we build and deploy web applications.

Introduction


Web development has come a long way since its inception. What once started as simple HTML pages has now evolved into complex, interactive web applications that cater to a global audience. With the advent of cloud computing and platforms like Azure App Service, the possibilities are virtually limitless.

The Power of Azure App Service


Azure App Service is Microsoft's platform-as-a-service (PaaS) offering for building, deploying, and scaling web applications. It provides a comprehensive set of tools and services that make web development faster, more efficient, and cost-effective. Let's delve into some of the key features that make Azure App Service a game-changer.


1. Seamless Integration with Azure Ecosystem

One of the standout features of Azure App Service is its seamless integration with the broader Azure ecosystem. Developers can leverage Azure's vast array of services, including Azure SQL Database, Azure Functions, and Azure Cosmos DB, to build powerful and scalable web applications. This integration streamlines development, reduces complexity, and enables developers to focus on crafting exceptional user experiences.

2. Automatic Scaling

In the dynamic world of the internet, traffic spikes are inevitable. Azure App Service handles this with ease through its automatic scaling capabilities. Whether your application experiences a sudden surge in users or a gradual increase in traffic, Azure App Service can adapt in real-time, ensuring that your web app remains responsive and available.

3. DevOps Integration

Azure App Service is tailor-made for DevOps practices. It seamlessly integrates with popular DevOps tools like Azure DevOps, Jenkins, and GitHub Actions, enabling developers to automate the deployment pipeline. This results in faster release cycles, reduced errors, and improved collaboration among development and operations teams.

4. Security and Compliance

Security is paramount in web development, and Azure App Service doesn't disappoint. It offers robust security features, including identity and access management, DDoS protection, and compliance certifications like ISO 27001 and SOC 2. With Azure App Service, you can build applications that meet the most stringent security and compliance requirements.

5. Multi-language Support

Whether you're a fan of .NET, Node.js, Python, or any other programming language, Azure App Service caters to your preferences. It provides built-in support for multiple programming languages, ensuring that developers have the flexibility to choose the tools they're most comfortable with.

The Future of Web Development with Azure App Service


So, what does the future hold for web development with Azure App Service? The possibilities are both exciting and transformative.

1. Serverless Computing Integration

Azure App Service is set to further embrace serverless computing. This means developers can build applications without worrying about managing servers. With Azure Functions seamlessly integrated, developers can focus solely on writing code to deliver value to end-users.

2. AI and Machine Learning Integration

The future of web applications lies in AI and machine learning. Azure App Service is at the forefront of this transformation, offering pre-built AI models and integrations with Azure Machine Learning. This empowers developers to infuse intelligence into their applications, creating personalized and predictive user experiences.

3. Enhanced Performance Monitoring and Analytics

Performance is a critical factor in web development. Azure App Service is continuously evolving its performance monitoring and analytics tools. Expect real-time insights, advanced debugging capabilities, and optimization suggestions to help you deliver lightning-fast web applications.

4. Global Reach with Edge Computing

As web applications become increasingly global, Azure App Service is expanding its reach with edge computing capabilities. This ensures that your application is closer to your users, resulting in lower latency and a superior user experience, no matter where your audience is located.

Conclusion

The future of web development is undeniably intertwined with Azure App Service. With its powerful features, seamless integration, and a commitment to staying at the cutting edge of technology, Azure App Service is set to revolutionize the way we build, deploy, and manage web applications. As the digital landscape continues to evolve, embracing Azure App Service is not just a choice; it's a strategic imperative.

In conclusion, if you're looking to stay ahead in the competitive world of web development, Azure App Service is your ticket to success. With its extensive capabilities, it empowers developers to create innovative and performant web applications that will undoubtedly leave a mark in the digital realm.

Saturday, 3 August 2019

Power BI and Azure Data Services dismantle data silos and unlock insights

Learn how to connect Power BI and Azure Data Services to share data and unlock new insights with a new tutorial. Business analysts who use Power BI dataflows can now share data with data engineers and data scientists, who can leverage the power of Azure Data Services, including Azure Databricks, Azure Machine Learning, Azure SQL Data Warehouse, and Azure Data Factory for advanced analytics and AI.

With the recently announced preview of Power BI dataflows, Power BI has enabled self-service data prep for business analysts. Power BI dataflows can ingest data from a large array of transactional and observational data sources, and cleanse, transform, enrich, schematize, and store the result. Dataflows are reusable and can be refreshed automatically and daisy-chained to create powerful data preparation pipelines. Power BI is now making available support for storing dataflows in Azure Data Lake Storage (ADLS) Gen2, including both the data and dataflow definition. By storing dataflows in Azure Data Lake Storage Gen2, business analysts using Power BI can now collaborate with data engineers and data scientists using Azure Data Services.

Data silos inhibit data sharing


The ability for organizations to extract intelligence from business data provides a key competitive advantage; however, attempting this today can be time-consuming and costly. To extract intelligence and create value from data, an application must be able to access the data and understand its structure and meaning. Data often resides in silos that are application- or platform-specific, creating a major data integration and data preparation challenge.

Consistent data and metadata formats enable collaboration


By adopting a consistent way to store and describe data based on the Common Data Model (CDM), Power BI, Azure Data Services and other applications can share and interoperate over data more effectively. Power BI dataflows are stored in ADLS Gen2 as CDM folders. A CDM folder contains a metadata file that describes the entities in the folder, with their attributes and datatypes, and lists the data files for each entity. CDM also defines a set of standard business entities that define additional rich semantics. Mapping the data in a CDM folder to standard CDM entities further facilitates interoperability and data sharing. Microsoft has joined with SAP and Adobe to form an Open Data Initiative to encourage the definition and adoption of standard entities across a range of domains to make it easier for applications and tools to share data through an enterprise Data Lake.
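
As a rough illustration of what that metadata file contains, the sketch below reads a trimmed, hypothetical model.json-style document and lists each entity with its attributes and data files. The real metadata produced by Power BI dataflows is considerably richer; the entity and file names here are invented:

```python
import json

# A trimmed, hypothetical CDM folder metadata document (illustrative only).
model_json = """
{
  "name": "SalesDataflow",
  "entities": [
    {
      "name": "Customer",
      "attributes": [
        {"name": "CustomerId", "dataType": "int64"},
        {"name": "Name", "dataType": "string"}
      ],
      "partitions": [{"location": "Customer/part-00000.csv"}]
    }
  ]
}
"""

model = json.loads(model_json)
for entity in model["entities"]:
    cols = ", ".join(a["name"] for a in entity["attributes"])
    files = [p["location"] for p in entity["partitions"]]
    print(f"{entity['name']}: columns [{cols}], data files {files}")
```

This is exactly why a consuming service doesn’t need out-of-band knowledge of the data: the folder describes its own entities, attributes, datatypes, and file locations.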

By adopting these data storage conventions, data ingested by Power BI, with its already powerful and easy to use data prep features, can now be further enriched and leveraged in Azure. Similarly, data in Azure can be exported into CDM folders and shared with Power BI.


Azure Data Services enable advanced analytics on shared data


Azure Data Services enable advanced analytics that let you maximize the business value of data stored in CDM folders in the data lake. Data engineers and data scientists can use Azure Databricks and Azure Data Factory dataflows to cleanse and reshape data, ensuring it is accurate and complete. Data from different sources and in different formats can be normalized, reformatted, and merged to optimize the data for analytics processing. Data scientists can use Azure Machine Learning to define and train machine learning models on the data, enabling predictions and recommendations that can be incorporated into BI dashboards and reports, and used in production applications. Data engineers can use Azure Data Factory to combine data from CDM folders with data from across the enterprise to create an historically accurate, curated enterprise-wide view of data in Azure SQL Data Warehouse. At any point, data processed by any Azure Data Service can be written back to new CDM folders, to make the insights created in Azure accessible to Power BI and other CDM-enabled apps or tools.

New tutorial explores data sharing between Power BI and Azure


A tutorial is now available to help you understand how sharing data between Power BI and Azure using CDM folders can break down data silos and unlock new insights. The tutorial with sample code shows how to integrate data from Power BI into a modern data warehousing scenario in Azure. The tutorial allows you to explore the flows highlighted in green in the diagram above.  

In the tutorial, Power BI dataflows are used to ingest key analytics data from the Wide World Importers operational database and store the extracted data with its schema in a CDM folder in ADLS Gen2. You then connect to the CDM folder and process the data using Azure Databricks, formatting and preparing it for later steps, then writing it back to the lake in a new CDM folder. This prepared CDM folder is used by Azure Machine Learning to train and publish an ML model that can be accessed from Power BI or other applications to make real-time predictions. The prepared data is also loaded into staging tables in an Azure SQL Data Warehouse, where it is transformed into a dimensional model. 

Azure Data Factory is used to orchestrate the flow of data between the services, as well as to manage and monitor the processing at runtime. By working through the tutorial, you’ll see first-hand how the metadata stored in a CDM folder makes it easier for each service to understand and share data.

Sample code accelerates your data integration projects


The tutorial includes sample code and instructions for the whole scenario. The samples include reusable libraries and code in C#, Python, and Scala, as well as reusable Azure Data Factory pipeline templates, that you can use to integrate CDM folders into your own Azure Data Services projects. 

Tuesday, 7 May 2019

Analytics in Azure remains unmatched with new innovations

Digital disruption has created unlimited potential for companies to embrace data as a competitive advantage for their business. As a result, analytics continues to be a key priority for enterprises.

When it comes to analytics, customers tell us that they need a solution that provides them with the best price, performance, security, and privacy, as well as a system that can easily deliver powerful insights across the organization. Azure has them covered. Analytics in Azure provides unparalleled performance at incredible value, in not one but two independent benchmark reports. Additionally, as illustrated by Donald Farmer, a well-respected thought leader in the analytics space, analytics in Azure has the most advanced security and privacy features in the market. And without limitations on concurrency, employees can get their insights in seconds from all their enterprise data, through beautiful Power BI visualizations. Analytics in Azure makes insights accessible to everyone in your organization.

Innovations in analytics


Azure Data Factory is a hybrid data integration service for developing scalable extract/transform/load and extract/load/transform (ETL/ELT) data pipelines. With Data Factory, cloud-scale data integration can now be done through an intuitive visual environment and with no code. Data Factory makes ETL/ELT more efficient for developers, and more accessible for data scientists and data analysts, bringing data sources and data types together for simpler and faster insights.

Azure Data Factory now features visual data transformation with Mapping Data Flows, available in preview. With a few clicks, users can focus on building their data models rather than writing and maintaining code. Mapping Data Flows is powered by Azure Databricks and provides the same processing power and scale as the code-based approach directly in Azure Databricks. Azure Data Factory supports Python, .NET, and ARM for those developers that prefer to code.
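
Whether authored visually or through the SDKs, a Data Factory pipeline is ultimately stored as a JSON definition. The sketch below builds a minimal, hypothetical copy-activity pipeline as a Python dict; the pipeline and dataset names are placeholders, and a real definition carries more properties:

```python
import json

# A hypothetical Azure Data Factory pipeline definition with one copy activity.
# Names are placeholders; real definitions are authored in the visual environment
# or via the ADF SDKs and stored as JSON shaped roughly like this.
pipeline = {
    "name": "CopySalesData",
    "properties": {
        "activities": [
            {
                "name": "CopyBlobToWarehouse",
                "type": "Copy",
                "inputs": [{"referenceName": "BlobSalesDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "WarehouseSalesDataset", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "SqlDWSink"},
                },
            }
        ]
    },
}

# The JSON serialization is what the service actually stores and executes.
print(json.dumps(pipeline, indent=2)[:60] + "...")
```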

It’s exciting to see partners, like Pragmatic Works, already benefiting from this new innovation.

Our goal is to empower customers who have long-standing investments in SSIS to leverage the cloud. Data Factory makes it easy to move existing SSIS ETL to the cloud without any redevelopment, saving substantial time and money. Only Azure offers this truly hybrid capability.

A new capability called Wrangling Data Flows, available in preview, gives users the ability to explore and wrangle data at scale. Wrangling Data Flows empowers users to visually discover and explore their data without writing a single line of code.

We are also excited to announce today support for semi-structured data in Azure SQL Data Warehouse, available in preview. With one service, both structured and semi-structured data formats (like JSON) can now be analyzed directly from the data warehouse for faster insights.

We remain focused on increasing SQL Data Warehouse’s industry-leading speed and performance and are announcing two new features available in preview that further enable this vision.

Result-set Caching makes the results of previously run queries immediately available to subsequent users, reducing any lags in reports. Materialized Views allow SQL Data Warehouse to automatically rewrite user queries to leverage deployed query results, leading to significantly improved query performance.
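
Conceptually, result-set caching behaves like memoization keyed on the query text: the first execution pays the compute cost, and identical subsequent queries are served from the stored result. This toy sketch illustrates the idea only, not SQL Data Warehouse’s actual implementation:

```python
# Toy model of result-set caching: identical query text returns the cached
# result instead of re-running the expensive query.
cache = {}
executions = 0

def run_query(sql):
    global executions
    if sql in cache:             # cache hit: no recompute
        return cache[sql]
    executions += 1              # cache miss: do the expensive work once
    result = f"rows for: {sql}"  # stand-in for real query execution
    cache[sql] = result
    return result

run_query("SELECT region, SUM(sales) FROM facts GROUP BY region")
run_query("SELECT region, SUM(sales) FROM facts GROUP BY region")
print(executions)  # 1 -- the second call was served from cache
```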


And for organizations with complex requirements and multiple users accessing the same data warehouse, we are pleased to announce the general availability of Workload Importance. This feature will give users the capability to decide which workloads need to be executed first to ensure their most important queries are properly prioritized.

Tuesday, 22 January 2019

Export data in near real-time from Azure IoT Central

We are happy to share that you can now export data to Azure Event Hubs and Azure Service Bus in near real-time from your Azure IoT Central app! Previously, Continuous Data Export enabled exporting your IoT Central measurements, devices, and device templates data to your Azure Blob Storage account once every minute for cold path storage and analytics. Now you can export this data in near real-time to your Azure Event Hubs and Azure Service Bus instances for analytics and monitoring.

For example, an energy company wants to understand and predict trends in energy consumption in different areas by time of day and throughout the week. With electrical equipment connected to IoT Central, they can use Continuous Data Export to export their IoT data to Azure Event Hubs. By connecting their Event Hubs to Azure Databricks, they can run their deployed machine learning models to gain insight into consumption and perform anomaly detection. They can run highly custom rules for detecting specific outages by sending data from Event Hubs to Azure Stream Analytics. For long-term data storage, they can continue to use Continuous Data Export to store all of their device data in Azure Blob Storage.
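
As a toy stand-in for the anomaly detection such a pipeline might run, the sketch below flags an outage-like dip in hourly consumption with a simple three-sigma rule. The readings are invented, and real deployments would use the trained models mentioned above rather than this rule of thumb:

```python
import statistics

# Hypothetical hourly consumption readings (kWh) from one device;
# the last value is a deliberate outage-like dip.
readings = [5.1, 5.3, 4.9, 5.0, 5.2, 5.1, 0.4]

mean = statistics.mean(readings[:-1])
stdev = statistics.stdev(readings[:-1])

# Flag any reading more than 3 standard deviations from the recent mean.
anomalies = [x for x in readings if abs(x - mean) > 3 * stdev]
print(anomalies)  # [0.4]
```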

Continuous Data Export in Azure IoT Central

New capabilities


These are the new features and changes to Continuous Data Export in Azure IoT Central:

◈ New export destinations include Azure Event Hubs and Azure Service Bus, in addition to Azure Blob Storage.
◈ Export to all supported destinations using a valid connection string, including destinations that are in a different subscription than your IoT Central app.
◈ Create up to 5 exports per app.
◈ Export is available in both Trial apps and Pay-As-You-Go apps.
◈ Continuous Data Export has moved! Find it in the left navigation menu.
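
Once exported messages land in Event Hubs or Service Bus, downstream consumers work with them as JSON. Here’s a minimal sketch of parsing one such message; the field names and schema are illustrative, not the documented export format:

```python
import json

# A hypothetical measurement message of the kind Continuous Data Export
# might deliver to an Event Hub (schema and values are invented).
message_body = b'{"deviceId": "meter-042", "telemetry": {"energyKwh": 5.2}, "timestamp": "2019-01-22T10:15:00Z"}'

event = json.loads(message_body)
device = event["deviceId"]
kwh = event["telemetry"]["energyKwh"]
print(f"{device} reported {kwh} kWh at {event['timestamp']}")
```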

Saturday, 25 August 2018

How to enhance HDInsight security with service endpoints

HDInsight enterprise customers work with some of the most sensitive data in the world, and they want to be able to lock down access to this data at the networking layer as well. However, while service endpoints have been available for Azure data sources, HDInsight customers couldn’t leverage this additional layer of security for their big data pipelines due to the lack of interoperability between HDInsight and other data stores. As we recently announced, HDInsight now supports service endpoints for Azure Blob Storage, Azure SQL Database, and Azure Cosmos DB.

With this enhanced level of security at the networking layer, customers can now lock down their big data storage accounts to their specified Virtual Networks (VNETs) and still use HDInsight clusters seamlessly to access and process that data.

In the rest of this post, we will explore how to enable service endpoints and point out important HDInsight configurations for Azure Blob Storage, Azure SQL DB, and Azure Cosmos DB.

Azure Blob Storage


When using Azure Blob Storage with HDInsight, you can configure selected VNETs in the blob storage firewall settings. This will ensure that only traffic from those subnets can access the storage account.

It is important to check "Allow trusted Microsoft services to access this storage account." This ensures that the HDInsight service has access to the storage account and can provision the cluster seamlessly.

If the storage account is in a different subscription than the HDInsight cluster, please make sure that the HDInsight resource provider is registered with the storage subscription. If it is not registered properly, you might get an error message, which can be resolved by registering the resource provider.

NOTE: HDInsight cluster must be deployed into one of the subnets allowed in the blob storage firewall. This will ensure that the traffic from cluster VMs can reach the storage.

Azure SQL DB


If you are using an external SQL DB for your Hive or Oozie metastore, you can configure service endpoints there as well. “Allow access to Azure services” is not a required step from HDInsight’s point of view, since access to these databases happens after the cluster is created and the VMs are injected into the VNET.

NOTE: HDInsight cluster must be deployed into one of the subnets allowed in the SQL DB firewall. This will ensure that the traffic from cluster VMs can reach the SQL DB.

Azure Cosmos DB


If you are using the Spark connector for Azure Cosmos DB, you can enable service endpoints in the Cosmos DB firewall settings and seamlessly connect to Cosmos DB from your HDInsight cluster.

NOTE: HDInsight cluster must be deployed into one of the VNETs allowed in the Cosmos DB firewall. This will ensure that the traffic from cluster VMs can reach Cosmos DB.

About HDInsight


Azure HDInsight is Microsoft’s premium managed offering for running open source workloads on Azure. Today, we are excited to announce several new capabilities across a wide range of OSS frameworks.

Azure HDInsight powers some of our top customers’ mission-critical applications across a wide variety of sectors, including manufacturing, retail, education, nonprofit, government, healthcare, media, banking, telecommunications, insurance, and many more, with use cases ranging from ETL to data warehousing and from machine learning to IoT.

Thursday, 9 August 2018

Azure HDInsight Interactive Query: Ten tools to analyze big data faster

Customers use HDInsight Interactive Query (also called Hive LLAP, or Low Latency Analytical Processing) to query data stored in Azure Storage and Azure Data Lake Storage at interactive speeds. Interactive Query makes it easy for developers and data scientists to work with big data using the BI tools they love most. HDInsight Interactive Query supports several tools for easy access to big data. In this blog, we have listed the most popular tools used by our customers:

Microsoft Power BI


Microsoft Power BI Desktop has a native connector that can run DirectQuery against an HDInsight Interactive Query cluster, letting you explore and visualize the data interactively.

Apache Zeppelin


Apache Zeppelin’s interpreter concept allows any language or data-processing backend to be plugged into Zeppelin. You can access Interactive Query from Apache Zeppelin using its JDBC interpreter.
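
For example, a JDBC interpreter for an Interactive Query cluster is typically configured with settings along these lines (exact property names depend on your interpreter setup; the cluster name and credentials are placeholders):

```
hive.driver      org.apache.hive.jdbc.HiveDriver
hive.url         jdbc:hive2://<clustername>.azurehdinsight.net:443/;ssl=true;transportMode=http;httpPath=/hive2
hive.user        admin
hive.password    <password>
```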

Visual Studio Code


With HDInsight Tools for VS Code, you can submit interactive queries as well as look at job information for HDInsight Interactive Query clusters.

Visual Studio


Visual Studio integration helps you create and query tables visually. You can create Hive tables on top of data stored in Azure Data Lake Storage or Azure Storage.

Ambari Hive View


Hive View is designed to help you author, optimize, and execute queries. With Hive View you can:

◈ Browse databases.
◈ Write queries or browse query results in full-screen mode, which can be particularly helpful with complex queries or large query results.
◈ Manage query execution jobs and history.
◈ View existing databases, tables, and their statistics.
◈ Create/upload tables and export table DDL to source control.
◈ View visual explain plans to learn more about your query plans.

Beeline


Beeline is a Hive client that is included on the head nodes of an HDInsight cluster. Beeline uses JDBC to connect to HiveServer2, a service hosted on the cluster. You can also use Beeline to access Hive on HDInsight remotely over the internet.
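
A typical invocation from an SSH session on the head node, and the remote variant over port 443, look like this (the cluster name and credentials are placeholders):

```
# Local connection from a cluster head node
beeline -u 'jdbc:hive2://headnodehost:10001/;transportMode=http'

# Remote connection over the internet
beeline -u 'jdbc:hive2://<clustername>.azurehdinsight.net:443/;ssl=true;transportMode=http;httpPath=/hive2' \
        -n admin -p '<password>'
```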

Hive ODBC


The Open Database Connectivity (ODBC) API enables ODBC-compliant applications to interact seamlessly with Hive through a standard interface.

Tableau


Tableau is a very popular data visualization tool. Customers can build visualizations by connecting Tableau to HDInsight Interactive Query.

Apache DBeaver


DBeaver is a SQL client and database administration tool. It is free and open source (ASL). DBeaver uses the JDBC API to connect to SQL-based databases.

Excel


Microsoft Excel is the most popular data analysis tool, and connecting it with big data is even more interesting for our customers. An Azure HDInsight Interactive Query cluster can be integrated with Excel via ODBC connectivity.

Wednesday, 17 January 2018

Azure Analysis Services new modeling and tooling features

Following the announcement a few weeks ago that 1400 models are now in Azure Analysis Services, we haven’t stopped there! We are pleased to announce the following further features for 1400 models in Azure.

◈ Shared M expressions are shown in the SSDT Tabular Model Explorer, and can be maintained using the Query Editor.
◈ Dynamic Management View (DMV) improvements.
◈ Opening a file with the .MSDAX extension in SSDT enables non-model-related DAX IntelliSense.

Shared M expressions


Shared M expressions are shown in the Tabular Model Explorer! By right-clicking the Expressions node, you can edit the expressions in the Query Editor. This should seem familiar to Power BI Desktop users.

DMV improvements

DMVs expose information about server operations, server health, settings, and model structure. They are used for server monitoring, model documentation, and various other purposes.

DISCOVER_CALC_DEPENDENCY

M expression dependencies are included in DISCOVER_CALC_DEPENDENCY. The following query returns partition and M expression dependencies; M expressions and structured data sources are included for 1400 models.

SELECT * FROM $System.DISCOVER_CALC_DEPENDENCY
WHERE OBJECT_TYPE = 'PARTITION' OR OBJECT_TYPE = 'M_EXPRESSION';

The output represents the same information that is shown by the Query Dependencies visual, which is now available in SSDT from the Query Editor. This visual should seem familiar to Power BI Desktop users.

MDSCHEMA_MEASUREGROUP_DIMENSIONS

This release provides a fix for MDSCHEMA_MEASUREGROUP_DIMENSIONS. This DMV is used by various client tools to show measure dimensionality. For example, the Explore feature in Excel Pivot Tables allows the user to cross-drill to dimensions related to the selected measures.

Prior to this release, some rows were missing in the output for 1200 models, which meant the Explore feature did not work correctly. This is now fixed for 1200 and 1400 models.

DAX file editing

Opening a file with the .MSDAX extension allows DAX editing with non-model related IntelliSense such as highlighting, statement completion and parameter info. As you can imagine, we intend to use this for interesting features to be released in the future!

Saturday, 13 January 2018

1400 compatibility level in Azure Analysis Services

We are excited to announce the public preview of the 1400 compatibility level for tabular models in Azure Analysis Services! This brings a host of new connectivity and modeling features for comprehensive, enterprise-scale analytic solutions delivering actionable insights. The 1400 compatibility level will also be available in SQL Server 2017 Analysis Services, ensuring a symmetric modeling capability across on-premises and the cloud.

Here are just some highlights of the new features available to 1400 models.
  • New infrastructure for data connectivity and ingestion into tabular models with support for TOM APIs and TMSL scripting. This enables:
    • Support for additional data sources, such as Azure Blob storage.
    • Data transformation and data mashup capabilities.
  • Support for BI tools such as Microsoft Excel enables drill-down to detailed data from an aggregated report. For example, when end users view total sales for a region and month, they can view the associated order details.
  • Object-level security to secure table and column names in addition to the data within them.
  • Enhanced support for ragged hierarchies such as organizational charts and chart of accounts.
  • Various other improvements for performance, monitoring and consistency with the Power BI modeling experience.
In SSDT, you can select the new 1400 compatibility level when creating new tabular model projects. Alternatively, you can upgrade an existing tabular model by selecting the Model.bim file in Solution Explorer and setting the Compatibility Level to 1400 in the Properties window. Models at the 1400 compatibility level cannot be downgraded to lower compatibility levels.
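
In the Model.bim metadata, the upgrade is reflected in the compatibilityLevel property; an abbreviated sketch (the model name and body here are illustrative):

```json
{
  "name": "SemanticModel",
  "compatibilityLevel": 1400,
  "model": { "culture": "en-US" }
}
```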

New Infrastructure for Data Connectivity


1400 models introduce a new infrastructure for data connectivity and ingestion into tabular models with support for TOM APIs and TMSL scripting. This is based on similar functionality in Power BI Desktop and Microsoft Excel 2016. At this point, only the following cloud-based data sources are supported with the 1400 compatibility level in Azure Analysis Services. We intend to add support for more data sources soon.

◈ Azure SQL Data Warehouse
◈ Azure SQL Database
◈ Azure Blob Storage

Detail Rows


A much-requested feature for tabular models is the ability to define a custom row set contributing to a measure value. Multidimensional models achieve this by using drillthrough and rowset actions. This allows end-users to view information in more detail than the aggregated level.

For example, the following PivotTable shows Internet Total Sales by year from the Adventure Works sample tabular model. Users can right-click the cell for 2010 and then select the Show Details menu option to view the detail rows.

By default, all the columns in the Internet Sales table are displayed. This behavior is often not meaningful for the user because too many columns may be shown, and the table may not have the necessary columns to show useful information such as customer name and order information.

Detail Rows Expression Property for Measures

1400 models introduce the Detail Rows Expression property for measures. It allows the modeler to customize the columns and rows returned to the end user. The following example uses the DAX Editor in SSDT to define the columns to be returned for the Internet Total Sales measure.

With the property defined and the model deployed, the custom row set is returned when the user selects Show Details. It automatically honors the filter context of the cell that was selected. In this example, only the rows for the 2010 value are displayed.
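
A Detail Rows Expression is a DAX table expression; a minimal sketch for the Internet Total Sales measure might use SELECTCOLUMNS like this (the column names are illustrative, based on the Adventure Works sample):

```dax
SELECTCOLUMNS (
    'Internet Sales',
    "Sales Order Number", 'Internet Sales'[Sales Order Number],
    "Order Date", 'Internet Sales'[Order Date],
    "Sales Amount", 'Internet Sales'[Sales Amount]
)
```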

Object-Level Security


Roles in tabular models already support a granular list of permissions, and row-level filters to help protect sensitive data.

1400 models introduce table- and column-level security allowing sensitive table and column names to be protected in addition to the data within them. Collectively these features are referred to as object-level security (OLS).

The current version requires that OLS is set using the JSON-based metadata, Tabular Model Scripting Language (TMSL), or Tabular Object Model (TOM). We plan to deliver SSDT support soon. The following snippet of JSON-based metadata from the Model.bim file secures the Base Rate column in the Employee table of the Adventure Works sample tabular model by setting the MetadataPermission property of the ColumnPermission class to None.

"roles": [

  {

    "name": "General Users",

    "description": "All allowed users to query the model",

    "modelPermission": "read",

    "tablePermissions": [

      {

        "name": "Employee",

        "columnPermissions": [

           {

              "name": "Base Rate",

              "metadataPermission": "none"

           }

        ]

      }

    ]

  }

Unauthorized users cannot access the Base Rate column using client tools like Power BI and Excel Pivot Tables. Additionally, such users cannot query the Base Rate column using DAX or MDX, or measures that refer to it.
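
Because OLS currently has to be set through the JSON-based metadata, it can help to generate the role definition programmatically. A minimal sketch in Python (the helper function is hypothetical; the role, table, and column names are the sample values from above):

```python
import json

def make_ols_role(role_name, table, column):
    """Build a read role that hides one column's metadata (object-level security)."""
    return {
        "name": role_name,
        "modelPermission": "read",
        "tablePermissions": [
            {
                "name": table,
                "columnPermissions": [
                    {"name": column, "metadataPermission": "none"}
                ],
            }
        ],
    }

# Emit the "roles" fragment ready to paste into Model.bim or a TMSL script
role = make_ols_role("General Users", "Employee", "Base Rate")
print(json.dumps({"roles": [role]}, indent=2))
```

The same dictionary could equally be fed to a TMSL createOrReplace script via your deployment tooling of choice.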

Ragged Hierarchies


Tabular models with previous compatibility levels can be used to model parent-child hierarchies. Hierarchies with a differing number of levels are referred to as ragged hierarchies. An example of a ragged hierarchy is an organizational chart. By default, ragged hierarchies are displayed with blanks for levels below the lowest child. This can look untidy to users, as shown by this organizational chart in Adventure Works:

1400 models introduce the Hide Members property to correct this. Simply set the Hide Members property to Hide blank members.

With the property set and the model deployed, the more presentable version of the hierarchy is displayed.

Other Features


Various other features such as the following are also introduced with the 1400 compatibility level. For more information, please refer to the Analysis Services Team blog for what's new in SQL Server 2017 CTP 2.0 and SQL Server vNext on Windows CTP 1.1 for Analysis Services.

◈ Transaction-performance improvements for a more responsive developer experience.
◈ Dynamic Management View improvements enabling dependency analysis and reporting.
◈ Hierarchy and column reuse to be surfaced in more helpful locations in the Power BI field list.
◈ Date relationships to easily create relationships to date dimensions based on date columns.
◈ DAX enhancements to make DAX more accessible and powerful. These include the IN operator and table/row constructors.
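
For example, the IN operator and table constructors let you replace a chain of OR conditions with a single membership test; a sketch (the measure and column names are illustrative):

```dax
Red or Blue Sales :=
CALCULATE (
    [Internet Total Sales],
    'Product'[Color] IN { "Red", "Blue" }
)
```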