Wednesday, 31 August 2022

Enhance Azure Firewall policy management with Tufin SecureTrack integration

Organizations today face growing network challenges, with hundreds of network assets deployed across hybrid and multi-cloud environments. Daily tasks such as enabling access or troubleshooting network connectivity have become increasingly complex and are spread across different teams within the organization. In addition, organizations are asked to do more while keeping costs under control. Network Security and Cloud Security teams need full, clear visibility and control to manage their fragmented networks, enforce security policies, work more efficiently, and reduce the attack surface caused by misconfigurations.

To help organizations address these challenges and achieve agility and security, Tufin announced an integration with Azure Firewall to provide holistic security policy management across your Azure and hybrid environments.

Simplified and automated Azure Firewall policy management with Tufin SecureTrack

Tufin SecureTrack delivers real-time visibility and policy-driven, unified security across your hybrid and cloud environments. The Azure Firewall integration with Tufin SecureTrack enhances your visibility and security while managing Azure Firewall policies. Key features that simplify and automate network security management across on-premises and cloud include an automated policy audit trail and network topology visualization.

Automated audit trail

Keeping track of changes in the Firewall Policy is essential for various use cases—it allows you to reduce the attack surface by gaining visibility into misconfigurations and risky changes. In addition, you can easily identify each change and its deployment time and leverage this information to deploy a new Azure Firewall policy version when needed.

With Tufin SecureTrack, you get an automated audit trail of changes made to your Azure Firewall policy. In addition, you can compare different policy versions in an intuitive, side-by-side user interface to quickly pinpoint each change.
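The idea behind a side-by-side policy comparison can be sketched in a few lines. This is a minimal illustration, not Tufin's or Azure's actual schema: the rule names and fields below are invented for the example.

```python
# Hypothetical sketch: compare two versions of a firewall rule set to
# surface what changed between policy revisions.

def diff_policy_versions(old_rules, new_rules):
    """Return rules added, removed, and modified between two versions.

    Each argument maps rule name -> rule definition (a dict of fields).
    """
    added = {n: r for n, r in new_rules.items() if n not in old_rules}
    removed = {n: r for n, r in old_rules.items() if n not in new_rules}
    modified = {
        n: {"before": old_rules[n], "after": new_rules[n]}
        for n in old_rules.keys() & new_rules.keys()
        if old_rules[n] != new_rules[n]
    }
    return added, removed, modified

v1 = {
    "allow-web": {"dest": "10.0.1.0/24", "port": 443, "action": "Allow"},
    "deny-ssh": {"dest": "10.0.0.0/16", "port": 22, "action": "Deny"},
}
v2 = {
    "allow-web": {"dest": "10.0.1.0/24", "port": 443, "action": "Allow"},
    "allow-rdp": {"dest": "10.0.2.5/32", "port": 3389, "action": "Allow"},
}

added, removed, modified = diff_policy_versions(v1, v2)
print(sorted(added), sorted(removed))  # ['allow-rdp'] ['deny-ssh']
```

A real audit trail would also record who made each change and when it was deployed, which is what makes rolling back to an earlier policy version practical.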


Network topology visualization


Having a clear picture of your network topology is required in modern and fast-changing networks to control potential risks and troubleshoot misconfigurations. Maintaining such network mapping manually is a time-consuming task that involves various teams. Even with such a map in hand, it is still difficult to tell if a specific network connection is allowed and in which path.

With Tufin SecureTrack, you can leverage live, automated network topology maps generated from the configuration of your Azure Firewall and other network assets. This map allows you to simulate a path analysis of a specific network connection, based on your network topology and applied policy rules, for both east-west and north-south traffic. Tufin visualizes how the traffic flows based on the routing and networking configuration, and whether the access is allowed or blocked based on the security configuration.
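Conceptually, path analysis combines two checks: is there a route between the endpoints, and do the firewall rules along it permit the traffic? A toy sketch, with an invented hub-and-spoke topology and rule model:

```python
# Minimal sketch of path analysis over a network topology: find a route
# between two segments, then check whether firewall rules allow it.
from collections import deque

def find_path(topology, src, dst):
    """BFS over an adjacency map; returns the hop list or None."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in topology.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def access_allowed(rules, src, dst, port):
    """First matching rule wins; default deny."""
    for rule in rules:
        if rule["src"] == src and rule["dst"] == dst and rule["port"] == port:
            return rule["action"] == "Allow"
    return False

topology = {"spoke-a": ["hub-fw"], "hub-fw": ["spoke-b"], "spoke-b": []}
rules = [{"src": "spoke-a", "dst": "spoke-b", "port": 443, "action": "Allow"}]

path = find_path(topology, "spoke-a", "spoke-b")
print(path)  # ['spoke-a', 'hub-fw', 'spoke-b']
print(access_allowed(rules, "spoke-a", "spoke-b", 443))  # True
```

A production tool does far more (NAT, route tables, rule precedence), but the core question it answers is the same: which path, and is it allowed.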


Source: microsoft.com

Sunday, 28 August 2022

Azure Lab Services August 2022 update—Improved classroom and training experience


The new updated Azure Lab Services allows you to set up and configure Cloud labs for your classroom and training scenarios. You don’t have to worry about setting up, expanding, or managing on-premises labs anymore. We provide a managed service and take the hassle out of managing and maintaining these labs. The updated service comes with improved performance and enhanced backend reliability. With the introduction of virtual network (VNet) injection and more control of the virtual network, you can now unlock key training and classroom scenarios such as lab-to-lab communication and utilize the service to teach a wide range of courses requiring complex configurations. With this new update you also now have an option to integrate the service with the Canvas learning management system.

The introduction of additional roles, Azure policies, and enhanced cost tracking and management capability provides the features you need to fully understand, manage and maintain your service. The availability of a .NET SDK, Python SDK, Azure PowerShell module, and Azure Resource Manager (ARM) templates makes it easy for IT and administrators to automate and manage all aspects of the service. Learn more about the Azure Lab Services update and how to use it.

Alongside major reliability and performance enhancements to the original service, this update brings a slew of additional features for all personas of the service: IT organizations and administrators, educators, and students.

New features help IT departments and administrators automate and manage

For IT staff and service administrators, there is now the concept of a lab plan, which replaces the lab account as the starting point for creating labs in the Azure portal. A lab plan is used to create labs, configure settings, and manage them. For ease of lab administration, new roles provide granular control to the different people in the organization who manage and maintain labs. This update also introduces default and custom Azure policies to give administrators more control over lab management.

As with the older service, you will have to request additional virtual processors (vCPUs) depending on your Azure subscription and how many labs and virtual machines you want to create. This release improves vCPU capacity management for your subscription, and you no longer share vCPU capacity with other customers when using the service.
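Estimating the vCPU quota to request is simple arithmetic over your planned labs. A quick illustrative sketch (the lab sizes below are made up):

```python
# Quick estimate of the vCPU quota a subscription would need for a set
# of labs. Numbers are illustrative, not service limits.

def vcpus_needed(labs):
    """labs: list of (vm_count, vcpus_per_vm) pairs."""
    return sum(count * size for count, size in labs)

# Two labs: 25 VMs at 4 vCPUs each, and 10 VMs at 8 vCPUs each.
print(vcpus_needed([(25, 4), (10, 8)]))  # 180
```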

With the new release, it is also easier to track costs for your labs or virtual machines using Azure Cost Management. On the networking front, we are introducing virtual network injection, which replaces the virtual network peering offered in the older service. Virtual network injection gives you control of the Azure network security group (NSG) and load balancer for your virtual network, and supports common scenarios such as lab-to-lab communication, access to an Azure or on-premises license server, and use of Azure Files.

In order to make it easy for administrators to manage and maintain the service, we are offering a range of tools including a .NET SDK, Python SDK, Azure PowerShell module, and ARM templates. These tools will not only help you with automating and managing your service but can also be utilized to build value-add services on top of our service for your customers.

To align with global data-residency compliance and regulatory requirements, customers can now choose to deploy labs and related virtual machines in the region of their choice, so their data stays where they want it.

More options and flexibility for educators

Educators and instructors are also getting features and new functionality to improve their experience in the service. The updated service can also be integrated with Canvas, a popular learning management system. This makes it easy for educators to stay in Canvas to create, manage, and maintain their labs, and students can also access the labs and virtual machines from within Canvas. Educators now have the option to create labs with virtual machines and assign students to them with non-admin access.

The auto-shutdown feature now works across both Windows and Linux virtual machines, and virtual machine idle detection has been improved to consider resource usage and user presence. The update also gives educators the flexibility to skip the virtual machine template creation process if they already have an image to use and don't want to customize it. Using an existing image, or a default image from Azure Marketplace, makes lab creation faster than creating a lab from an image the educator will customize after the lab is created.
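The idle-detection behavior described above can be sketched as a simple heuristic: shut down only after CPU stays low for a sustained window and no user is connected. The thresholds below are assumptions for illustration, not the service's actual values.

```python
# Illustrative idle-detection heuristic: shut a VM down only after CPU
# stays below a threshold for a sustained window AND no user is present.

def should_auto_shutdown(cpu_samples, user_connected,
                         cpu_threshold=5.0, idle_minutes_required=15):
    """cpu_samples: one CPU%-utilization reading per minute, oldest first."""
    if user_connected:
        return False
    if len(cpu_samples) < idle_minutes_required:
        return False
    recent = cpu_samples[-idle_minutes_required:]
    return all(sample < cpu_threshold for sample in recent)

# 20 minutes of near-idle CPU with no user present -> shut down.
print(should_auto_shutdown([2.0] * 20, user_connected=False))  # True
# Same readings, but a user is connected -> keep running.
print(should_auto_shutdown([2.0] * 20, user_connected=True))   # False
```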

Faster, easier access for students

The updated service has also introduced improvements to the student experience. Students can now troubleshoot virtual machine access issues by redeploying their virtual machine without losing data. If the lab is set up to use Azure Active Directory (AAD) group sync, there is no longer a need to send students an invitation email to register for the lab and get access to their virtual machine: a virtual machine is automatically assigned to each student, who can access it immediately.

Source: microsoft.com

Thursday, 25 August 2022

3 ways Azure Speech transforms game development with AI


With Azure Cognitive Services for Speech, customers can build voice-enabled apps confidently and quickly with the Speech SDK. We make it easy for customers to transcribe speech to text (STT) with high accuracy, produce natural-sounding text-to-speech (TTS) voices, and translate spoken audio. In the past few years, we have been inspired by the innovations coming out of the gaming industry, specific to AI.

Why AI for gaming? AI in gaming allows for flexible and reactive video game experiences. As technology continues to change and evolve, AI innovation has led to pioneering and tremendous advances in the gaming industry. Here are three popular use cases:

Use Cases for AI Gaming

Game dialogue prototyping with text to speech: Shorten the amount of time and money spent on the product to get the game to market sooner. Designers and producers can rapidly swap lines of dialogue using different emotional voices and listen to variations in real-time to ensure accuracy.

Greater accessibility with transcription, translation, and text to speech: Make gaming more accessible and add functionality through a single interface. Examples include gameplay instructions for individuals unable to read the text or language, and voiced storylines for visually impaired gamers or younger users who have not yet learned to read.

Scalable non-playable character voices and interaction with text to speech: Easily produce voice characters that stay on-brand with consistent quality and speaking styles. Game developers can add emotions, accents, nuances, laughter, and other paralinguistic sounds and expressions to game avatars and NPCs (non-playable characters) that can initiate or participate in a conversation in-game.

Featured Customers for AI Gaming

Flight Simulator: Our first-party game developers are using AI for speech to improve end-user experiences. Flight Simulator is the longest-running franchise in Microsoft history, and the latest critically acclaimed release not only builds on that legacy, but it also pushes the boundaries as the most technologically advanced simulator ever made. By adding authentic air traffic controller voices, Flight Simulator added a small-but-powerful way to elevate the Flight Simulator experience.​ Recording audio to replicate air traffic controllers from every airport on Earth was a huge task—TTS is a great solution that can handle the dynamic content as well as serve the air traffic controller voices as a low-latency, highly available, secure, and scalable solution. Let’s check out a video for the newly released Flight Simulator experience with custom neural voice implemented for real-time air traffic controller voice.

Undead Labs: Undead Labs studio is on a mission to take gaming in bold new directions. They are the makers of the State of Decay franchise and use Azure Neural TTS during game development.

Double Fine: Double Fine is the producer of many popular games, including Psychonauts. They are utilizing our neural TTS to prototype future game projects.

You can check out our use case presentation at Microsoft’s Game Developers Conference 2022 for more details.

Speech Services and Responsible AI

We are excited about the future of Azure Speech with human-like, diverse, and delightful quality under the high-level architecture of XYZ-code AI framework. Our technology advancements are also guided by Microsoft’s Responsible AI process, and our principles of fairness, inclusiveness, reliability and safety, transparency, privacy and security, and accountability. We put these ethical standards into practice through the Office of Responsible AI (ORA)—which sets our rules and governance processes, the AI Ethics and Effects in Engineering and Research (Aether) Committee—which advises our leadership on the challenges and opportunities presented by AI innovations, and Responsible AI Strategy in Engineering (RAISE)—a team that enables the implementation of Microsoft Responsible AI rules across engineering groups.

Source: microsoft.com

Tuesday, 23 August 2022

Microsoft named a Leader in 2022 Gartner® Magic Quadrant™ for Data Integration Tools

In the modern business landscape, the intake of information and data is growing at an incredibly rapid pace. Organizations, regardless of size, need to quickly gain insights from all data to inform customer experiences and empower their employees. Current solutions are bespoke and siloed, leading to users spending considerable time and resources stitching together disparate products across a variety of vendors. This creates costly operational overhead and diverts resources away from value creation. In response to this high-pressure environment, many organizations are looking for cutting-edge data integration platforms and resources, and Microsoft is fully invested in empowering these companies to succeed.

We are excited to share that Gartner has positioned Microsoft as a Leader once again in the 2022 Gartner Magic Quadrant for Data Integration Tools. We believe this recognition shows our continued growth and ongoing commitment to delivering comprehensive and cost-effective data integration solutions.

The Gartner Magic Quadrant for Data Integration Tools evaluated companies on a range of categories including data engineering, cloud migration, and operational data integration tasks.


Translating data into a competitive advantage


It’s easy to be overwhelmed by the amount of data businesses generate every day. Not only do organizations need to deal with the technical requirements of processing their data, they are also operating in a high-risk environment, where regulatory challenges are significant and noncompliance can mean an expensive penalty.

Against this backdrop, Microsoft brings an end-to-end data integration strategy to drive competitive advantage and deliver better business outcomes. Regardless of where source data is coming from—from operational databases to software as a service (SaaS) to multicloud—Microsoft data integration serves as the foundation that brings this data together and prepares it for cloud-scale analytics.

To lay the groundwork for reliable data pipelines, organizations can choose from more than 100 connectors to seamlessly move data. New capabilities also enable connections without time-consuming extract, transform, and load (ETL) processes, so users can achieve insights faster. Microsoft data integration works seamlessly to combine data and prepare it for analysis in a central, secure environment. Simplified data migration, low- or no-code ETL, enterprise business workflows, metadata management, and data governance help boost productivity and empower organizations to achieve more with data. The company’s entire data team—from data engineers to business analysts—can discover and use the data they need, whether users want to write their own queries or leverage a low-code environment to ingest and transform data.

Microsoft services for data integration


With tooling that delivers a comprehensive set of capabilities, organizations can build a solid data integration foundation.

Azure Data Factory is a managed cloud service that's built for petabyte-scale data ingestion, data transformation, and orchestration at scale. Use Azure Data Factory for data engineering (build, manage, and operationalize data ingestion and transformation pipelines), data and cloud migration (customers migrating data from on-premises or another cloud), and operational data integration (ongoing data integration and synchronization to support ongoing and critical business processes).

Azure Data Factory Studio is purpose-built to provide data engineers with a familiar and productive environment for authoring their data integration pipelines and data flows for code-free transformations at scale. The experience provides users with sophisticated control flow and orchestration capability to author robust data integration tasks that operate over large amounts of data. Hundreds of connectors enable data-source-specific connectivity from Azure Data Factory and Power Query.

Power Query is a data transformation and data preparation engine that delivers an approachable user experience with self-service and enterprise-ready connectors to hundreds of data sources, from cloud to on-premises. Power Query enables business analysts to handle data preparation tasks on their own for workloads across Power Platform, Dynamics 365, and Microsoft Excel.

Azure Synapse Link is a service that eliminates barriers between Microsoft data stores and Azure Synapse Analytics. Automatically move data from both operational databases and business applications without time-consuming ETL processes. Get an end-to-end view of the business by easily connecting separate systems—and democratize data access with a solution that brings the power of analytics to every data-connected team.

Azure Synapse Link already connects to a variety of Microsoft data stores, such as Azure Cosmos DB and Azure SQL Database, and will connect to more in the future. Here are the connections available now:

- Azure Synapse Link for Dataverse—now generally available.
- Azure Synapse Link for Cosmos DB—now generally available.
- Azure Synapse Link for SQL (both SQL Server 2022 and Azure SQL Database)—now in preview.

The future of data is integration


In this complex environment where data holds such immense value, our north star is to enable our customers to drive a data culture and power a new class of data-first applications. We want our customers to take intelligent action based on insights unlocked from their data, and turn it into competitive advantage, all while respecting and maintaining compliance. We do this by empowering every individual and organization, delivering data integration and analytic tools and resources to inform every decision, at any scale.

Source: microsoft.com

Saturday, 20 August 2022

Azure Data Explorer: Log and telemetry analytics benchmark


Azure Data Explorer (ADX), a component of Azure Synapse Analytics, is a highly scalable analytics service optimized for structured, semi-structured, and unstructured data. It provides users with an interactive query experience that unlocks insights from the ocean of ever-growing log and telemetry data. It is the perfect service to analyze high volumes of fresh and historical data in the cloud by using SQL or the Kusto Query Language (KQL), a powerful and user-friendly query language.
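To make the query experience concrete, consider the kind of aggregation KQL expresses in one pipeline: count log records per severity over a recent time window. Below, the KQL appears as a comment (with a hypothetical table and columns) alongside a pure-Python equivalent over in-memory records.

```python
# The kind of query KQL makes concise, sketched in pure Python.
# Roughly equivalent KQL (hypothetical table/columns):
#   Logs
#   | where Timestamp > ago(5m)
#   | summarize Count = count() by Level
from collections import Counter

logs = [
    {"timestamp": 3500, "level": "Error"},
    {"timestamp": 3550, "level": "Warning"},
    {"timestamp": 3580, "level": "Error"},
    {"timestamp": 100,  "level": "Error"},   # outside the window
]
now, window = 3600, 300  # "last 5 minutes", in seconds

counts = Counter(
    rec["level"] for rec in logs if rec["timestamp"] > now - window
)
print(dict(counts))  # {'Error': 2, 'Warning': 1}
```

The point of KQL (and of the engine behind it) is that the same filter-then-summarize shape runs interactively over billions of records, not a four-element list.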

Azure Data Explorer is a key enabler for Microsoft’s own digital transformation. Virtually all Microsoft products and services use ADX in one way or another; this includes troubleshooting, diagnosis, monitoring, machine learning, and as a data platform for Azure services such as Azure Monitor, PlayFab, Sentinel, Microsoft 365 Defender, and many others. Microsoft’s customers and partners use ADX for a wide variety of scenarios: fleet management, manufacturing, security analytics solutions, package tracking and logistics, IoT device monitoring, financial transaction monitoring, and more. Over the last few years, the service has seen phenomenal growth and is now running on millions of Azure virtual machine cores.

Last year, the third generation of the Kusto engine (EngineV3) was released and is currently offered as a transparent, in-place upgrade to all users not already using the latest version. The new engine features a completely new implementation of the storage, cache, and query execution layers. As a result, performance has doubled or more in many mission-critical workloads.

Superior performance and cost-efficiency with Azure Data Explorer

To better help our users assess the performance of the new engine and cost advantages of ADX, we looked for an existing telemetry and logs benchmark that has the workload characteristics common to what we see with our users:

1. Telemetry tables that contain structured, semi-structured, and unstructured data types.

2. Records in the hundreds of billions to test massive scale.

3. Queries that represent common diagnostic and monitoring scenarios.

As we did not find an existing benchmark to meet these needs, we collaborated with and sponsored GigaOm to create and run one. The new logs and telemetry benchmark is publicly available in this GitHub repo. This repository includes a data generator to generate datasets of 1GB, 1TB, and 100TB, as well as a set of 19 queries and a test driver to execute the benchmark.

The results, now available in the GigaOm report, show that Azure Data Explorer provides superior performance at a significantly lower cost in both single and high-concurrency scenarios. For example, the following chart taken from the report displays the results of executing the benchmark while simulating 50 concurrent users: 

[Chart from the GigaOm report: benchmark results with 50 concurrent users]
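The benchmark's test driver can be pictured as a concurrent workload runner: N simulated users each fire queries and per-query latencies are collected. A toy sketch, with the query itself stubbed out (a real driver would call the database):

```python
# Toy benchmark test driver: run a fixed query workload from N
# concurrent simulated users and collect per-query latencies.
import time
from concurrent.futures import ThreadPoolExecutor

def run_query(query_id):
    start = time.perf_counter()
    time.sleep(0.01)  # stand-in for real query execution
    return query_id, time.perf_counter() - start

def run_benchmark(num_users, queries_per_user):
    workload = list(range(num_users * queries_per_user))
    with ThreadPoolExecutor(max_workers=num_users) as pool:
        results = list(pool.map(run_query, workload))
    latencies = [latency for _, latency in results]
    return len(results), max(latencies)

completed, worst = run_benchmark(num_users=5, queries_per_user=4)
print(completed)  # 20
```

The published driver in the GitHub repo additionally parameterizes the query set (the 19 benchmark queries) and the concurrency level (such as the 50-user run charted above).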

Source: microsoft.com

Tuesday, 16 August 2022

Announcing Microsoft Dev Box Preview

Many IT organizations must choose between giving developers the flexibility they need to be productive and keeping developer workstations managed and secure. Supply chain challenges have led to developers waiting weeks or months to get the hardware they need, forcing them to use aging hardware or unsecured personal devices. At the same time, hybrid work has forced IT to open access to corporate and on-premises resources to developers around the world. With access to sensitive source code and customer data, developers are increasingly becoming the target of more sophisticated cyberattacks.

Today, we’re excited to announce that the preview of Microsoft Dev Box is now available to the public. Microsoft Dev Box is a managed service that enables developers to create on-demand, high-performance, secure, ready-to-code, project-specific workstations in the cloud. Sign in to the Azure portal and search for “dev box” to begin creating dev boxes for your organization.

Focus on code—not infrastructure


With Microsoft Dev Box, developers can focus on writing the code only they can write instead of trying to get a working environment that can build and run the code. Dev boxes are ready-to-code and preconfigured by the team with all the tools and settings developers need for their projects and tasks. Developers can create their own dev boxes whenever they need to quickly switch between projects, experiment on a proof-of-concept, or kick off a full build in the background while they move on to the next task.

Microsoft Dev Box supports any developer IDE, SDK, or tool that runs on Windows. Developers can target any development workload that can be built from Windows, including desktop, mobile, IoT, and web applications. Microsoft Dev Box even supports building cross-platform apps thanks to Windows Subsystem for Linux and Windows Subsystem for Android. Remote access gives developers the flexibility to securely access dev boxes from any device, whether it’s Windows, macOS, Android, iOS, or a web browser.

Tailor dev boxes to the needs of the team



With Microsoft Dev Box, developer teams create and maintain dev box images with all the tools and dependencies their developers need to build and run their applications. Developer leads can instantly deploy the right size dev box for specific roles in a team anywhere in the world, selecting from 4 vCPU / 16GB to 32 vCPU / 128GB SKUs to scale to any size application. By deploying dev boxes in the closest Azure region and connecting via the Azure Global Network, dev teams ensure a smooth and responsive experience with gigabit connection speeds for developers around the world.

Using Azure Active Directory groups, IT admins can grant access to sensitive source code and customer data for each project. With role-based permissions and custom network configurations, developer leads can give vendors limited access to the resources they need to contribute to the project—eliminating the need to ship hardware to short-term contractors and helping keep development more secure.

Centralize governance and management



Developer flexibility and productivity can’t come at the expense of security or compliance. Microsoft Dev Box builds on Windows 365, making it easy for IT administrators to manage dev boxes together with physical devices and Cloud PCs through Microsoft Intune and Microsoft Endpoint Manager. IT admins can set conditional access policies to ensure users only access dev boxes from compliant devices while keeping dev boxes up to date using expedited quality updates to deploy zero-day patches across the organization and quickly isolate compromised devices. Endpoint Manager’s deep device analytics make it easy to audit application health, device utilization, and other critical metrics, giving developers the confidence to focus on their code knowing they’re not exposing the organization to any unnecessary risk.

Microsoft Dev Box uses a consumption-based compute and storage pricing model, meaning organizations only pay for what they use. Automated schedules can warm up dev boxes at the start of the day and stop them at the end of the day or when they sit idle. With hibernation, available in a few weeks, developers can resume a stopped dev box and pick up right where they left off.

Get started now


Microsoft Dev Box is available today as a preview from the Azure portal. During the preview, organizations get the first 15 hours of the 8 vCPU / 32 GB memory dev box SKU free every month, along with the first 365 hours of the 512 GB SSD storage SKU. Beyond that, organizations pay only for what they use, charged on a per-hour basis for the compute and storage consumed.
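A consumption model with a monthly free allowance is easy to reason about: free hours first, then a per-hour rate. A back-of-the-envelope sketch, with placeholder rates that are not Microsoft's actual prices:

```python
# Back-of-the-envelope sketch of consumption-based billing: monthly
# free hours first, then a per-hour rate. Rates are placeholders.

def monthly_cost(hours_used, free_hours, rate_per_hour):
    billable = max(0, hours_used - free_hours)
    return billable * rate_per_hour

# Hypothetical rates for the 8 vCPU / 32 GB compute SKU and the
# 512 GB SSD storage SKU (free allowances per the preview offer above).
compute = monthly_cost(hours_used=40, free_hours=15, rate_per_hour=1.20)
storage = monthly_cost(hours_used=730, free_hours=365, rate_per_hour=0.05)
print(round(compute + storage, 2))  # 48.25
```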

Source: microsoft.com

Saturday, 13 August 2022

Microsoft is a Leader in 2022 Gartner Magic Quadrant for Cloud AI Developer Services

Gartner has recognized Microsoft as a Leader in the 2022 Gartner® Magic Quadrant™ for Cloud AI Developer Services, with Microsoft placed furthest in “Completeness of Vision”.

Gartner defines the market as “cloud-hosted or containerized services that enable development teams and business users who are not data science experts to use AI models via APIs, software development kits (SDKs), or applications.”


We are proud to be recognized for our Azure AI Platform. In this post, we’ll dig into the Gartner evaluation, what it means for developers, and provide access to the full reprint of the Gartner Magic Quadrant to learn more.

Scale intelligent apps with production-ready AI

“Although ModelOps practices are maturing, most software engineering teams still need AI capabilities that do not demand advanced machine learning skills. For this reason, cloud AI developer services (CAIDS) are essential tools for software engineering teams.”—Gartner

A staggering 87 percent of AI projects never make it into production. Beyond the complexity of data preprocessing and building AI models, organizations wrestle with scalability, security, governance, and more to make their models production-ready. That’s why over 85 percent of Fortune 100 companies use Azure AI today, spanning industries and use cases.

More and more, we see developers accelerate time to value by using pre-built and customizable AI models as building blocks for intelligent solutions. Microsoft Research has made significant breakthroughs in AI over the years, being the first to achieve human parity across speech, vision, and language capabilities. Today, we’re pushing the boundaries of language model capabilities with large models like Turing, GPT-3, and Codex (the model powering GitHub Copilot) to help developers be more productive. Azure AI packages these innovations into production-ready general-purpose models (Azure Cognitive Services) and use-case-specific models (Azure Applied AI Services) that developers can integrate via an API or SDK, then continue to fine-tune for greater accuracy.
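To give a flavor of the API integration path, here is a sketch of preparing a text-to-speech REST call: building the SSML payload and headers without sending anything. The endpoint shape and header names follow Azure's Speech REST API as we understand it; treat them as assumptions and verify against the official documentation before use.

```python
# Sketch of preparing a TTS REST request (not sent). Endpoint format
# and headers are assumptions; check the Azure Speech docs before use.

def build_tts_request(region, key, voice, text):
    url = f"https://{region}.tts.speech.microsoft.com/cognitiveservices/v1"
    headers = {
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/ssml+xml",
        "X-Microsoft-OutputFormat": "audio-16khz-32kbitrate-mono-mp3",
    }
    ssml = (
        "<speak version='1.0' xml:lang='en-US'>"
        f"<voice name='{voice}'>{text}</voice></speak>"
    )
    return url, headers, ssml

url, headers, body = build_tts_request(
    "westus", "<your-key>", "en-US-JennyNeural", "Cleared for takeoff."
)
print(url)
```

The Speech SDK wraps this plumbing for you; the REST view just makes the request/response shape visible.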

For developers and data scientists looking to build production-ready machine learning models at scale, we support automated machine learning, also known as AutoML. AutoML in Azure Machine Learning is based on breakthrough Microsoft research focused on automating the time-consuming, iterative tasks of machine learning model development. This frees data scientists, analysts, and developers to focus on higher-value tasks and accelerates their time to production.
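At its core, what AutoML automates is the try-many-candidates loop: fit several models, score each on held-out data, keep the best. A deliberately tiny illustration with stand-in "models" (a mean predictor and a one-feature least-squares fit), not the actual AutoML algorithm:

```python
# Minimal illustration of what autoML automates: try candidate models
# on a validation split and keep the one with the lowest error.

def fit_mean(xs, ys):
    mean = sum(ys) / len(ys)
    return lambda x: mean

def fit_linear(xs, ys):
    # Least-squares slope/intercept for one feature.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return lambda x: slope * x + intercept

def select_model(candidates, train, valid):
    def mse(model, data):
        return sum((model(x) - y) ** 2 for x, y in data) / len(data)
    best_name, best_err = None, float("inf")
    for name, fitter in candidates.items():
        xs, ys = zip(*train)
        model = fitter(list(xs), list(ys))
        err = mse(model, valid)
        if err < best_err:
            best_name, best_err = name, err
    return best_name

train = [(x, 2 * x + 1) for x in range(10)]
valid = [(x, 2 * x + 1) for x in range(10, 15)]
best = select_model({"mean": fit_mean, "linear": fit_linear}, train, valid)
print(best)  # linear
```

Real AutoML additionally searches featurizations and hyperparameters and parallelizes the trials, but the selection loop is the same shape.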

Enable productivity for AI teams across the organization

“As more developers use CAIDS to build machine learning models, the collaboration between developers and data scientists will become increasingly important.”—Gartner

As AI becomes more mainstream across organizations, it’s essential that employees have the tools they need to collaborate, build, manage, and deploy AI solutions effectively and responsibly. As Microsoft Chairman and CEO Satya Nadella shared at Microsoft Build, Microsoft is "building models as platforms in Azure" so that developers with different skills can take advantage of breakthrough AI research and embed them into their own applications. This ranges from professional developers building intelligent apps with APIs and SDKs to citizen developers using pre-built models via Microsoft Power Platform.

Azure AI empowers developers to build apps in their preferred language and deploy in the cloud, on-premises, or at the edge using containers. Recently we also announced the capability to use any Kubernetes cluster and extend machine learning to run close to where your data lives. These resources can be run through a single pane with the management, consistency, and reliability provided by Azure Arc.

Operationalize Responsible AI practices

“Vendors and customers alike are seeking more than just performance and accuracy from machine learning models. When selecting AutoML services, they should prioritize vendors that excel at providing explainable, transparent models with built-in bias detection and compensatory mechanisms.”—Gartner

At Microsoft, we apply our Responsible AI Standard to our product strategy and development lifecycle, and we’ve made it a priority to help customers do the same. We also provide tools and resources to help customers understand, protect, and control their AI solutions, including a Responsible AI Dashboard, bot development guidelines, and built-in tools to help them explain model behavior, test for fairness, and more. Providing a consistent toolset to your data science team not only supports responsible AI implementation but also helps provide greater transparency and enables more consistent, efficient model deployments.

Microsoft is proud to be recognized as a Leader in Cloud AI Developer Services, and we are excited by innovations happening at Microsoft and across the industry that empower developers to tackle real-world challenges with AI. You can read and learn from the complete Gartner Magic Quadrant now.

Source: microsoft.com

Tuesday, 9 August 2022

Bluware and Microsoft Azure develop OSDU-enabled interactive AI seismic interpretation solution for energy super major

Bluware develops cloud-native solutions that help oil and gas operators increase exploration and production workflow productivity through deep learning, enabling geoscientists to make faster, smarter decisions about the subsurface. Today the company announced its collaboration with Microsoft on its next-generation automated interpretation solution, InteractivAI™, which is built on the Azure implementation of the OSDU™ Data Platform.

The two companies are working together to provide comprehensive solutions combining Microsoft Cloud implementation of OSDU™ Data Platform with Bluware’s subsurface knowledge. As the world’s energy companies retool for the future, they are juggling priorities between new forms of energy, carbon emissions, and maintaining the growing demand for fossil fuels. Innovative solutions such as cloud computing and machine learning are playing an important role in this transition.

To address an energy super major’s seismic interpretation challenges, Bluware is providing an interactive deep learning solution that runs natively on Azure, called InteractivAI™.

InteractivAI™ is utilized by the organization’s exploration and reservoir development teams to accelerate seismic interpretations and improve results by assisting geoscientists in identifying geological and geophysical features that may have been previously missed, incorrectly interpreted, or simply too time-consuming to interpret.

Using a data-centric approach, the application is unique in allowing users to train and infer simultaneously. Imagine running deep learning in real time, with the interpreter providing feedback while watching the network suggest on-the-fly interpretations. This even includes results on data that is not readily visible to the human eye or very difficult to see. This interactive workflow delivers more precise and comprehensive results in hours rather than months, resulting in higher-quality exploration and reservoir development.

The interactive deep learning approach

Bluware is pioneering the concept of ‘interactive deep learning’, wherein the scientist remains in the figurative ‘driver’s seat’ and steers the network as it learns and adapts based on the interpreter’s teachings. Each adjustment to the training data set provides immediate feedback to the network, which in turn adjusts its weights and biases in real time.
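To make that feedback loop concrete, here is a minimal sketch with a single perceptron standing in for the real network: each label the interpreter provides updates the weights immediately and returns a fresh prediction. Everything here (the class, the learning rate, the feature vectors) is illustrative; Bluware's actual model and training procedure are not described in this post.

```python
# Minimal sketch of interactive, label-as-you-go training: the model
# starts blank and its weights adjust the instant a label arrives,
# rather than after an offline training run on a fixed corpus.

class InteractiveModel:
    def __init__(self, n_features):
        self.w = [0.0] * n_features   # blank, untrained network
        self.b = 0.0

    def predict(self, x):
        s = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return 1 if s > 0 else 0

    def label(self, x, y, lr=0.1):
        """Interpreter labels one sample; weights adjust in real time."""
        err = y - self.predict(x)
        self.w = [wi + lr * err * xi for wi, xi in zip(self.w, x)]
        self.b += lr * err
        return self.predict(x)        # immediate feedback to the user
```

In the real workflow the features would be seismic attributes and the network far deeper, but the loop structure (label, update, see the new inference) is the core of the interactive approach.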

Bluware’s approach differs from other deep learning approaches, which use a neural network that has been pre-trained on multiple data sets. With those, users must rely on a network trained on data they have never seen, shaped by unknown biases, and therefore outside their control.

The basic parameterization exposed to scientists in these traditional approaches gives the illusion of network control without really ceding any significant control to the user. Processing times can be days or weeks, and scientists can only supply feedback to the network once the training is complete, at which point training will need to run again from scratch.

The interactive deep learning approach is data-specific, focused on creating the best learning and training model for the geology the user is working with. Unlike traditional deep learning approaches, it starts with a blank, untrained network and trains it while labeling, so it can identify any feature of interest. The approach is not limited to salt or faults; it can also capture shallow hazards, injectites, channels, bright spots, and more. This flexibility allows the expert to explore the myriad possibilities and alternative interpretations within the area of interest.

The energy company initially conducted a two-month evaluation with multiple experts across their global asset teams. The results were remarkable, and the organization is continually adding users. Additionally, Bluware has provided a blueprint for the company’s IT team for an Azure Kubernetes Service (AKS) implementation which will accelerate and expand this Azure-based solution.

A seismic data format designed for the cloud

As companies continue to wrestle with enormous, complex data streams such as petabytes of seismic data, the pressure to invest in digital technology intensifies. Bluware has adapted to this imperative, delivering a cloud-based format for storing seismic data called Volume Data Store™ (VDS). Microsoft and Bluware have worked together to natively enable VDS as part of the Microsoft Cloud implementation of OSDU™ Data Platform, where developers and customers can connect to the stored seismic data and run interactive AI-driven seismic interpretation workflows using the InteractivAI™ SaaS offering from Azure AppSource.

Bluware and Microsoft are collaborating in parallel to support major energy customers through their seismic shift initiatives including moving petabytes of data to Azure Blob storage in a cloud-native VDS environment.

Revolutionizing the way energy companies store and use seismic data

Bluware designed InteractivAI™ not only with seismic workflows in mind but also with an eye on the trends shaping the future of the energy sector. Creating a cloud-native data format makes it scalable for energy companies to do more with their data while lowering costs and speeding up workflows, allowing them to arrive at more accurate decisions faster by leveraging the power of Azure.

About Bluware

In 2018, a group of energy-focused software companies (Bluware, Headwave, Hue, and Kalkulo AS) merged to become Bluware Corp. to empower change, growth, and a sustainable future for the energy sector.

As companies pivot from fossil fuels to cleaner energy sources, the combination of new industry standards, cloud computing, and AI will be critical for companies to adapt quickly, work smarter, and continue to be profitable. Companies that adapt faster will have a significant advantage over their competition.

Source: microsoft.com

Sunday, 7 August 2022

Power your file storage-intensive workloads with Azure VMware Solution

If you’ve been waiting for the right time to optimize your storage-intensive VMware applications in the cloud, I have great news for you: Azure NetApp Files for Network File System (NFS) datastores in Azure VMware Solution is now available in preview.

With Azure VMware Solution you can now scale storage independently from compute using Azure NetApp Files datastores, enabling you to run VMware-based storage-intensive workloads like SQL Server, general-purpose file servers, and others in Azure.

Gain the flexibility and scalability of running your storage-heavy workloads on Azure VMware Solution, while delivering high performance and low latency.

Azure NetApp Files as a datastore choice for Azure VMware Solution

Azure NetApp Files is available in preview as a datastore choice for Azure VMware Solution, and Azure NetApp Files NFS volumes can now be attached to the Azure VMware Solution clusters of your choice.

Use cases include migration and disaster recovery (DR)

Azure NetApp Files datastores for Azure VMware Solution enable VMware customers to:

◉ Flexibly manage and scale storage resources for workloads running on Azure VMware Solution, independently of compute.

◉ Lower total cost of ownership (TCO) for VMware workloads through storage optimization.

◉ More efficiently leverage Azure VMware Solution as a DR endpoint for business continuity.

Let the powerful file storage solution in the cloud power your VMware workloads

Azure NetApp Files is a fully managed file share service built on trusted NetApp ONTAP storage technology and offered as an Azure first-party solution.

“Azure NetApp Files helps deliver the performance, flexibility, scalability, and cost optimization customers need to migrate any VMware workload, including ‘un-migratable’, storage-intensive VMware applications, to the Azure cloud and to securely back up on-premises VMware applications to Azure.”—Ronen Schwartz, Senior Vice President and General Manager, NetApp Cloud Volumes

We know every business is different and scaling on its own timetable, so we created three performance tiers for Azure NetApp Files: Standard, Premium, and Ultra. Scale up and down on demand as your requirements change. You can store up to 10 PB in a single deployment and achieve up to 4.4 GBps of throughput with sub-millisecond minimum latency in a single volume.
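As a rough sizing aid, throughput for an Azure NetApp Files volume scales with the service level and the provisioned size. The MiB/s-per-TiB figures below are the commonly documented rates per tier and are an assumption here; confirm them against the current Azure NetApp Files documentation before sizing a real deployment.

```python
# Back-of-the-envelope helper for comparing Azure NetApp Files tiers.
# Rates are assumed values (MiB/s per provisioned TiB), not guarantees.

THROUGHPUT_PER_TIB = {"Standard": 16, "Premium": 64, "Ultra": 128}

def volume_throughput(tier, size_tib):
    """Approximate throughput ceiling (MiB/s) for a volume of size_tib TiB."""
    return THROUGHPUT_PER_TIB[tier] * size_tib
```

For example, under these assumed rates a 4 TiB Ultra volume would top out around 512 MiB/s, while the same capacity on Standard would be 64 MiB/s, which is why matching the tier to the workload's I/O profile matters for TCO.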

We continue to add features and regions and listen to our customers to better understand what they need to migrate their workloads to Azure. We heard loud and clear from VMware customers that Azure NetApp Files was exactly what they needed to make the move to the cloud.

Fully integrated with Azure VMware Solution

But we didn’t build a silo solution that works only with Azure VMware Solution. We built the most powerful file storage solution in the public cloud to work seamlessly with other Azure services. Now we have extended Azure NetApp Files to work perfectly with Azure VMware Solution to meet the needs of VMware customers.

Source: microsoft.com

Saturday, 6 August 2022

Find the clarity and guidance you need to realize cloud value

A modernized cloud workload offers significant benefits—including cost savings, optimized security and management, and opportunities for ongoing innovation. But the process of migrating and modernizing workloads can be challenging. That’s why it’s essential to prepare and plan ahead—and to ensure that your organization finds continued value in the cloud.

Whether you’re just starting your move to the cloud or are looking for ways to optimize your current cloud workloads, my team and I are committed to helping you maximize your cloud investments, overcome technical barriers, adopt new business processes, develop your team’s skills, and achieve sustainable innovation in the cloud. That’s why we invite you to watch sessions from Realizing Success in the Cloud—now available on-demand.

At this digital event, attendees learned about the key components of a successful cloud adoption journey. They heard Microsoft leaders, industry experts, and Azure customers discuss ways to drive value with migration and modernization. They also discovered best practices for boosting adoption across organizations and enabling sustainable innovation in the long term.

Check out these session highlights, which cover three critical areas of the cloud journey:

1. Optimize your business value in the cloud

In the early phases of any new cloud project, it’s essential that you define strategy, understand motivations, and identify business outcomes. Maybe you’re looking to optimize your cost investment and reduce technical debt. Or maybe adoption might enable your team to build new technical capabilities and products. Whether you’re looking to migrate, modernize, or innovate in the cloud, you’ll want to build a business case that sets your organization up for success—and we’ll show you how to put one together.

With the help of Jeremy Winter, Azure VP of Program Management, you’ll explore the process using key technical and financial guidelines. In this session, you’ll discover templates, assessments, and tools for estimating your cloud costs, managing spending, and maximizing the overall value you get from Azure. You’ll also hear how the cloud experts at Insight, a Microsoft technology partner, use Azure enablement resources to help their clients realize savings.

2. Customize your Azure journey

Your organization’s business, security, and industry requirements are unique, which is why you’ll need to develop a tailored plan that will help you successfully execute your vision—and ensure that your deployment and operations needs are being met. That’s why it’s important to understand when to adhere to the best practices of your cloud vendor—and when to customize your journey—with guidance from the experts.

In the session led by Uli Homann, Microsoft VP of Cloud and AI, you’ll learn how to set up scalable, modular cloud environments using Azure landing zones. As you prepare for post-deployment, you’ll find out how to evaluate the cost efficiency, performance, reliability, and security of your workload performance using recommendations from the Azure Well-Architected Framework and Azure Advisor. Uli also speaks with NHS Digital, the technology partner for the UK’s public healthcare system, to discuss how they built a responsive system architecture that could scale and perform under unprecedented demand.

3. Accelerate success with Azure skills training

Whether you’re migrating to the cloud or building a cloud-native app, the skills of your team are key to enabling successful business outcomes. Azure skills training fosters a growth mindset and helps your team develop expertise that impacts your entire organization, from individual career advancement to sustainable, long-term innovation.

In a fireside chat between Sandeep Bhanot, Microsoft VP of Global Technical Learning, and Cushing Anderson, VP of IT Education and Certification at IDC, you’ll hear about key learnings from research that highlight the business value of skills training for accelerating success. You’ll also explore how to use these findings to build a compelling business case for developing skills training programs in your organization.

Watch this event on-demand to:

◉ Get an overview of the cloud enablement tools, programs, and frameworks available to help you realize your goals on Azure.

◉ See these resources in action. Hear success stories from customers like KPMG who have used Azure enablement resources to build, optimize, and achieve ongoing value in the cloud.

◉ Hear insights from Microsoft product experts as they answer questions from the Azure community during the Q and A.

The live event may be over, but you still have the chance to learn and explore at your own pace, on your own time. Discover how to quickly access and use the right set of Azure enablement tools for your specific needs—and pave the way for ongoing success in the cloud. 

Watch now.

Source: azure.microsoft.com

Thursday, 4 August 2022

5 steps to prepare developers for cloud modernization

If you’re thinking about what it takes to modernize your applications, you’re not alone. Companies everywhere now understand that migrating applications to the cloud and shifting to a cloud-first approach is critical to business competitiveness. The purpose of modernizing applications is to better align them to current and future business needs. By deploying enterprise applications to the cloud, you gain greater ability to innovate, improve security, scale to meet demand, manage costs, and deliver rich and consistent customer experiences anywhere in the world more quickly.

But as you move to the cloud, there are many options to choose from and skills to gain. One of the most important parts of this effort is understanding how to prepare developers for cloud modernization—and one of the trickiest parts is knowing where to start.

According to research on Developer Velocity, the number one driver of business performance is best-in-class developer tools. Companies that create the right environment—by providing strong tools and removing points of friction for developers to innovate—have 47 percent higher developer satisfaction and retention rates than those in the lowest quartile for Developer Velocity. With Microsoft Azure, you’ll find not only the tools and technologies that you need to move to the cloud, but also extensive developer support for cloud modernization.

In this article, we’ll walk you through technical documentation, educational resources, and step-by-step guidance to help you build the skills and strategy needed to successfully modernize your applications. We use Azure App Service as our example, but the same concepts apply to other tools you might use in your modernization efforts.

Here are five steps to take to start preparing for cloud modernization:

1. Watch how application migration works.

Migrating existing, on-premises applications to the cloud is often the focus of initial application modernization efforts. Once the business case has been made to migrate an application to the cloud, you’ll need to assess the application for all the dependencies that can affect whether it can be successfully migrated without modifying the application. In the case of App Service, a migration assistant guides you through the assessment. Then, if the assessment indicates that the application can be migrated, the migration assistant performs the migration. To get an introduction to how the assessment and migration process works, watch the overview video on how to migrate web apps to App Service.

2. Learn to migrate an on-premises application to the cloud.

The best way to understand what it takes to migrate an application is to try it for yourself. To learn how to migrate an on-premises web application to App Service, take the step-by-step online course—including a hands-on lab—that guides you through migration and post-migration. Using a sandbox environment and access to free resources, you’ll get an in-depth walkthrough of how to migrate your web application, from assessment through post-migration tasks. You’ll also get background on why the assessment phase is so important, what types of problems it’s intended to identify, and what to do if any problems are found. Next, the course takes you through the migration process and provides guidance on the settings you’ll need to choose from, and it prepares you for additional tasks that might be necessary to get the web app in working order.

3. Build a web app in the language of your choice.

Learning how to build a cloud-native application is another important step in preparing yourself to shift to a cloud-first approach. To give it a try, sign up for an Azure free account, which gives you access to dozens of free services, including App Service. Along with access to a wide range of cloud resources, you get developer support for cloud modernization through quickstart guides that walk you through creating and deploying a web app in App Service using the language of your choice, including .NET, Node.js, Java, Python, and other languages. This is also a great time to explore other Azure cloud capabilities and use the $200 credit that you get with the Azure free account.

4. Assess your own web apps for modernization readiness.

Once you understand the basics of migrating and deploying applications in the cloud, it’s time to get to work on the process of assessing and migrating your own web apps. Use the free App Service migration tool to run a scan on your web app’s public URL. The tool will provide you with a compatibility report on the technologies your app uses and whether App Service fully supports them. If compatible, the tool will guide you through downloading the migration assistant, which simplifies migration in an automated way with minimal or no code changes.
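The compatibility scan can be pictured as comparing the technologies detected on your site against those the target platform supports, then reporting any blockers. This is a hypothetical sketch of that logic only; the supported-technology list and function names are illustrative, not the App Service migration tool's actual implementation.

```python
# Hypothetical sketch of a migration readiness check: flag any detected
# technology the target platform does not support. The SUPPORTED set is
# illustrative, not App Service's real compatibility matrix.

SUPPORTED = {"asp.net", "php", "static html", "node.js"}

def assess(detected_technologies):
    """Return (ready, blockers) for a simple readiness report."""
    blockers = [t for t in detected_technologies
                if t.lower() not in SUPPORTED]
    return (not blockers, blockers)
```

A clean report would then hand off to the migration assistant, while any blockers would need remediation (or a different hosting target) before migrating.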

5. Download the App Migration Toolkit.

With a solid background in how to prepare for modernization, you’re in a good position to start putting the full range of Azure developer support for cloud modernization to work. Download the App Migration Toolkit to find the resources you need to successfully modernize your ASP.NET applications from start to finish. From building your business case to best practices and help gaining skills, the toolkit provides practical guidance and support to help you turn your application modernization plans into reality.

While application modernization is a significant initiative that requires strategy, planning, skill-building, and investment of time and resources, the benefits to the business are worth the effort. Fortunately, Azure simplifies the process of figuring out how to prepare developers for cloud modernization. The App Migration Toolkit gives you the skills and knowledge needed to help your organization innovate and stay competitive. 

Source: microsoft.com

Tuesday, 2 August 2022

How Microsoft Azure Cross-region Load Balancer helps create region redundancy and low latency

In this blog, we’ll walk through Microsoft Azure Cross-region Load Balancer (also known as the Global tier of Standard Load Balancer) through a case study with a retail customer. By incorporating Azure Cross-region Load Balancer into their end-to-end architecture, the customer was able to achieve region redundancy, high availability, and low latency for their end applications with a quick turnaround time for scaling events while retaining their IPs for TCP and UDP connections. A DNS-based global load balancing solution was considered but not adopted due to long failover times caused by DNS time-to-live (TTL) values not being honored by clients.

Low latency with geo-proximity-based routing algorithm



Figure 1: With Azure Load Balancer, all traffic is routed to a backend server chosen pseudo-randomly by a 5-tuple hash.

Figure 2: With Cross-region Load Balancer, traffic is routed to the closest regional deployment.

With the previous setup, all traffic, regardless of source IP location, was first forwarded to the load balancer’s region. This could take several hops across data centers, introducing additional latency to network requests. With Azure Cross-region Load Balancer’s geo-proximity-based routing, end customers are routed to the closest regional deployment, which dramatically improves latency.
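The two routing behaviors in Figures 1 and 2 can be sketched side by side: a regional load balancer spreads flows across its backends by hashing the connection's 5-tuple, while the cross-region tier picks the regional deployment nearest the client. The one-dimensional distances below are illustrative stand-ins for real geo-proximity data.

```python
# Sketch of the two routing decisions: 5-tuple hashing within a region
# versus nearest-region selection across regions. Distances and region
# names are illustrative, not Azure's routing tables.

def pick_backend_by_hash(flow, backends):
    """flow = (src_ip, src_port, dst_ip, dst_port, protocol)."""
    return backends[hash(flow) % len(backends)]

def pick_region_by_proximity(client_pos, regions):
    """regions: {name: position}; choose the deployment nearest the client."""
    return min(regions, key=lambda name: abs(regions[name] - client_pos))
```

Note that the hash keeps packets of one connection pinned to one backend (the same 5-tuple always hashes the same way), while proximity routing is what saves the extra cross-datacenter hops described above.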

Automatic failover for disaster recovery




Figure 3: With Standard SKU Load Balancer, when the only regional deployment or the Load Balancer goes down, all traffic can be impacted.

Figure 4: Cross-region Load Balancer ensures seamless failover for disaster recovery.

Even though Standard Load Balancer offers zone redundancy, it is a regional resource. If a regional outage causes the Load Balancer or all the backend servers to become unavailable, traffic arriving at the Load Balancer frontend cannot be forwarded, and the website becomes unavailable to end customers. By adding a Cross-region Load Balancer on top of several existing regional deployments, the customer now has region redundancy, which ensures high availability of their end application. If web server one goes down, the end customer's traffic is re-routed to web server two so that no packets are dropped.
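The failover behavior can be sketched as a health-gated routing decision: the cross-region tier forwards only to healthy regional frontends, so a regional outage results in a reroute rather than dropped packets. The health states below are illustrative booleans, not real health-probe results.

```python
# Sketch of cross-region failover: pick the first healthy regional
# deployment in preference order; a down region is simply skipped.

def route(regions_in_priority_order, healthy):
    """Return the first healthy region, or None if every region is down."""
    for region in regions_in_priority_order:
        if healthy.get(region):
            return region
    return None
```

In practice the preference order comes from geo-proximity and health from continuous probing, but the key property is the same: failover requires no DNS change on the client side, which is what avoids the TTL problem mentioned earlier.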

Scale up and down with no downtime



Figure 5: Easy scaling when using Microsoft Azure Virtual Machine Scale Sets (VMSS) combined with Cross-region Load Balancer.

Like many other industries, the retail industry faces frequent changes in traffic volume due to seasonality and other spontaneous trends. As a result, the customer’s top concern is the ability to scale up and down in real time. There are two ways to achieve this today with a Cross-region Load Balancer. One way is to directly add or remove a regional Public Load Balancer behind the Cross-region Load Balancer. Another way is to use Microsoft Azure Virtual Machine Scale Sets with a pre-configured autoscaling policy.
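The second option relies on an autoscaling policy of the kind Virtual Machine Scale Sets support: threshold rules bounded by minimum and maximum instance counts. The CPU thresholds and bounds below are illustrative defaults chosen for the sketch, not Azure's.

```python
# Sketch of a threshold-based autoscale rule: add an instance under
# high average load, remove one under low load, stay within bounds.
# Threshold and bound values are illustrative, not Azure defaults.

def autoscale(current_instances, avg_cpu, min_n=2, max_n=10,
              scale_out_at=70, scale_in_at=30):
    if avg_cpu > scale_out_at and current_instances < max_n:
        return current_instances + 1
    if avg_cpu < scale_in_at and current_instances > min_n:
        return current_instances - 1
    return current_instances
```

Because the backend pool of the regional load balancer tracks the scale set automatically, instances added or removed by such a policy serve (or stop serving) traffic with no downtime for the application.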

Zero friction for adoption


Azure Load Balancer has been an important part of the customer’s end-to-end architecture for stable connectivity and smart load balancing. By leaving the existing network architecture as is and simply adding a Cross-region Load Balancer on top of the existing load balancer setup, the customer avoids any additional overhead or friction from the addition of a Cross-region Load Balancer.

Client IP preservation


Cross-region Load Balancer is a Layer-4 pass-through network load balancer, which means it preserves the original source IP address of each network packet. IP preservation allows you to apply logic in the backend server that is specific to the original client IP address.
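The difference between pass-through and proxying behavior can be sketched with toy packets: the pass-through path rewrites only the destination, so the backend still sees the client's source address, while a proxying balancer substitutes its own. The addresses and the dict packet model below are purely illustrative.

```python
# Sketch of why Layer-4 pass-through preserves the client IP:
# only the destination is rewritten on the way to the backend.
# A proxying balancer, by contrast, re-sends from its own address.

def forward_passthrough(packet, backend):
    out = dict(packet)
    out["dst"] = backend          # source address untouched
    return out

def forward_proxy(packet, backend, lb_addr):
    out = dict(packet)
    out["src"] = lb_addr          # backend now sees the LB's address
    out["dst"] = backend
    return out
```

With pass-through forwarding, backend logic such as geo-blocking, rate limiting, or audit logging by client address works without needing an `X-Forwarded-For`-style workaround.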

Source: microsoft.com