Tuesday, 30 November 2021

SONiC opens more opportunities for network innovation

SONiC, an open-source operating system for network devices, has been growing rapidly over the last five years. The Gartner Market Guide for Data Center Switching, published early this year, predicts: “By 2025, 40 percent of organizations that operate large datacenter networks (more than 200 switches) will run SONiC in production environments.” And, “due to this rapidly expanding customer interest and commercial ecosystem, there is a strong possibility that, during the next three to six years, SONiC will become analogous to Linux as a server operating system, allowing enterprises to standardize on a NOS that is supported across hardware vendors.”

We have been working with many partners over the past year on innovations extending SONiC to new scenarios. Let’s look at what was showcased at the OCP Global Summit this month, and the opportunities SONiC enables.

Enable high-reliability dual ToR support with smart cable

High availability is a never-ending pursuit for network engineers. Delivering packets for customers without a glitch is a simple ask, but a challenging promise given all the possible failures along the path. Research shows the critical role of network infrastructure: each switch has a 2 percent chance of suffering a failure within three months of deployment, with 32 percent of failures attributed to hardware faults and 27 percent to unplanned power outages. The classical way to improve the reliability of a path is to add redundancy to reduce the impact of hardware failure. This year the SONiC community developed an innovative way to provide dual ToR (Top of Rack) connectivity to customer VMs. This SONiC-based approach does not require adding NICs to existing servers and avoids the traditional MLAG (Multi-Chassis Link Aggregation) mechanism, which is prone to split-brain failure. The secret is inside the cable: instead of a conventional Y cable, the new smart cable contains a microcontroller and a hitless MUX. The intelligence sits in the SONiC ToR switches, which manage the MUX inside the smart cable, determine the traffic path for the server, and handle failover rapidly. Measurements show this approach delivers dual connectivity through a smart cable and SONiC switches with a failover time of less than 1µs. This capability is available in the SONiC 20201230 release. Microsoft, Broadcom, Credo, and many other companies have contributed to it.
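As a rough illustration of the control logic, the arbitration between the two ToRs and the MUX can be modeled as follows. All class and method names here are hypothetical; this is a minimal sketch of active/standby selection, not the actual SONiC implementation:

```python
# Hypothetical sketch of dual-ToR MUX arbitration (names are illustrative,
# not the actual SONiC implementation).

class SmartCableMux:
    """Models the MUX inside the smart cable: exactly one ToR is active."""

    def __init__(self):
        self.active = "tor_a"  # default active path

    def switch_to(self, tor):
        self.active = tor


class DualTorController:
    """Each SONiC ToR monitors link health; on failure of the active path,
    the standby ToR takes over by toggling the MUX in the cable."""

    def __init__(self, mux):
        self.mux = mux
        self.healthy = {"tor_a": True, "tor_b": True}

    def report_link_state(self, tor, is_up):
        self.healthy[tor] = is_up
        if not is_up and self.mux.active == tor:
            self.failover()

    def failover(self):
        standby = "tor_b" if self.mux.active == "tor_a" else "tor_a"
        if self.healthy[standby]:
            self.mux.switch_to(standby)


mux = SmartCableMux()
ctrl = DualTorController(mux)
ctrl.report_link_state("tor_a", False)  # active ToR loses its uplink
```

Because the MUX itself is hitless and the decision is local to the ToR pair, there is no MLAG-style coordination protocol to split-brain.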

Figure 1: Dual ToR support through smart cable and SONiC switches.

DASH enables limitless networking


The programmable hardware ecosystem (smart NICs, smart ToRs, smart appliances) has been booming over the last two years and will continue to grow. The performance and customizability of such devices are outstanding. This year, the SONiC community launched a new workstream—DASH (Disaggregated APIs for SONiC Hosts)—to capitalize on this for limitless networking. The initial goal is to improve the L4 performance and connection scale of Software Defined Networking (SDN) operations by 10 to 100 times over software implementations. DASH leverages modern high-speed SmartNIC hardware to accelerate flow processing, changing the game for implementing the SDN data plane. The first set of overlay and underlay SAI APIs for VNET-to-VNET connectivity has been defined, and the test design is under active discussion. We foresee many applications that will benefit from DASH innovation, for example, encryption gateways with high-speed inline encryption and key management, load balancers, service tunneling, and more. The open-source nature of SONiC enables the flexibility to customize for individual use cases. We standardize APIs through SAI (Switch Abstraction Interface) to ensure interoperability across various programmable hardware. The solution inherits SONiC’s comprehensive monitoring, diagnostic capabilities, and reliability features such as hitless upgrades and container management for free. Nvidia, Pensando, Intel, and many other partners are actively contributing to the program.
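The core performance idea, caching the result of an expensive SDN policy evaluation so that subsequent packets of a flow take a hardware fast path, can be sketched as follows. All names are hypothetical; this models the flow-offload concept rather than the actual DASH pipeline:

```python
# Conceptual sketch of SmartNIC-style flow offload: the first packet of a
# flow goes through full policy evaluation (slow path); the result is
# cached so the rest of the flow is served from the flow table (fast path).
# Names are hypothetical, not the actual DASH pipeline.

def slow_path_policy(flow_key):
    """Full SDN policy evaluation (e.g. VNET mapping, ACLs); expensive."""
    src, dst = flow_key
    return {"action": "vxlan_encap", "tunnel_dst": f"host-of-{dst}"}


class FlowCache:
    def __init__(self):
        self.table = {}          # established flows, served in hardware
        self.slow_path_hits = 0  # how often we fell back to full evaluation

    def process(self, flow_key):
        if flow_key not in self.table:
            self.slow_path_hits += 1
            self.table[flow_key] = slow_path_policy(flow_key)
        return self.table[flow_key]


nic = FlowCache()
for _ in range(1000):  # 1,000 packets of the same flow
    result = nic.process(("10.0.0.4", "10.0.0.9"))
```

In hardware, the flow table holds millions of entries and the fast path never touches host software, which is where the 10x to 100x connection-scale gains come from.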

Figure 2: Seven initial DASH scenarios.

PINS—extending SONiC with programmability


There are two ways to run a network—distributed or centralized—and each has its strengths. In the distributed model, each switch has the intelligence to discover neighbors, build the routing table, and react to topology changes. Such a network can scale and self-heal rapidly if there is a failure. In the centralized model, a dedicated external control system builds up the topology and programs it onto the switch nodes in the network. The network behavior is deterministic and easy to debug, and it enables optimal traffic engineering. In the past year, the PINS (P4 Integrated Network Stack) community and the SONiC community have been working together to integrate SONiC with PINS. This enables a customer to build a centrally controlled SDN (Software Defined Network) with SONiC switches. The SDN controller programs the network devices (SONiC switches) through P4Runtime, against a behavior model, or contract, defined in P4, which enables extensions to the runtime. This initiative gives SONiC users many choices for building their network with a rich hardware ecosystem. The minimum viable product will be in the SONiC 20211130 release, with more L2/L3 functionality coming in the following releases.
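The centralized model can be sketched as a controller pushing forwarding entries to the switches it manages, loosely in the style of P4Runtime table writes. This is an illustrative mock, not the real P4Runtime gRPC API:

```python
# Illustrative mock of a centralized controller programming switch tables,
# loosely in the style of P4Runtime (not the real gRPC API).

class MockSwitch:
    """A SONiC/PINS switch as seen by the SDN controller."""

    def __init__(self, name):
        self.name = name
        self.tables = {}  # table name -> {match key: action}

    def write(self, table, match, action):
        self.tables.setdefault(table, {})[match] = action


class Controller:
    """Centralized model: one control system programs every switch, so
    forwarding behavior is deterministic and easy to inspect."""

    def __init__(self, switches):
        self.switches = switches

    def install_route(self, prefix, next_hop):
        for sw in self.switches:
            sw.write("ipv4_lpm", prefix, ("set_next_hop", next_hop))


switches = [MockSwitch("leaf1"), MockSwitch("leaf2")]
Controller(switches).install_route("10.1.0.0/16", "10.0.0.1")
```

The deterministic quality of the centralized model shows up here: the controller can read back every switch's tables and know exactly what the network will do.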

Source: microsoft.com

Thursday, 25 November 2021

Advancing service resilience in Azure Active Directory with its backup authentication service

The most critical promise of our identity services is ensuring that every user can access the apps and services they need without interruption. We’ve been strengthening this promise to you through a multi-layered approach, leading to our improved promise of 99.99 percent authentication uptime for Azure Active Directory (Azure AD). Today, I am excited to share a deep dive into generally available technology that allows Azure AD to achieve even higher levels of resiliency.

The Azure AD backup authentication service transparently and automatically handles authentications for supported workloads when the primary Azure AD service is unavailable. It adds an additional layer of resilience on top of the multiple levels of redundancy in Azure AD. You can think of it as a backup generator or uninterruptible power supply designed to provide additional fault tolerance while staying completely transparent and automatic to you. This system operates in the Microsoft cloud but on separate and decorrelated systems and network paths from the primary Azure AD system. This means that it can continue to operate in case of service, network, or capacity issues across many Azure AD and dependent Azure services.

What workloads are covered by the service?

This service has been protecting Outlook Web Access and SharePoint Online workloads since 2019. Earlier this year we completed backup support for applications running on desktops and mobile devices, or “native” apps. All Microsoft native apps including Office 365 and Teams, plus non-Microsoft and customer-owned applications running natively on devices are now covered. No special action or configuration changes are required to receive the backup authentication coverage.

Starting at the end of 2021, we will begin rolling out support for more web-based applications. We will phase in apps using OpenID Connect, starting with Microsoft web apps like Teams Online and Office 365, followed by customer-owned web apps that use OpenID Connect and Security Assertion Markup Language (SAML).

How does the service work?

When a failure of the Azure AD primary service is detected, the backup authentication service automatically engages, allowing the user’s applications to keep working. As the primary service recovers, authentication requests are re-routed back to the primary Azure AD service. The backup authentication service operates in two modes:

◉ Normal mode: The backup service stores essential authentication data during normal operating conditions. Successful authentication responses from Azure AD to dependent apps generate session-specific data that is securely stored by the backup service for up to three days. The authentication data is specific to a device-user-app-resource combination and represents a snapshot of a successful authentication at a point in time.

◉ Outage mode: Any time an authentication request fails unexpectedly, the Azure AD gateway automatically routes it to the backup service. The backup service then authenticates the request, verifies that the artifacts presented (such as a refresh token or session cookie) are valid, and looks for a strict session match in the previously stored data. An authentication response, consistent with what the primary Azure AD system would have generated, is then sent to the application. Upon recovery, traffic is dynamically re-routed back to the primary Azure AD service.
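The two modes above can be sketched as a small store keyed on the device-user-app-resource tuple: record successes in normal mode, and grant only strict, unexpired matches during an outage. This is a hypothetical simplification; the real service's storage and matching are far richer:

```python
import time

# Hypothetical sketch of the backup authentication store (illustrative
# only): successful authentications are recorded in normal mode, and only
# strict, unexpired matches are replayed during an outage.

THREE_DAYS = 3 * 24 * 3600  # retention window for stored session data


class BackupAuthService:
    def __init__(self, now=time.time):
        self.now = now
        self.store = {}  # (device, user, app, resource) -> timestamp

    def record_success(self, device, user, app, resource):
        """Normal mode: store session-specific data for up to three days."""
        self.store[(device, user, app, resource)] = self.now()

    def authenticate_during_outage(self, device, user, app, resource):
        """Outage mode: grant only an exact match within the window."""
        ts = self.store.get((device, user, app, resource))
        return ts is not None and self.now() - ts <= THREE_DAYS


svc = BackupAuthService()
svc.record_success("laptop1", "alice", "outlook", "mail")
```

The strict tuple match is why a "new" authentication from a different device cannot be served from the backup store during an outage.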

Routing to the backup service is automatic and its authentication responses are consistent with those usually coming from the primary Azure AD service. This means that the protection kicks in with no need for application modifications, nor manual intervention.

Note that the priority of the backup authentication service is to keep users productive with access to apps and resources where authentication was recently granted. Such requests make up most of the traffic to Azure AD—93 percent, in fact. “New” authentications, beyond the three-day storage window or where access was not recently granted on the user’s current device, are not currently supported during outages; however, most users access their most important applications daily from a consistent device.

How are security policies and access compliance enforced during an outage?


The backup authentication service continuously monitors security events which affect user access to keep accounts secure, even if these events are detected right before an outage. It uses Continuous Access Evaluation to ensure the sessions that are no longer valid are revoked immediately. Examples of security events that would cause the backup service to restrict access during an outage include changes to device state, account disablement, account deletion, access being revoked by an admin, or detection of a high user risk event. Only once the primary authentication service has been restored would a user with a security event be able to regain access.

In addition, the backup authentication service enforces Conditional Access policies. Policies are re-evaluated by the backup service before granting access during an outage to determine which policies apply and whether the required controls for applicable policies like multi-factor authentication (MFA) have been satisfied. If an authentication request is received by the backup service and a control like MFA has not been satisfied, then that authentication would be blocked.

Conditional Access policies that rely on conditions such as user, application, device platform, and IP address are enforced using real-time data as detected by the backup authentication service. However, certain policy conditions (such as sign-in risk and role membership) cannot be evaluated in real-time, and are evaluated based on resilience settings. Resilience defaults enable Azure AD to safely maximize productivity when a condition (such as group membership) is not available in real-time during an outage. The service will evaluate a policy assuming that the condition has not changed since the latest access just before the outage.

While we highly recommend customers to keep resilience defaults enabled, there may be some scenarios where admins would rather block access during an outage when a Conditional Access condition cannot be evaluated in real-time. For these rare cases, administrators can disable resilience defaults per policy within Conditional Access. If resilience defaults are disabled by policy, the backup authentication service will not serve requests that are subject to real-time policy conditions, meaning those users may be blocked by a primary Azure AD outage.
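The effect of resilience defaults can be sketched as a fallback in policy evaluation: when a condition cannot be fetched in real time, reuse the last known value if defaults are enabled, and fail closed if they are disabled. Names and logic here are illustrative only:

```python
# Hypothetical sketch of Conditional Access evaluation during an outage,
# with and without resilience defaults (illustrative only).

def evaluate_policy(condition_value, last_known_value, resilience_defaults):
    """condition_value is None when the condition (e.g. group membership
    or sign-in risk) cannot be evaluated in real time during the outage."""
    if condition_value is None:
        if not resilience_defaults:
            return "blocked"  # defaults disabled: fail closed
        # Defaults enabled: assume the condition is unchanged since the
        # latest access just before the outage.
        condition_value = last_known_value
    return "granted" if condition_value == "compliant" else "blocked"


# Outage scenario: the condition is unavailable in real time.
with_defaults = evaluate_policy(None, "compliant", resilience_defaults=True)
without_defaults = evaluate_policy(None, "compliant", resilience_defaults=False)
```

This is the trade-off the section describes: defaults enabled maximizes productivity during an outage, while disabling them per policy trades availability for strict real-time enforcement.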

Source: microsoft.com

Tuesday, 23 November 2021

Microsoft expands cloud services with two new datacenters in Wyoming

Today, I’m thrilled to share that we are announcing the launch of two new Microsoft datacenters in Cheyenne, Wyoming—one in Cheyenne Business Parkway and another in Bison Business Park—enabling us to expand and support the growth and demand for our digital services in our West Central US datacenter region. Cheyenne has been home to Microsoft’s cloud infrastructure services since 2012 and this expansion will enable us to continue providing services to current and new customers.

The two new datacenters will be built with sustainable design and operations in mind, contributing to Microsoft’s commitment to being carbon negative. We’re engaging in significant water conservation efforts, and preserving Cheyenne’s water resources is a top priority for us. Our current and new datacenters will use adiabatic cooling, which uses outside air instead of water for cooling when temperatures are below 85 degrees Fahrenheit. This process uses less electricity and up to 90 percent less water than other water-based cooling systems. We’ve already invested over $500K in sustainability grants to local organizations that help preserve Wyoming’s natural environment, including Crow Creek Revival, which aims to promote, enhance, restore, and revitalize the region’s key watershed.

Not only will our datacenters be built with sustainability top of mind, but Microsoft’s presence in Wyoming will also have a real economic and community impact on the residents. We currently employ over 200 employees and vendors in the region and this expansion will enable us to create even more jobs for the community, including more than 700 positions during peak construction. In addition to building datacenters, Microsoft is investing in new water, sewer, and road infrastructure to create easier access to Bison Business Park, which will also support the growth of new businesses in Wyoming.

“The growth of the data center industry in Wyoming has been led by Microsoft and it is coming to represent a significant sign of the continued diversification of our economy. I appreciate Microsoft's commitment to Wyoming and thank them for the benefits they have brought to multiple sectors of our economy,” said Wyoming Governor Mark Gordon. “Governor Mead was the inspiration for bringing the first Microsoft Data Center to the state in 2012. The incentives that set this train in motion are working. This is a sector of our growing economy that continues to pick up steam. More and more companies are looking to build in Wyoming in the coming years.”

“Additional datacenters are a great win for Cheyenne and all of Laramie County,” said Cheyenne Mayor Patrick Collins. “With it comes more high tech opportunities, a skilled workforce, and expands upon Microsoft’s existing economic impact to our community.”

Expanding Microsoft’s digital capabilities in Wyoming will allow us to meet the demand for new and existing customers in the region, and we’re excited to continue supporting the growth of diverse businesses in the state and look forward to working with state and local leaders on initiatives that can foster job creation and economic opportunity.

Investing in the community

Through Microsoft TechSpark and our Datacenter Community Development program we’ve worked closely with Wyoming organizations to create solutions that can benefit the state.

◉ K-12 computer science opportunity: We’ve partnered with the Wyoming Department of Education to host ongoing computer science training for more than 30 school districts across the state, impacting more than 60,000 K-12 students. We’re also collaborating with Code.org and the University of Wyoming to build the capacity of hundreds of K-12 teachers to offer computer science education to thousands of students across the state. Microsoft’s TEALS high school computer science program has also partnered with six Wyoming high schools to build high-quality computer science education for approximately 500 students in Laramie, Gillette, Douglas, Casper, and Buffalo.

◉ Quality job creation: Microsoft has invested over $500K to build a technology sector in Wyoming that focuses on job creation, workforce development and fosters the Wyoming startup community. Through a startup accelerator, named “gBETA Cheyenne”, we’ve helped 15 Wyoming startups raise nearly $3 million in investment capital to grow their companies and local technology jobs. We’ve also sponsored startup competitions in Laramie, Cheyenne and the Wind River Reservation to bring attention and mentoring to budding Wyoming startups.

◉ Increasing broadband access: We’re focused on making the internet more accessible for communities, which opens the way to even more educational and economic opportunities. That’s why we’ve invested over $350,000 to launch free WiFi at key community locations such as the Downtown Cheyenne Depot, COMEA Shelter, and two Laramie County Public Library locations, helping more than 100,000 visitors and residents annually stay connected. Through Microsoft’s Airband program, we’ve also partnered with Vistabeam to provide increased broadband access to 116,600 Wyomingites in the region, including a previously unserved rural population of 19,000 that had no access to broadband.

◉ Upskilling opportunities: We’ve invested over a million dollars to launch three adult upskilling programs to help respond to COVID-19 challenges, including a Microsoft Datacenter Academy at Laramie County Community College, a tech apprenticeship program with the Array School of Technology and Design, as well as the Upskill Wyoming program. These opportunities provide Wyoming citizens with scholarships and upskilling opportunities that allow them to pursue better-paying remote and in-region jobs.

Source: microsoft.com

Sunday, 21 November 2021

Discover what’s new to Microsoft database services—recap from Microsoft Ignite

Whether it’s SQL Server, Azure SQL, open source, or NoSQL, Microsoft’s database services are here to help you build cloud-native applications or modernize existing applications with fully managed, flexible databases.

Last week at Microsoft Ignite, there were many announcements to help organizations regardless of where they are in their modernization journey. Our team is invested, and I want to share how these announcements are helping us reach every customer on the planet, modernize existing applications, and provide limitless database scale and performance for new applications.

Reach every customer on the planet


We announced the preview of SQL Server 2022, the most Azure-enabled SQL Server release yet. This enables easier cloud integration than ever before with new disaster recovery functionality for Azure SQL Managed Instance and seamless analytics over on-premises operational data with Synapse Link for SQL Server. We continue to enhance the database engine including new security capabilities like SQL Server Ledger, and built-in query intelligence with Query Store and the third generation of Intelligent Query Processing enhancements.

In addition to announcing a new version of SQL Server, we have new capabilities in Azure Arc. With Azure Arc-enabled data services, including SQL Managed Instance and PostgreSQL Hyperscale, you can get evergreen, managed services anywhere you are. Last week, we announced the general availability of point-in-time-restore (PITR) and in-place upgrade for Azure Arc-enabled SQL Managed Instance. There’s also a new Microsoft Learn module if you’re new to running Azure services anywhere.

Modernize existing applications


Azure is the best destination for SQL Server, and more and more customers are running SQL Server on Azure Virtual Machines. Recently, we’ve added new and improved SQL Server on Azure Virtual Machines marketplace images. The new images have updated configurations to adhere to the latest best practices, along with a new deployment experience that allows you to customize settings, such as MAXDOP and collation, according to your needs. Even more, SQL Assessments have arrived in preview so you can keep your SQL Server on Azure Virtual Machine healthy. You can schedule assessments, run an assessment on-demand, and view assessment results directly from the SQL virtual machine blade in the Azure portal.

Azure SQL Managed Instance had several key announcements, including hardware offers with significantly more memory per vCore (up to 13.6 GB per vCore) and increased storage up to 16 TB in both the general purpose (general availability) and business critical (preview) service tiers, providing more room for app growth. Also, the new Link feature for Azure SQL Managed Instance enables SQL Server customers to use Azure SQL Managed Instance for disaster recovery and restore Azure SQL Managed Instance databases to SQL Server. Other highly anticipated capabilities are now available as well, including Windows (Kerberos) Authentication (limited preview), AAD-only authentication, AAD authentication for Linked Server, BULK INSERT and BACKUP/RESTORE with Managed Identity credentials, distributed transactions between SQL Managed Instances, and much more.

When it comes to open source, Flexible Server, which is available for Azure Database for MySQL (now generally available) and for Azure Database for PostgreSQL (preview), is a new deployment option hosted on native Linux and designed to provide more granular control and flexibility. Flexible Server provides maximum control of your databases, high availability options to ensure zero data loss, built-in capabilities for cost optimization, and increased productivity enabled by integrations with the Azure ecosystem. It also supports customer-requested capabilities such as custom maintenance windows, zone-redundant high availability, and cost optimization features like burstable compute, start/stop, and reserved instances. It’s no surprise that customer adoption has been so strong.

Last week, we also announced the general availability of Azure Managed Instance for Apache Cassandra providing the ability to provision managed native Apache Cassandra clusters, with automated deployment and scaling operations, accelerating hybrid scenarios and reducing ongoing maintenance.

Provide limitless database scale and performance


In the NoSQL space, Azure Cosmos DB is a fully managed NoSQL database built to support production applications at any size or scale. At Microsoft Ignite, we released many new capabilities such as partial document updates, provisioned throughput maximums, and cost-savings recommendations in Azure Advisor—all generally available. In addition, Azure Synapse Link custom partitioning, Cassandra API server-side retries, and a Logic App connector are now in preview. 

Part of the Azure SQL family, Azure SQL Database is an intelligent and scalable relational database built for the cloud. With serverless compute and Hyperscale storage options that automatically scale resources on demand, we’re enabling you to focus on building applications. Last week, we released configurable differential backups as generally available and introduced memory grant feedback improvements in preview.

Last but certainly not least, we announced the general availability of Postgres 14 on Azure’s Hyperscale (Citus) option. This may be the first time a major cloud provider has announced the general availability for a new Postgres major version on their platform one day after the official release. We also shipped the Citus 10.2 open source release and made it available in Hyperscale (Citus) in the Azure portal. Another key announcement is Azure Private Link support for Hyperscale (Citus) (preview): you can now connect your Hyperscale (Citus) nodes to the Azure virtual network securely and privately.

Source: microsoft.com

Saturday, 20 November 2021

Microsoft and AT&T are accelerating the enterprise customer’s journey to the edge with 5G

Today, we find ourselves at a pivotal moment that’s impacting many enterprise customers’ digital transformation needs. In this place where cloud meets the edge, compute meets mobile, and 5G trends continue to drive innovation—customer demand for advanced network capabilities is surging. For customers, the promise of all these converging technologies is still the ability to create and use innovative solutions and experiences to keep pace with a rapidly evolving digital landscape.

As a result, enterprises are migrating mission-critical workloads to the cloud to better serve their customers. The technologies that are involved are complex, and companies are looking to providers to not merely sell them products, but to help them deliver innovation while developing ever-greater capabilities. With new use cases and connected devices becoming ubiquitous, those enterprises are requiring new edge application solutions close to the end users to help them build innovative solutions within industries as diverse as gaming, automotive, healthcare, manufacturing, and more.

Sign up for the Azure Edge Zones with AT&T private preview today.

Microsoft and AT&T’s deep collaboration meets these needs by supporting our mutual customers’ digital transformation and evolution. We’re bringing our collective cloud and network technologies and expertise to light in areas such as 5G, AI, and IoT—to improve the ways in which people live and work.

We’ve already made considerable progress. In June, we hit a major milestone when we announced an industry-first collaboration to adopt Microsoft cloud technology for AT&T’s 5G core network workloads. This enables AT&T to increase productivity, reduce costs, and deliver innovative services that meet its customers’ evolving needs. We’re also leading development of new solutions that will help enterprises lower costs while increasing efficiency, reliability, and security at the edge of their premises and facilities through capabilities such as AT&T-enabled Azure Sphere and Guardian module, and AT&T MEC with Azure.

Microsoft Azure is available in more than 60 regions, more than any other cloud provider, making it easy for enterprises to choose the datacenter and regions that are right for them and their customers. For densely populated metros where enterprises need low-latency compute resources, we extend the capability of Azure to the operator’s 5G network edge. Azure Edge Zones with AT&T can dramatically improve an application’s performance while reducing overhead and complexity for enterprise customers. A selected set of Azure services deployed at the edge, directly connected to AT&T’s 5G core, enables latency-sensitive enterprise scenarios through optimized routing from the Azure Edge Zones with AT&T to the AT&T mobility network. This enables developers to innovate richer applications with lower latency, higher throughput, and greater reach.

Journey to the mobile network edge


Innovative enterprise customers are exploring ways to combine 5G’s next-generation network capabilities with the power of applications deployed closer to the customer at the network edge. For example, in music, we’re enabling new experiences not previously possible through virtual jam sessions that feel like musicians playing side by side—reducing latency and unleashing creativity. Working with JamKazam to power audio and video streaming for musicians online, AT&T’s low-latency solution using the edge and 5G helped the band The Perfect Nines jam without the constraints of crowded home Wi-Fi networks. Check out the JamKazam video to learn more. In another example, the AT&T 5G Innovation Studio collaborated with Microsoft Azure and EVA to deliver an important advancement for U.S.-based autonomous drones. By creating a unique test environment representative of the Microsoft Azure Edge Zone with AT&T, the low latency of 5G combined with EVA’s app deployed at the network edge with Azure cloud services enabled autonomous drone control beyond visual line of sight.

Powerful relationships can also provide numerous benefits to consumers, as evidenced by General Motors’ recent collaboration with AT&T, supported by Microsoft’s cloud services, which will improve the quality of the in-vehicle experience for drivers by offering improved roadway-centric coverage, higher quality music and video downloads, more reliable and secure over-the-air software updates, as well as faster navigation, mapping, and voice services.

Following the successful proof of concept in Los Angeles and the other positive developments noted above, we are excited about our Azure Edge Zones with AT&T private preview in Atlanta. Azure Edge Zones with AT&T in Dallas and other metros will soon follow. The momentum is building, and your imagination is the only limit to future offerings. With Microsoft and AT&T’s strategic collaboration, customers can unlock low-latency enterprise applications well beyond the traditional network and create the smart cities, roadways, and skyways of the future.

“The power of 5G is about more than just speed. It’s about harnessing ultra-fast and ultra-responsive connectivity to distributed cloud technology for entirely new experiences. As compute expands beyond centralized systems and out to the edge of the 5G network, companies and consumers now essentially have supercomputer capabilities in the air around them. From lightweight virtual reality interfaces that can be used by anyone from gamers to first responders, to hyper-precise location tools for industrial applications and warehousing, the edge is transformative. Our deep collaboration with Microsoft is designed to help customers make that leap and start creating the future.”—Andre Fuetsch, Chief Technology Officer, AT&T Network Services

We invite organizations of all sizes and from every segment to create joint experiments that unlock the capabilities enabled by Azure services at the Edge, connected through AT&T’s 5G network. From using drones over 5G to support public safety and traffic management efforts, to remote patient care, in-car autonomous safety response, and high-performance mobile gaming, the possibilities are endless.

Source: microsoft.com

Thursday, 18 November 2021

Bringing commercial innovations in chip design to national security

Semiconductors and microelectronics are some of the most important components in building cutting-edge capabilities for our national security and defense technologies, from satellites and radar to vehicles and communications equipment. Ensuring these components are developed with the utmost regard for security is a critical, yet challenging task. Historically, the security requirements associated with developing microelectronics have limited the U.S. Department of Defense’s (DoD) ability to leverage the latest innovations.

Through a recent DoD-sponsored project, Rapid Assured Microelectronics Prototypes (RAMP) using Advanced Commercial Capabilities, the goal is to leverage commercial best practices to help accelerate the development process and bring reliable, secure state-of-the-art microelectronic design and manufacturing to national security and defense applications. The DoD recently announced it has selected Microsoft to support the second phase of this project.

This project builds on a 40-year history of collaboration between Microsoft and the U.S. DoD, to bring commercial innovation to the national security community. Microsoft previously led a coalition of partners in collaborating with the DoD on the first phase of this initiative: to develop design capabilities that achieve the department’s mission priorities. In this second phase, Microsoft and our partners will build on these successful designs and begin to develop custom integrated chips and System on a Chip (SoC) designs using a secure, collaborative design flow that provides access to advanced manufacturing processes. These new designs will achieve lower power consumption, improved performance, reduced physical size, and improved reliability for application in DoD systems.

Microsoft has engaged microelectronics industry leaders across the commercial and defense industrial base (DIB) to develop this phase of the RAMP project. Collaborators include Ansys, Applied Materials, Inc., BAE Systems, Battelle Memorial Institute, Cadence Design Systems, Cliosoft, Inc., Flex Logix, GlobalFoundries, Intel Federal, Raytheon Intelligence and Space, Siemens EDA, Synopsys, Inc., Tortuga Logic, and Zero ASIC Corporation.

The RAMP solution will provide an advanced microelectronics development platform for mission-critical applications, with cloud, AI, and machine learning-enabled automation, security, and quantifiable assurance. This solution will be hosted in Azure Government, offering the broadest range of commercial innovation for governments with services available across all U.S. data classifications.

RAMP is a critical initiative that will enable the DoD to leverage a secure, scalable microelectronic supply chain, while also ensuring the design and manufacturing meets its stringent security and compliance requirements. By leveraging cloud-based secure design capabilities, RAMP will expand the number of foundries available to DoD, enhance resiliency, and foster growth of the domestic semiconductor supply chain. The success of RAMP will also enable the department to be more agile with technology developments, quickly adapt to evolving needs, and adopt the latest technological capabilities. We look forward to continuing our work with the DoD and our industry partners to deliver groundbreaking, transformative solutions to secure the microelectronic supply chain.

Source: microsoft.com

Tuesday, 16 November 2021

Learn how Microsoft Azure is accelerating hardware innovations for a sustainable future


This year’s theme for OCP Global Summit is Open Possibilities, and in the Microsoft Azure team, we wholeheartedly agree—we have been part of tremendous creativity and resiliency as the community continued to collaborate, despite the challenges posed by the pandemic. This year, we are excited to showcase some of our projects and technology at OCP Global Summit and share our learnings on the path of building a more reliable, trusted, and sustainable cloud alongside industry partners and the open source hardware ecosystem.

Accelerating open source innovation over the years

As OCP celebrates its 10-year anniversary, Microsoft is proud to have been one of the major contributors in the journey of reimagining datacenter designs and technology with the vibrant open source hardware community. We joined OCP in 2014, and over the years have shared the designs of the very servers that power Microsoft Azure (Open Cloud Server, OCS). We continue to be inspired by the innovations and technology advancements made possible by industry partners and developers coming together as a community. We are glad to see Project Olympus, our open-sourced modular server architecture announced in 2016, achieving industry adoption and serving as the standard hardware design for over 60 datacenter regions around the world. In collaboration with the OCP community, Project Cerberus was introduced as a new industry standard for platform security in 2017, equipping datacenter designs with the critical components for robust security, firmware integrity, and a hardware root of trust; it is deployed in Microsoft’s datacenters today. In 2020 we leaned in to innovate further on these datacenter foundations and announced the Modular Building Block Architecture (MBA) initiative, which provides interoperability for security, AI, and more.

Liquid cooling—from standards to production deployment to better chip performance

From early discussions on standardization of cold plates and liquid immersion cooling with OCP and partners, Microsoft reached a milestone by deploying the world’s first two-phase liquid immersion cooling in a production public cloud environment in early 2021. We are excited to share from our latest research that liquid cooling accentuates the benefits of overclocking, in which chip components are operated beyond their pre-defined voltage, thermal, and power design limits to further improve performance. Our tests showed that with liquid cooling, the performance of some chipsets can increase by up to 20 percent. As liquid cooling unleashes new possibilities with fewer constraints and resources required for chip, server, and datacenter architecture, we look forward to engaging with the industry to not only advance the cooling technology but also ensure safe and ergonomic operation of the immersion tanks.

Sustainably and responsibly meeting the computing demands of the future

Increasing demand for cloud computing leads to increasing demand for physical datacenters, so it’s critical that Microsoft develop and operate these sites sustainably. As Microsoft has committed to being carbon negative, water positive, and zero waste by 2030, we’ve created first-of-their-kind Microsoft Circular Centers, which will help us extend the lifecycle of servers and reuse them to reduce waste. We’re also working with the OCP community to collaborate on data-gathering templates and contribute guidance on Life Cycle Assessments (LCA) as a method to understand and evaluate the environmental impact of our server hardware. As we expand the use of LCA and apply it to assess the impacts of bleeding-edge technology, it is important to align on relevant metrics for measurement and accompanying supplier considerations across every stage of the hardware value chain. Microsoft will continue to actively work with the OCP community to define and further the sustainability agenda for the future of datacenters and to preserve our only planet.

Continuing to evolve and deliver cloud hardware innovation for scale

We continue to believe that collaboration through open source, community engagement, and industry partnership makes technology advancement more rigorous, faster, and better for the world. Together with OCP, we enable innovation, efficiency, scalability, and sustainability for the datacenter infrastructures of the future. We invite you to come visit Microsoft’s booth at OCP Global Summit 2021 and check out the Datacenter-ready Secure Control Module (DC-SCM) for modular server management, security, and control, as well as our two-phase liquid immersion cooling tank, and the newest addition to Project Olympus, a Modular 3U Chassis, designed for higher power devices and flexible connectivity. At OCP Global Summit’s Experience Center, you will also find us showcasing the latest 15mm E1.s form factor, which was led by Microsoft and developed in partnership with key industry players, for high-performance, high-density SSDs for next-generation servers and storage platforms.

Source: microsoft.com

Monday, 15 November 2021

Complement your Microsoft AZ-104 Exam Prep with Practice Tests

You may often have wondered why IT professionals work so hard to pass Microsoft certification exams. What’s so good about these certifications? Beyond curiosity about the most advanced cloud technologies and the desire to learn more about them, there are other reasons behind this. One of the most coveted Microsoft certifications is the Microsoft Certified: Azure Administrator Associate certification. You can obtain it by passing a single exam, which has the code AZ-104.

When you achieve this certification, you become an Azure Administrator, a role with a distinct place in organizations across the globe. In other words, such a professional is responsible for deploying and managing an organization’s cloud infrastructure, working with a team of proficient professionals and engineers who understand how to use cloud computing services to improve business processes and boost performance.

AZ-104 Exam Information

Let’s dive into the AZ-104 exam details. If you want to get through the Microsoft AZ-104 exam, you should be clear about its prerequisites and the skills it evaluates. What affects your eligibility for AZ-104? Your chances of passing on the first attempt improve significantly if you have at least six months of hands-on experience with cloud technologies, since practice helps you absorb new concepts thoroughly. In general, this Microsoft exam helps applicants acquire the following skills:

  • Manage Azure identities and governance;
  • Implement and manage various storage solutions;
  • Deploy and manage Azure compute resources;
  • Configure and manage virtual networking solutions.

Microsoft Azure Administrator AZ-104 Exam: Preparation Journey

Microsoft helps you pass its exams by equipping you with a complete preparation guide and other study resources. Here is a look at some of the resources you can use to prepare:

1. Enroll in an online training

Several online platforms offer Microsoft AZ-104 training, and a simple Google search will reveal most of them. Make sure you review each website’s content before enrolling in a training course. The best option is a course offered by Microsoft itself.

2. Use Study Guides

Busy applicants can get Microsoft AZ-104 study guides from trustworthy publishers. Their advantage is that you can read them whenever you find the time.

For example, you can choose to spend a few minutes of your lunch break reading the AZ-104 study guide rather than chatting with others.

3. Engage in Online Communities

Seek out Microsoft online study communities and participate in them actively. Even though these forums may not offer structured learning, they can provide useful information and point you in the right direction.

4. Take AZ-104 Practice Tests

Just studying is not enough. You need to take some practice tests to get an idea of what the Microsoft AZ-104 exam looks like. Exam dumps can also help you with that.

What Are the Benefits of Passing the Microsoft AZ-104 Certification Exam?

Now that you know all the essential information about the Microsoft AZ-104 exam, you might want to know how this certification can benefit your career. Here are a few of the benefits this qualifying exam can deliver:

1. It Proves Your Knowledge and Skills

The Microsoft AZ-104 exam works as solid proof of your skills, one that many organizations trust. Once you pass the certification exam, you can prove to current and potential employers that you have solid expertise in administering Microsoft Azure and the proficiency to improve business processes using its latest tools.

2. Improve Your Career Development

Passing the Microsoft AZ-104 exam will greatly improve your odds of being hired. This is because the Microsoft Certified: Azure Administrator Associate certification is highly respected by hiring managers, as it is proof of productivity and dedication to professional development.

3. Microsoft is an Internationally Accepted Vendor

Microsoft has a strong reputation in the job market. Both hiring managers and professionals know that not everyone manages to crack the Microsoft AZ-104 exam. So, passing this exam will help you stand out from the crowd of non-certified peers and make you a preferred applicant during your job interview.

4. Passing the AZ-104 Exam Keeps You Acquainted With the Industry’s Latest Trends

The Microsoft AZ-104 certification exam covers modern concepts that you will be able to apply right away after finishing your preparation. Thus, you will become a worthy member of your team.

Conclusion

Passing the Microsoft AZ-104 exam is the best path to follow if you want to deepen your Azure administration skills. Other than opening the doors of international organizations, it will help you gain more confidence in your skills and become more productive in your everyday tasks.

Tuesday, 9 November 2021

Protect workloads with inline DDoS protection from Gateway Load Balancer partners

DDoS attacks are rapidly evolving in complexity and frequency. As we highlighted in our 2021 Q1 and Q2 DDoS attack trends review, we see that attacks in Azure have been trending toward shorter durations, mostly short-burst attacks. Workloads that are highly sensitive to latency, such as those in the multiplayer online gaming industry, cannot tolerate such short burst DDoS attacks, which can cause outages ranging from two to 10 seconds that result in availability disruption.

Today, we are announcing the preview of inline DDoS protection which will be offered through partner network virtual appliances (NVAs) that are deployed with Azure Gateway Load Balancer and integrated with Azure DDoS Protection Standard in all Azure regions. Inline DDoS protection mitigates even short-burst low-volume DDoS attacks instantaneously without impacting the availability or performance of highly latency-sensitive applications.

Azure DDoS Protection Standard is the recommended product to protect your resources against L3/4 attacks in Azure. Third-party inline L7 DDoS protection, combined with Azure DDoS Protection Standard, provides comprehensive L3 to L7 protection against volumetric as well as low-volume DDoS attacks. Azure customers using third-party DDoS protection services for inline mitigation now have the option to use the marketplace offering along with Azure DDoS Protection Standard. This solution enables comprehensive inline L7 DDoS protection for high performance and high availability scenarios using different providers.

Gateway Load Balancer enables the protection of such workloads by ensuring the relevant NVAs are injected into the ingress path of the internet traffic. Once chained to a Standard Public Load Balancer frontend or IP configuration on a virtual machine, no additional configuration is needed to ensure traffic to and from the application endpoint is sent to the Gateway Load Balancer.

Easily deploy inline DDoS protection with partner network virtual appliances

Deploying an inline DDoS NVA can be done in a few easy steps:

1. Find your virtual appliance in Azure Marketplace.

2. Deploy the NVA instances.

3. Create a Gateway Load Balancer and place the NVA instances in the backend pool.

4. Chain the Gateway Load Balancer to your public IP or Standard Load Balancer frontend.

Gateway Load Balancer provides a transparent flow (bump in the wire) using a low-latency overlay network, preserving the health of the host as well as the NVAs during DDoS attacks.


Inbound traffic is always inspected by the NVAs in the path, and the clean traffic is returned to the backend infrastructure (game servers).

Traffic flows from the consumer virtual network to the provider virtual network and then returns to the consumer virtual network. The consumer virtual network and provider virtual network can be in different subscriptions, tenants, or regions enabling greater flexibility and ease of management.

Enabling Azure DDoS Protection Standard on the VNET of the Standard Public Gateway Load Balancer frontend or VNET of the virtual machine will offer protection from L3/4 DDoS attacks.

1. Unfiltered game traffic from the internet is directed to the public IP of the game servers’ Gateway Load Balancer.
2. Unfiltered game traffic is redirected to the chained Gateway Load Balancer private IP.
3. The unfiltered game traffic is inspected for DDoS attacks in real time by the partner NVAs.
4. Filtered game traffic is sent back to the game servers for final processing.
5. Azure DDoS Protection Standard on the game servers’ Gateway Load Balancer protects against L3/4 DDoS attacks, and the DDoS protection policies are automatically tuned to the game servers’ traffic profile and application scale.
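The five-step flow above can be sketched as a minimal simulation. Everything here is illustrative: `nva_inspect`, `gateway_lb_chain`, and the packet-rate threshold are hypothetical stand-ins for the partner NVA logic and the Gateway Load Balancer chaining, not an Azure API.

```python
def nva_inspect(packets, max_pps=1000):
    """Partner NVA sketch: drop flood traffic above a per-source rate, pass the rest."""
    clean = []
    seen_per_src = {}
    for pkt in packets:
        seen_per_src[pkt["src"]] = seen_per_src.get(pkt["src"], 0) + 1
        if seen_per_src[pkt["src"]] <= max_pps:
            clean.append(pkt)  # filtered traffic is returned toward the game servers
    return clean

def gateway_lb_chain(packets, nva, backend):
    """Gateway LB sketch: transparently insert the NVA into the ingress path."""
    return [backend(pkt) for pkt in nva(packets)]

# A legitimate player sends 3 packets; an attacker floods 5,000.
traffic = [{"src": "player"}] * 3 + [{"src": "attacker"}] * 5000
served = gateway_lb_chain(traffic,
                          lambda p: nva_inspect(p, max_pps=1000),
                          lambda pkt: f"served:{pkt['src']}")
```

The point of the sketch is the topology: the backend only ever sees traffic that has already passed through the inspection hop, which is exactly what chaining the Gateway Load Balancer guarantees without any route configuration on the application side.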

Source: microsoft.com

Sunday, 7 November 2021

Key foundations for protecting your data with Azure confidential computing


The exponential growth of datasets has resulted in growing scrutiny of how data is exposed—both from a consumer data privacy and compliance perspective. In this context, confidential computing becomes an important tool to help organizations meet their privacy and security needs surrounding business and consumer data.

Confidential computing technology encrypts data in memory and only processes it once the cloud environment is verified, preventing data access from cloud operators, malicious admins, and privileged software such as the hypervisor. It helps keep data protected throughout its lifecycle—in addition to existing solutions of protecting data at rest and in transit, data is now protected while in use.

Thanks to confidential computing, organizations across the world can now unlock opportunities that were not possible before. For example, they can now benefit from multi-party data analytics and machine learning that combine datasets from parties that would have been unwilling or unable to share them, keeping data private across participants. In fact, RBC created a platform for privacy-preserving analytics for customers to opt-in for more optimized discounts. The platform generates insights into consumer purchasing preferences by confidentially combining RBC’s credit and debit card transactions with retailer data of what specific items consumers purchased.
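The RBC scenario illustrates a general pattern: parties join datasets on blinded identifiers so raw customer IDs are never revealed to each other. The toy sketch below is purely illustrative; a real confidential computing deployment performs the join on plaintext inside a hardware-protected enclave rather than relying on a shared salt, and the names and data here are made up.

```python
import hashlib

def blind(value, salt):
    """Each party hashes identifiers with a shared salt, so raw IDs never leave them."""
    return hashlib.sha256((salt + value).encode()).hexdigest()

SALT = "shared-secret"  # hypothetical value agreed inside the trusted environment

# Bank side: card spend keyed by customer ID (toy data).
bank = {blind("cust-1", SALT): 120.0, blind("cust-2", SALT): 80.0}
# Retailer side: items purchased, keyed the same way.
retailer = {blind("cust-1", SALT): "groceries", blind("cust-3", SALT): "fuel"}

# Joint insight is computed only on the blinded intersection;
# customers present in only one dataset are never exposed to the other party.
insights = {k: (bank[k], retailer[k]) for k in bank.keys() & retailer.keys()}
```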

Industry leadership and standardization

Microsoft has long been a thought leader in the field of confidential computing. Azure introduced “confidential computing” in the cloud when we became the first cloud provider to offer confidential computing virtual machines and confidential containers support in Kubernetes for customers to run their most sensitive workloads inside Trusted Execution Environments (TEEs). Microsoft is also a founding member of the Confidential Computing Consortium (CCC), a group that brings together hardware manufacturers, cloud providers, and solution vendors to jointly work on ways to improve and standardize data protection across the tech industry.

Confidential computing foundations

Our bar for confidentiality aligns with and extends the bar set by the CCC to provide a comprehensive foundation for confidential computing. We strive to provide customers the technical controls to isolate data from Microsoft operators, their own operators, or both. In Azure, we have confidential computing offerings that go beyond hypervisor isolation between customer tenants to help protect customer data from access by Microsoft operators. We also have confidential computing with secure enclaves to additionally help prevent access from customer operators.

Our foundation for confidential computing includes:

◉ Hardware root-of-trust to ensure data is protected and anchored into the silicon. Trust is rooted to the hardware manufacturer, so even Microsoft operators cannot modify the hardware configurations.

◉ Remote attestation for customers to directly verify the integrity of the environment. Customers can verify that both the hardware and software on which their workloads run are approved versions and secured before allowing them to access data.

◉ Trusted launch is the mechanism that ensures virtual machines boot with authorized software, using remote attestation so that customers can verify it. It’s available for all VMs, including confidential VMs, bringing Secure Boot and vTPMs to add defense against rootkits, bootkits, and malicious firmware.

◉ Memory isolation and encryption to ensure data is protected while processing. Azure offers memory isolation by VM, container, or application to meet the various needs of customers, and hardware-based encryption to prevent unauthorized viewing of data, even with physical access in the datacenter.

◉ Secure key management to ensure that keys stay encrypted during their lifecycle and release only to the authorized code.
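The attestation and key-management foundations above combine into a secure key release pattern: the key custodian verifies the environment's measurement before handing over a decryption key. The sketch below is purely conceptual; `EXPECTED_MEASUREMENT`, `attest`, and `release_key` are hypothetical names, and a production flow involves a hardware-signed quote verified by an attestation service, not a string comparison.

```python
EXPECTED_MEASUREMENT = "sha256:abc123"   # approved measurement of the enclave code

def attest(environment):
    """Verify the environment reports an approved measurement before trusting it."""
    return environment.get("measurement") == EXPECTED_MEASUREMENT

def release_key(environment, key="data-encryption-key"):
    """Secure key management: release the key only to verified, authorized code."""
    if not attest(environment):
        raise PermissionError("attestation failed: key withheld")
    return key

good = {"measurement": "sha256:abc123"}    # unmodified, approved environment
bad = {"measurement": "sha256:tampered"}   # modified software stack
```

The design point is that trust flows from the hardware-rooted measurement, not from the identity of the operator: neither a cloud operator nor a customer admin can obtain the key without running the approved code.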

The above components together form the foundations for what we consider to be confidential computing. And today, Azure has more confidential compute options spanning hardware and software than any other cloud vendor.

Innovative new hardware

Our new Intel-based DCsv3 confidential VMs include Intel SGX, which implements hardware-protected application enclaves. Developers can use SGX enclaves to reduce to a minimum the amount of code that has access to sensitive data. Additionally, we will enable Total Memory Encryption-Multi-Key (TME-MK) so that each VM can be secured with a unique hardware key.

Our new AMD-based DCasv5/ECasv5 confidential VMs provide Secure Encrypted Virtualization-Secure Nested Paging (SEV-SNP), delivering hardware-isolated virtual machines that protect data from other virtual machines, the hypervisor, and host management code. Customers can lift and shift existing virtual machines without changing code, and can optionally leverage enhanced disk encryption with keys they manage or that Microsoft manages.

To support containerized workloads, we are making all of our confidential VMs available in Azure Kubernetes Service (AKS) as a worker node option. Customers can now protect their containers with Intel SGX or AMD SEV-SNP technologies.

Azure’s memory encryption and isolation options provide stronger and more comprehensive protections for customer data than any other cloud.

Customer successes across industries

Many organizations are already leveraging the great data privacy and security benefits of Azure confidential computing.

Secure AI Labs have been using a private preview of our AMD-based virtual machines to create a platform where healthcare researchers can more easily collaborate with healthcare providers to advance research. Luis Huapaya, VP of Engineering at Secure AI Labs Inc, mentions, “Because of Azure confidential computing, Secure AI Labs can realize all of the benefits of running in Azure without ever sacrificing on security. One could argue that running a virtual payload within Azure confidential computing might be more secure than running in a private server on-premise. It also offers remote attestation, a pivotal security feature because it provides a virtual payload the ability to give cryptographic proof of its identity and verify its running inside an enclave. Azure confidential computing with AMD SEV-SNP makes our job a lot easier.”

While regulated industries have been the early adopters due to compliance needs and highly sensitive data, we are seeing growing interest across industries, from manufacturing to retail and energy, for example.

Signal Messenger, a worldwide messaging app known for high security and privacy, leverages Azure confidential computing with Intel SGX to protect customer data, such as contact info. Jim O’Leary, VP of Engineering at Signal says, “To meet the security and privacy expectations of millions of people every day, we utilize Azure confidential computing to provide scalable, secure environments for our services. Signal puts users first, and Azure helps us stay at the forefront of data protection with confidential computing.”

We are excited to see organizations bring more workloads to Azure with confidence in the data protection of Azure confidential computing to meet the privacy needs of their customers.

Azure confidential cloud

Azure is the world’s computer from cloud to edge. Customers of all sizes, across all industries, want to innovate, build, and securely operate their applications across multi-cloud, on-premises, and edge. Just as HTTPS has become pervasive for protecting data during internet web browsing, here at Azure, we believe that confidential computing will be a necessary ingredient for all computing infrastructure. 

Our vision is to transform the Azure cloud into the Azure confidential cloud, moving from computing in the clear to computing confidentially across the cloud and edge. We want to empower customers to achieve the highest levels of privacy and security for all their workloads.

Alongside our $20 billion investment over the next five years in advancing our security solutions, we will partner with hardware vendors and innovate within Microsoft to bring the highest levels of data security and privacy to our customers. In our journey to become the world’s leading confidential cloud, we will drive confidential computing innovations horizontally across our Azure infrastructure and vertically through all the Microsoft services that run on Azure.

Source: microsoft.com

Saturday, 6 November 2021

3 ways to enhance your cloud journey: break, prepare, and optimize your apps in Azure


At our Microsoft Ignite conference this week, we’re sharing new ways to continue to enhance your cloud adoption journey. First, we are excited to announce the preview of Azure Chaos Studio—a fully managed experimentation service to help you track, measure, and mitigate faults with controlled chaos engineering, systemically improving the resilience of your cloud applications. Second, we continue to build a better, easier, and more comprehensive approach to guide your cloud environment preparation, with Azure landing zone—an environment for hosting your workloads, pre-provisioned through code—without losing sight of the application to be deployed. And third, to maximize your organization's investment in the cloud, we have fresh guidance on cloud economics to build your cloud business case with key financial and technical guidance from Azure.

1. Azure Chaos Studio: Track, measure, and mitigate faults in your applications 

Are you confident in knowing how your mission-critical Azure apps will respond to different failure scenarios? Today we’re announcing the preview of Azure Chaos Studio, which enables customers to model and deliberately introduce faults that simulate real-world outages and validate how applications perform under those scenarios. These are the same tools used by Microsoft engineers to improve the resilience of our cloud services. Chaos Studio lets you experiment on your own terms and see how your apps respond to real-world disruptions. Gain chaos engineering insights and accelerate the discovery of hard-to-find issues, from late-stage development through to production, all within Azure.

With Azure Chaos Studio you can:

◉ Spend more time learning about application resiliency, without managing tools and scripts.

◉ Get started quickly with agent-based faults that disrupt within resources, and service-based faults that disrupt resources at the control plane.

◉ Validate product quality wherever and whenever you want with CI/CD pipeline integration, using both ad-hoc experiments and automated pipelines to monitor production quality through continuous validation.

We continue to build a comprehensive and continuously expanding library of fault types, including resource pressure faults for CPU, memory, and disk; network faults that add latency or block access to resources; and service-specific disruptions such as Cosmos DB failover, VM shutdown, and Redis restart. These help you and your team understand how your application would perform in such scenarios and implement a cohesive strategy for making the informed decisions that improve its reliability.
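The inject-and-validate loop at the heart of a chaos experiment can be sketched in a few lines. This is a conceptual sketch rather than the Chaos Studio API: `chaos_experiment`, `resilient_call`, and the latency numbers are hypothetical, standing in for an injected latency fault and the application under test.

```python
def call_service(extra_latency_ms=0):
    """Toy dependency call: baseline latency plus any injected fault latency (ms)."""
    return 20 + extra_latency_ms

def resilient_call(fault_latency_ms, timeout_ms=100, retries=2):
    """App under test: time out slow calls, retry, then fall back to a cached response."""
    for _ in range(retries + 1):
        latency = call_service(fault_latency_ms)
        if latency <= timeout_ms:
            return "fresh"
    return "cached"  # degraded but still available

def chaos_experiment(inject_ms):
    """Inject a latency fault and observe whether the app still answers (the hypothesis)."""
    return resilient_call(fault_latency_ms=inject_ms)
```

With no fault injected the app serves fresh responses; with a large injected latency it degrades to cached responses instead of failing outright, which is the kind of resilience hypothesis a chaos experiment is designed to confirm or refute.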

2. Azure landing zone: Eyes on the target end-state

As you approach every new cloud project, preparing your environment for cloud application is critical for your success. Whether you are creating a brand-new application or modernizing an existing one, adopting cloud technologies is a journey—where you're traveling towards a target end-state. Azure landing zone conceptual architecture is the master architectural plan, based on Microsoft best practices and reference implementations described in the Microsoft Cloud Adoption Framework for Azure. Its scaled-out controls enable you to innovate safely, in alignment with the cloud operating model that works best for your organization, and progress through different stages, arriving at your destination at your own pace.

If the target Azure landing zone conceptual architecture fits the operating model and resource structure you're already planning to use, we can help you deploy it with a ready-made deployment experience. To help you quickly prepare your cloud environments for any project (migration to the cloud, modernization of existing apps, or building cloud-native apps on Azure) with proper scaled-out governance and security, we have developed Azure landing zone accelerators. These are Azure portal-based templates that deploy a pre-configured environment with key components in place, such as management groups and policies, providing a full implementation of the conceptual architecture based on standards and best practices. Simply point and click through the template, answering a few environment architecture questions to customize it to your needs, and deploy your environment automatically and quickly from the Azure Portal.

3. Cloud economics: Make your cloud investment count

“More than 70 percent of customers have cost optimization as their top cloud initiative.” Through research and many customer interactions, we have developed new cost management approaches to guide your business strategy and goal definition. Our updated guidance helps you understand the cloud economics of your investments, and now includes financial tools and templates to build your business case and optimize your cloud investment.

We have also created a Power BI template, backed by an Excel model, that can help you forecast your costs early on as you’re planning your cloud journey. Use the Power BI template and on-premises compute unit list to understand how you can take advantage of Azure offers and benefits like Azure Hybrid Benefit and reserved instances to save against pay-as-you-go pricing and optimize your costs.
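The arithmetic behind such a forecast can be sketched simply. The hourly rate and the 40 percent discount below are made-up illustrations, not actual Azure prices, Azure Hybrid Benefit figures, or reserved-instance terms; the real numbers come from the Azure pricing pages and the Power BI template mentioned above.

```python
HOURS_PER_MONTH = 730  # common approximation used in cloud cost estimates

def pay_as_you_go(vms, rate_per_hour, months):
    """Cost of running always-on VMs at the on-demand hourly rate."""
    return vms * rate_per_hour * HOURS_PER_MONTH * months

def reserved(vms, rate_per_hour, months, discount=0.40):
    """Reserved instances trade a term commitment for a discounted hourly rate
    (the 40% figure here is purely illustrative)."""
    return pay_as_you_go(vms, rate_per_hour, months) * (1 - discount)

paygo = pay_as_you_go(vms=10, rate_per_hour=0.10, months=12)
ri = reserved(vms=10, rate_per_hour=0.10, months=12)
savings = paygo - ri
```

Even a back-of-the-envelope model like this makes the trade-off concrete: steady, predictable workloads favor commitments, while spiky workloads may be cheaper on pay-as-you-go.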

Check out this new video guidance to learn some lesser-known tips for estimating and managing your cloud spend. From the Azure Cost Management portal, to Power BI for custom views and reporting, to API-based integration that feeds cost data into your existing financial systems, we'll help you keep track of your cloud spend and optimize your investment in the cloud.

Source: microsoft.com

Thursday, 4 November 2021

Transform your business with Microsoft's unrivaled end-to-end data platform


Businesses today are experiencing the most significant change in our economy in a generation, and the opportunity to realize the full potential of the cloud through digital transformation has never been greater. There are three core capabilities businesses need, which together drive successful transformation: limitless database scale and performance, unmatched analytics and insights, and unified data governance.

Today we shared a number of announcements to help organizations evolve rapidly even in the face of sustained uncertainty. Across industries, companies are investing in these core capabilities to ensure they have the flexibility to innovate anywhere and meet evolving business needs.

Limitless database scale and performance

We are announcing the preview of SQL Server 2022, the most Azure-enabled SQL Server release yet. It enables easier cloud integration than ever before, with new disaster recovery functionality through Azure SQL Managed Instance and seamless analytics over on-premises operational data with Azure Synapse Link for SQL Server. Microsoft continues to invest in our Azure SQL family of databases, providing flexible options for app migration, modernization, and development. Azure SQL Managed Instance also has several key announcements, including new offers with significantly more memory per vCore and storage increased to 16 TB in both the General Purpose and Business Critical service tiers, providing more room for app growth.

Azure is also the best cloud for NoSQL and open-source databases. Azure Cosmos DB is a fully managed NoSQL database built to support production applications at any size or scale. Financially backed service-level agreements (SLAs) guaranteeing consistency, high throughput, less-than-10 ms latency for reads and writes, and 99.999 percent availability make it simple for developers to independently and elastically scale applications across any Azure region. Today we are announcing the general availability of Azure Managed Instance for Apache Cassandra, providing the ability to provision managed native Apache Cassandra clusters with automated deployment and scaling operations, accelerating hybrid scenarios and reducing ongoing maintenance. Flexible Server, a new deployment option for Azure Database for MySQL, is also now generally available. With Flexible Server, customers benefit from a fully managed service designed to provide maximum control over their databases, high-availability options to ensure zero data loss, built-in capabilities for cost optimization, and increased productivity enabled by the Azure ecosystem.
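Elastic scale in databases like Azure Cosmos DB rests on horizontal partitioning: each item is routed by a hash of its partition key, so throughput and storage grow by adding physical partitions. The sketch below shows the routing idea in its simplest form; it is a conceptual illustration, not Cosmos DB's actual hashing scheme, which hashes keys into a range space that can be split as partitions grow:

```python
import hashlib

def route(partition_key: str, physical_partitions: int) -> int:
    """Map a partition key to a physical partition by stable hashing.

    Illustrative only: the real Cosmos DB scheme splits a hash-range
    space as capacity is added, avoiding wholesale data reshuffling.
    """
    digest = hashlib.sha256(partition_key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % physical_partitions

# Items with the same key always land on the same partition, keeping
# per-key reads and writes local regardless of total cluster size.
keys = ["user-1", "user-2", "user-3", "user-1"]
placements = [route(k, 4) for k in keys]
print(placements)
```

Because routing is a pure function of the key, any node can locate an item without a central lookup, which is what lets such systems scale reads and writes independently and elastically.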

Unmatched analytics and insights

When it comes to analytics and insights, Azure is simply unmatched. When we debuted Azure Synapse Analytics, we effectively removed the barriers between enterprise data warehousing and big data analytics to enable data professionals to collaborate, build, and manage their analytic solutions with ease. Since its launch, the number of Azure customers running petabyte-scale workloads has increased fivefold. 

But we didn’t stop there. We introduced Azure Synapse Link for Azure Cosmos DB last year, creating a simple, low-cost, cloud-native HTAP implementation that enables immediate, in-the-moment business insights. We continue those investments today with the general availability of Azure Synapse Link for Dataverse, enabling immediate insights over high-value Dynamics 365 data, and a preview of Azure Synapse Link for SQL Server 2022. We are also announcing that Azure Synapse Data Explorer is joining the existing SQL and Apache Spark analytical runtimes in Azure Synapse: a highly scalable engine that automatically indexes the structured, semi-structured, and free-text data commonly found in telemetry, time series, and logs, surfacing enriched insights that can directly improve business decisions.
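The core idea behind free-text indexing of logs can be shown with a tiny inverted index: every term maps to the lines that contain it, so queries intersect small sets instead of scanning every record. This is a conceptual sketch, not Azure Synapse Data Explorer's actual implementation:

```python
import re
from collections import defaultdict

def build_index(log_lines):
    """Map each lowercased term to the set of line numbers containing it."""
    index = defaultdict(set)
    for lineno, line in enumerate(log_lines):
        for term in re.findall(r"[a-z0-9]+", line.lower()):
            index[term].add(lineno)
    return index

logs = [
    "2021-11-04 ERROR timeout contacting billing service",
    "2021-11-04 INFO request completed in 12 ms",
    "2021-11-04 ERROR retry exhausted for billing service",
]
index = build_index(logs)

# Find lines mentioning both terms by intersecting their posting sets,
# without rescanning the raw log text.
hits = index["error"] & index["billing"]
print(sorted(hits))  # lines 0 and 2
```

At telemetry scale the same principle applies, with the engine maintaining such indexes automatically as data is ingested so that ad hoc text and time-series queries stay fast.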

Unified data governance

The need for a comprehensive data governance service has never been stronger. Azure Purview, our unified data governance service, helps businesses manage and govern their on-premises, multi-cloud, and software as a service (SaaS) data. It’s exciting to see the strong interest in Azure Purview since its general availability, with over 57 billion data assets discovered by customers already. Azure Purview ensures your Microsoft data estate is governed through deep integrations with data services in Azure, Microsoft 365, Microsoft Power Platform, and more. Every organization can now build a unified data governance solution to maximize the value of their data in the cloud.


Customers like London Heathrow Airport have already achieved tremendous success with Azure Purview. The airport’s data teams use lineage data in Azure Purview to find correlations between datasets, which is critical in determining how each airport service affects broader operations.

“We used to have a lot of data about things we generally already understood but now that we’ve adopted Azure Purview, we have insight into the unknown, meaning we can aggregate multiple data sources in a more user-friendly way to discover where we can create efficiencies and make better predictions.”—Dave Draffin, Azure Cloud and Data Architect, Heathrow Airport

Source: microsoft.com