Tuesday, 27 June 2023

Removing barriers to autonomous vehicle adoption with Microsoft Azure

Azure, Azure Exam, Azure Exam Prep, Azure Tutorial and Materials, Azure Career, Azure Skills, Azure Jobs


In the more than 150 years since the automotive industry was founded, it has never experienced innovation and transformational change as rapid as it is seeing today. Since the advent of the horseless carriage in the 1860s, vehicle manufacturers have continued to improve the quality, safety, speed, and comfort of the millions of vehicles sold around the world each year.

Today, however, all eyes are on autonomous vehicles as a cornerstone of future human mobility.

Exponential market growth expected


Over the past decade, the impact of emerging technologies such as AI, machine vision, and high-performance computing (HPC) has changed the face of the automotive industry. Today, nearly every car manufacturer in the world is exploring the potential and power of these technologies to usher in a new age of self-driving vehicles. Microsoft Azure HPC and Azure AI infrastructure are tools to help accomplish that.

Data suggests that the global autonomous vehicle market, based on cars with level two autonomy features, was worth USD 76 billion in 2020 but is expected to grow exponentially over the coming years, reaching over USD 2.1 trillion by 2030 as the level of autonomy in cars continues to increase.

The platformization of autonomous taxis also holds enormous potential for the broader adoption and usage of autonomous vehicles. Companies like Tesla, Waymo, NVIDIA, and Zoox are all investing in the emerging category of driverless transportation that leverages powerful AI and HPC capabilities to transform the concept of human mobility. However, several challenges still need to be overcome for autonomous vehicles to reach their potential and become the de facto option for car buyers, passengers, and commuters.

Common challenges persist


One of the most important challenges with autonomous vehicles is ethics. If the vehicle determines what action to take during a trip, how does it decide what holds the most value during an emergency? To illustrate, if an autonomous vehicle is traveling down a road and two pedestrians suddenly run across the road from opposite directions, what are the ethics underpinning whether the vehicle swerves to collide with one pedestrian instead of another?

Another of the top challenges with autonomous vehicles is that the AI algorithms underpinning the technology are continuously learning and evolving. Autonomous vehicle AI software relies heavily on deep neural networks, with a machine learning algorithm tracking on-road objects as well as road signs and traffic signals, allowing the vehicle to ‘see’ and respond to—for example, a red traffic light.

Where the tech still needs some refinement is with the more subtle cues that motorists are instinctually aware of. For example, a slightly raised hand by a pedestrian may indicate they are about to cross the road. A human will see and understand the cue far better than an AI algorithm does, at least for now.

Another challenge is whether there is sufficient technology and connectivity infrastructure for autonomous vehicles to deliver the full benefit of their value proposition to passengers, especially in developing countries. With car journeys from A to B evolving into experiences, people will likely want to interact with their cars based on their personal technology preferences, linked to tools from leading technology providers. In addition, autonomous vehicles will need to connect to the world around them to guarantee safety and comfort to their passengers.

As such, connectivity will be integral to the mass adoption of autonomous vehicles. The advent and growing adoption of 5G may improve connectivity and enable communication between autonomous vehicles, which could enhance their safety and functioning.

Road safety is not the only concern with autonomous vehicles. Autonomous vehicles will be designed to be hyper-connected, almost like an ultra-high-tech network of smartphones on wheels. However, an autonomous vehicle must be precisely that—standalone autonomous. If connectivity is lost, the autonomous vehicle must still be able to operate fully autonomously.

That being said, cyberattacks pose a greater risk to autonomous vehicles than to the legacy vehicles currently on the road. In the wake of a successful cyberattack, threat actors may gain access to sensitive personal information or even take control of key vehicle systems. Manufacturers and software providers will need to take every step necessary to protect their vehicles and systems from compromise.

Lastly, there are also social and cultural barriers to the mainstreaming of autonomous vehicles, with many people across the globe still very uncomfortable with the idea of giving up control of their cars to a machine. Once consumers can experience autonomous drives and see how the technology continuously monitors a complete 360-degree view around the vehicle and does not get drowsy or distracted, confidence that autonomous vehicles are safe and secure will grow, and adoption rates will rise.

The future of travel is (nearly) upon us


As the world moves closer to a future where autonomous vehicles are a ubiquitous presence on our roads, the complex challenges that must be addressed to make this a safe and viable option become ever more apparent. The adoption of autonomous vehicles is not simply a matter of developing the technology, but also requires a complete overhaul of how we approach transportation systems and infrastructure.

To tackle the many complex challenges posed by autonomous vehicle adoption, companies and researchers are investing heavily in solutions. For example, one way researchers are addressing the ethical challenge of autonomous vehicles making life-or-death decisions is by developing ethical frameworks that guide the decision-making processes of these vehicles.

These frameworks define the principles and values that should be considered when autonomous vehicles encounter ethical dilemmas, such as deciding between protecting the safety of passengers versus that of pedestrians. Such frameworks can help ensure that autonomous vehicles make ethical decisions that are consistent with societal values and moral principles.

Significant investments are also being made into updating existing infrastructure to accommodate autonomous vehicles. Roads, highways, and parking areas must be equipped with the necessary infrastructure to support autonomous vehicles, such as sensors, cameras, and communication systems.

Companies are also working collaboratively with regulators, researchers, and OEMs to develop policies that ensure that autonomous vehicles can operate safely alongside traditional vehicles. This includes considerations such as how traffic signals, road markings, and signage need to be adapted to support autonomous vehicles.

In 2021, for example, Microsoft teamed up with a market leading self-driving car innovator to unlock the potential of cloud computing for autonomous vehicles, leveraging Microsoft Azure to commercialize autonomous vehicle solutions at scale.

Another global automotive group also recently announced a collaboration with Microsoft to build a dedicated cloud-based platform for its autonomous car systems that are currently in development. This ties in with their ambitious plans to invest more than USD32 billion in the digitalization of the car by 2025.

NVIDIA is also taking bold steps to fuel the growth of the autonomous vehicle market. The NVIDIA DRIVE platform is a full-stack AI compute solution for the automotive industry, scaling from advanced driver-assistance systems for passenger vehicles to fully autonomous robotaxis. The end-to-end solution spans from the cloud to the car, enabling AI training and simulation in the data centre, in addition to running deep neural networks in the vehicle for safe and secure operations. The platform is being utilized by hundreds of companies in the industry, from leading automakers to new energy vehicle makers.

Key takeaways


There is little doubt that the future of human mobility is built upon the ground-breaking innovation and technological capabilities of autonomous vehicles. While some challenges still exist, the underlying technology continues to mature and improve, paving the way for an increase in the adoption of self-driving cars long term.

The technology may soon proliferate and displace other, less safe modes of transport, with huge potential upsides for many aspects of our daily lives, such as saving lives and reducing the number of accidents, decreasing commute times, optimizing traffic flow and patterns, thereby lessening congestion, and extending the freedom of mobility for all.

With vehicle manufacturers and software firms continuously iterating on autonomous vehicle technology, continuing to educate the public on their benefits and continuing to work with lawmakers to overcome regulatory hurdles, we may all soon enjoy a new world, one where technology gets us safely from one destination to another, leaving us free to simply enjoy the view.

Source: microsoft.com

Saturday, 24 June 2023

Deploy a holistic view of your workload with Azure Native Dynatrace Service

Microsoft and Dynatrace announced the general availability of Azure Native Dynatrace Service in August 2022. The native integration enables organizations to leverage Dynatrace as a part of their overall Microsoft Azure solution. Users can onboard easily to start monitoring their workloads by deploying and managing a Dynatrace resource on Azure.

Azure Native integration enables you to create a Dynatrace environment like you would create any other Azure resource. One of the key advantages of this integration is the ability to seamlessly ship logs and metrics to Dynatrace. By leveraging Dynatrace OneAgent, users can also gather deeper observability data from compute resources such as virtual machines and Azure App Services. This comprehensive data collection ensures that organizations have a holistic view of their Azure workloads and can proactively identify and resolve issues. 
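To make the "like any other Azure resource" point concrete, here is a minimal Python sketch of what a create-or-update request body for a Dynatrace monitor resource might look like. The property names are illustrative assumptions, not the service's exact schema:

```python
# Illustrative sketch of an ARM-style request body for a Dynatrace monitor
# resource on Azure. The property names below are assumptions for
# illustration only, not the service's exact schema.

def dynatrace_monitor_payload(location, plan, user_email):
    """Build a hypothetical create-or-update body for a Dynatrace resource."""
    return {
        "location": location,
        "properties": {
            "planData": {"plan": plan},                # assumed property name
            "userInfo": {"emailAddress": user_email},  # assumed property name
            "monitoringStatus": "Enabled",
        },
    }

payload = dynatrace_monitor_payload("eastus", "linking", "ops@contoso.com")
print(payload["properties"]["monitoringStatus"])  # Enabled
```

In practice you would submit a body like this through the Azure portal, an ARM template, or the Azure SDK rather than building it by hand.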

Furthermore, the integration unifies billing for Azure services, including Dynatrace. Users receive a single Azure bill that encompasses all the services consumed on the platform, providing a unified and convenient billing experience. 

Since its release, Dynatrace Service has seen continuous enhancements. In the following sections, we will explore some of the newer capabilities that have been added to further empower organizations in their monitoring and observability efforts. 

Automatic shipping of Azure Monitor platform metrics 


One of the significant advancements during the general availability of Azure Native Dynatrace Service was the automatic forwarding of logs from Azure Monitor to Dynatrace. The log forwarding capability allows you to configure and send Azure Monitor logs to Dynatrace. Logs start to flow to your Dynatrace environment as soon as the Dynatrace resource on Azure is deployed. The Azure experience allows you to view the summary of all the resources being monitored in your subscription. 

Building on this, we have now added another key improvement: the ability to automatically obtain metrics from the Azure Monitor platform. This enhancement enables users to view the metrics of various Azure services on the Dynatrace portal.

To enable metrics collection, customers can simply check a single checkbox on the Azure portal. This streamlined process makes it easy for organizations to start gathering valuable insights. For further customization, users have the option to specify tags to include or exclude specific resources for metric collection. This allows for a more targeted monitoring approach based on specific criteria.
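As a rough sketch of how include/exclude tag rules might behave, here is a small Python model. The rule representation and precedence (exclude wins over include; collect by default when no include rules are set) are assumptions for illustration; the service's actual rule syntax may differ:

```python
# Minimal sketch of include/exclude tag filtering for metric collection.
# Rules are (key, value) pairs; exclude rules take precedence, and with no
# include rules every resource is collected by default. These semantics are
# assumptions for illustration.

def should_collect(resource_tags, include_rules=None, exclude_rules=None):
    """Decide whether a resource's metrics should be collected."""
    tags = set(resource_tags.items())
    if exclude_rules and tags & set(exclude_rules):
        return False
    if include_rules:
        return bool(tags & set(include_rules))
    return True

vm_tags = {"env": "prod", "team": "payments"}
assert should_collect(vm_tags, include_rules={("env", "prod")})
assert not should_collect(vm_tags, exclude_rules={("team", "payments")})
```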

Azure Native Dynatrace Service, Azure Certification, Azure Guides, Azure Prep, Azure Preparation, Azure Tutorial and Materials

The setup of credentials required for the interaction between Dynatrace and Azure is automated, eliminating the need for manual configuration. Once the metrics are collected, users can conveniently view and analyze them on the Dynatrace portal, providing a comprehensive and centralized platform for monitoring and observability. 


Together with logs and metrics monitoring capabilities, Azure Native Dynatrace Service provides holistic monitoring of your Azure workloads. 

Native integration availability in new Azure regions 


During general availability, Azure Native Dynatrace Service was available in two regions, East US and West Europe. To cater to the growing demand, native integration is now available in additional regions: you can now create a Dynatrace resource in UAE North, Canada Central, and West US, bringing the total number of supported regions to five. You can select the region in the resource creation experience. When you select a region to provision a Dynatrace resource, the corresponding Dynatrace environment is provisioned in the same Azure region. This ensures that your data remains within the specified region, letting you leverage Dynatrace within that Azure region while complying with the specific data residency regulations and preferences of your organization.

Monitor activity with Azure Active Directory logs


In the realm of cloud business, early detection of security threats is crucial to safeguarding business operations. Azure Active Directory (Azure AD) activity logs—encompassing audit, sign-in, and provisioning logs—offer organizations essential visibility into the activities taking place within their Azure AD tenant. By monitoring these logs, organizations can gain insights into user and application activities, including user sign-in patterns, application changes, and risk activity detection. This level of visibility empowers organizations to respond swiftly and effectively to potential threats, enabling proactive security measures and minimizing the impact of security incidents on their operations. 

With Azure Native Dynatrace Service, you can route your Azure AD logs to Dynatrace by setting Dynatrace as a destination in Azure AD diagnostic settings.
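As a sketch of what that routing looks like as a diagnostic-settings payload, consider the following. The log category names match the audit, sign-in, and provisioning logs described above, while the destination property name and the resource ID are illustrative assumptions:

```python
# Sketch of a diagnostic-settings payload that routes Azure AD activity logs
# to a partner destination such as Dynatrace. The category names match the
# log types described above; the destination property name and the resource
# ID below are illustrative assumptions.

def aad_log_routing(partner_resource_id):
    return {
        "properties": {
            "marketplacePartnerId": partner_resource_id,  # assumed field name
            "logs": [
                {"category": c, "enabled": True}
                for c in ("AuditLogs", "SignInLogs", "ProvisioningLogs")
            ],
        }
    }

setting = aad_log_routing(
    "/subscriptions/sub-id/resourceGroups/rg/providers"
    "/Dynatrace.Observability/monitors/myMonitor"  # hypothetical resource ID
)
print([log["category"] for log in setting["properties"]["logs"]])
```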


Committed to collaboration and integration


The Azure Native integration for Dynatrace has simplified the process of gaining deep insights into workloads. This integration empowers organizations to optimize their resources, enhance application performance, and deliver high availability to their users. Microsoft and Dynatrace remain committed to collaborating and improving the integration to provide a seamless experience for their joint customers. By working together, both companies strive to continually enhance the monitoring and observability capabilities within the Azure ecosystem. 

The product is constantly evolving to deepen the integration, aiming to monitor a wide range of Azure workloads and uplift user convenience throughout the experience. 

Source: microsoft.com

Thursday, 22 June 2023

Azure Data Manager for Energy and LogScope: Enabling data integration within minutes

Revolutionizing access to OSDU Log Data with LogScope


Harvey Rock Physics (HRP) has revolutionized the way geoscientists and petrophysicists interact with well data by integrating their LogScope software platform, built on the Microsoft Cloud. This innovative solution provides seamless integration of borehole data into petrophysical and geological models, enabling users to view, filter, and deliver data with ease to an OSDU Data Platform instance such as Azure Data Manager for Energy.

Founded in 2011, HRP is a leader in petrophysical technologies, specializing in mobile solutions. The LogScope software is built on a modern code base that allows HRP to rapidly create modules to meet client needs. In this case, HRP created a module that allows efficient entry and graphical viewing of data in the OSDU Data Platform.

The OSDU Data Platform provides flexible storage that is application-independent, allowing users to store data and use multiple applications in complex workflows. LogScope provides seamless interrogation through a map view that enables borehole-related data to be spatially visualized and selected for further inspection in a single well or cross-section mode.

LogScope is designed to be readily accessible from the Microsoft Store, with versions available for Windows tablet computers as well as Apple iPad and iPhone products. The application provides tight integration of borehole data into petrophysical and geological models.

Azure Data Manager, Azure Tutorial and Materials, Azure Guides, Azure Prep, Azure Learning, Azure Certification, Azure Preparation Exam

Search and ingest data in minutes


Download the application from the Microsoft Store or Apple App Store, install it on your device, enter your connection parameters, and you are up and running in minutes. You can search and ingest data from your mobile device. New-generation end users expect installing and using applications to be as easy as downloading any app on their phones or tablets; legacy applications, by contrast, come from the old world of IT and offer a very different user experience. With LogScope, deploying and using earth science applications becomes as easy as any phone or tablet client.

One of the major advantages of this new tool is the ability to upload or ingest data directly from raw data file formats—typically LAS or DLIS—or directly from data inside LogScope. Large data volumes can be intuitively and quickly uploaded to the OSDU Data Platform.
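To give a flavor of the raw formats involved, here is a toy Python sketch that pulls curve names out of a simplified LAS (Log ASCII Standard) file. Production parsing handles many more edge cases, and DLIS is a binary format that requires dedicated libraries:

```python
# Toy sketch: extract curve mnemonics from the ~Curve section of a
# (simplified) LAS well-log file. Real ingestion tools handle wrapped data,
# null values, and many format variants beyond this.

SAMPLE_LAS = """~Curve Information
DEPT.M      : Depth
GR  .GAPI   : Gamma Ray
RHOB.G/C3   : Bulk Density
~ASCII
1670.0  45.2  2.55
1670.5  48.1  2.57
"""

def las_curve_names(text):
    curves, in_curves = [], False
    for line in text.splitlines():
        if line.startswith("~"):
            # Section headers start with "~"; "~C..." opens the curve section.
            in_curves = line.lower().startswith("~c")
            continue
        if in_curves and line.strip():
            # The mnemonic is the token before the first dot, e.g. "GR  .GAPI"
            curves.append(line.split(".")[0].strip())
    return curves

print(las_curve_names(SAMPLE_LAS))  # ['DEPT', 'GR', 'RHOB']
```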


Overall, HRP’s LogScope software platform, powered by the Microsoft Cloud, provides an efficient and user-friendly way for geoscientists and petrophysicists to interact with well data, with seamless integration of borehole data into petrophysical and geological models.

How to work with HRP solutions on Azure Data Manager for Energy


Azure Data Manager for Energy is an enterprise-grade, fully managed OSDU Data Platform for the energy industry that is efficient, standardized, easy to deploy, and scalable for data management: ingesting, aggregating, storing, searching, and retrieving data. The platform will provide the scale, security, privacy, and compliance expected by our enterprise customers. It offers out-of-the-box compatibility with HRP applications, which accelerates time-to-market and lets customers run their domain workflows with ease, and with minimal effort, on data contained in Azure Data Manager for Energy.

Source: microsoft.com

Tuesday, 20 June 2023

Mercedes-Benz enhances drivers’ experience with Azure OpenAI Service

AI + Machine Learning, Announcements, Azure AI, Azure OpenAI Service
With ChatGPT, MBUX Voice Assistant “Hey Mercedes” will become even more intuitive – the U.S. beta program is expected to last three months.

When I started driving in the 1990s, I thought I was living in the future. My first car had everything I thought I could ever need: a built-in radio, lighting when you opened the door, windows you could roll down with a crank, a clock and even air-conditioning for those really hot days growing up on the East Coast.

That car is long gone, but my passion for driving things forward lives on, which is why I’m excited to share how Mercedes-Benz is using Microsoft AI capabilities to enhance experiences for some drivers today.

As the last six months have shown us, the power of generative AI goes beyond cutting-edge language models—it’s what you build with it that matters most. Our Azure OpenAI Service lets companies tap into the power of the most advanced AI models (OpenAI’s GPT-4, GPT-3.5, and more) combined with Azure’s enterprise capabilities and AI-optimized infrastructure to do extraordinary things.

Mercedes-Benz takes in-car voice control to a new level with Azure OpenAI Service


Today, Mercedes-Benz announced they are integrating ChatGPT via Azure OpenAI Service to transform the in-car experience for drivers. Starting June 16, drivers in the United States can opt into a beta program that makes the MBUX Voice Assistant’s “Hey Mercedes” feature even more intuitive and conversational. Enhanced capabilities include:

◉ Elevated voice command and interaction: ChatGPT enables more dynamic conversations, allowing customers to experience a voice assistant that not only understands voice commands but also engages in interactive conversations.

◉ Expanded task capability: Whether users need information about their destination, a recipe, or answers to complex questions, the enhanced voice assistant will provide comprehensive responses, allowing drivers to keep their hands on the wheel and eyes on the road.

◉ Contextual follow-up questions: Unlike standard voice assistants that often require specific commands, ChatGPT excels at handling follow-up questions and maintaining contextual understanding. Drivers can ask complex queries or engage in multi-turn conversations, receiving detailed and relevant responses from the voice assistant.

◉ Integration with third-party services: Mercedes-Benz is exploring the ChatGPT plugin ecosystem, which would open up possibilities for integration with various third-party services. This could enable drivers to accomplish tasks like restaurant reservations, movie ticket bookings, and more, using natural speech commands, further enhancing convenience and productivity on the road.
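Contextual follow-ups like these generally work by resending the accumulated conversation on each turn, so the model can resolve references such as "there" or "it". Here is a minimal Python sketch of that pattern; the deployment name in the commented-out call is a placeholder, and the call itself is not executed here:

```python
# Sketch of multi-turn context handling: the full message history is resent
# on every turn so the model can resolve follow-up references.

def add_turn(history, role, content):
    """Append one turn to the running conversation history."""
    return history + [{"role": role, "content": content}]

history = [{"role": "system", "content": "You are an in-car voice assistant."}]
history = add_turn(history, "user", "Find an Italian restaurant nearby.")
history = add_turn(history, "assistant", "Trattoria Roma is 2 km away.")
# Follow-up question: because prior turns are included, "there" resolves.
history = add_turn(history, "user", "Book a table there for two at 7pm.")

# A real request (not executed here) would pass the whole history, e.g. with
# the 2023-era openai library against an Azure OpenAI deployment:
#   openai.ChatCompletion.create(engine="<your-deployment>", messages=history)
print(len(history))  # 4
```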

With the three-month beta program, Mercedes-Benz customers can become early adopters of this groundbreaking technology. Based on the findings of the beta program and customer feedback, Mercedes-Benz will consider further integration of this technology into future iterations of their MBUX Voice Assistant while maintaining the highest standards of customer privacy on and off the road.

With Microsoft, Mercedes-Benz is paving the way for a more connected, intelligent, and personalized driving experience, and accelerating the automotive industry through AI.

In case you missed it, at Microsoft Build we recently announced updates to Azure OpenAI Service to help you more easily and responsibly deploy generative AI capabilities powered by Azure. You can now:

◉ Use your own data (coming to public preview later this month), allowing you to create more customized, tailored experiences based on organizational data.

◉ Add plugins to simplify integrating external data sources with APIs.

◉ Reserve provisioned throughput (generally available with limited access later this month) to gain control over the configuration and performance of OpenAI’s large language models at scale.

◉ Create safer online environments and communities with Azure AI Content Safety, a new Azure AI service integrated into Azure OpenAI Service and Azure Machine Learning prompt flow that helps detect and remove content in prompts and generations that doesn’t meet content management standards.

A responsible approach


Microsoft has a layered approach for generative models, guided by Microsoft’s responsible AI principles. In Azure OpenAI Service, an integrated safety system provides protection from undesirable inputs and outputs and monitors for misuse. In addition, Microsoft provides guidance and best practices for customers to responsibly build applications using these models and expects customers to comply with the Azure OpenAI Code of Conduct. With OpenAI’s GPT-4, new research advances from OpenAI have enabled an additional layer of protection. Guided by human feedback, safety is built directly into the GPT-4 model, which enables the model to be more effective at handling harmful inputs, thereby reducing the likelihood that the model will generate a harmful response.

Source: microsoft.com

Thursday, 15 June 2023

Azure OpenAI Service: 10 ways generative AI is transforming businesses

Azure OpenAI Service, Azure Career, Azure Skills, Azure Jobs, Azure Preparation, Azure Certification

Technology is advancing at an unprecedented pace, and businesses are seeking innovative ways to maintain a competitive edge. Nowhere is this truer than in the realm of generative AI. From generating realistic images and videos to enhancing customer experiences, generative AI has proven to be a versatile tool across various industries. In this article, we explore 10 ways businesses are utilizing this game-changing technology to transform their operations and drive growth.

1. Content creation and design: Effective content creation and design are crucial for attracting and engaging customers.

Generative AI enables businesses to create visually appealing and impactful content quickly and efficiently, helping them stand out in a crowded marketplace. It has revolutionized content creation by generating high-quality images, videos, and graphics: from designing logos and product visuals to creating engaging social media content, businesses are using generative AI algorithms to automate the creative process, saving time and resources.

The company Typeface ingests information about the brand, including style guidelines, images, and product details. Then, with just a few clicks, customers can generate an assortment of suggested images and text—pre-defined in templates for different use cases—that employees can select and customize for use in an online campaign, marketing email, blog post, or anywhere the company wants to use it.

2. Accelerated automation: Automating IT tasks improves employee experiences, enhances customer interactions, and drives more efficiency within a company’s developer community.

Providing employees with reliable automated support leads to increased efficiency, improved work life, and reduced operational costs.

AT&T is using Azure OpenAI Service to enable IT professionals to request resources like additional virtual machines; migrate legacy code into modern code; and empower employees to complete common human resources tasks, such as changing withholdings, adding a dependent to an insurance plan, or requisitioning a computer for a new hire.

3. Personalized marketing: Personalization increases the chances of customer engagement and conversion and can significantly improve marketing ROI.

Generative AI enables businesses to deliver hyper-personalized marketing campaigns. By analyzing customer data, generative algorithms can create dynamic content tailored to an individual’s preferences—optimizing engagement and conversion rates.

Through the Take Blip platform and Azure OpenAI Service, brands can have one-on-one conversations that include an infinite flow of interactions with each customer. Interactions are digitized: customers’ requests, intentions, and desires can be recorded and used to tune the platform, making future interactions much more productive.

4. Chatbots and virtual assistants: Chatbots and virtual assistants powered by generative AI provide instant and accurate responses to customer queries.

These intelligent systems can understand and respond to customer queries, provide recommendations, and offer personalized support—enhancing customer service, reducing wait times, improving operational efficiency, and boosting customer satisfaction and loyalty.

By using a common chatbot framework along with the Azure Bot Services, Johnson & Johnson employees without technical training can now build their own bots to serve their teams and customers at a fraction of the time and cost it took to develop previous chatbot projects.

5. Product and service innovation: Staying innovative and meeting evolving customer demands is essential for business success.

“Local reporters used to be specialists: they focused their time on investigation and writing. Today, they need to be generalists who can create both written and video content and who know how to maximize viewership on Facebook, Instagram, TikTok, YouTube, and potentially many other distribution channels.”

Nota has used Microsoft Azure OpenAI Service to build two AI-assisted tools—SUM and VID. These tools do a lot of the heavy lifting needed to optimize stories for distribution and turn written pieces into engaging videos that can produce up to 10 times as much revenue as written pieces.

6. Language translation and natural language processing: In a globalized world, language barriers can hinder communication and business growth.

Generative AI has improved language translation and natural language processing capabilities. Businesses can use generative models to accurately translate content in real time, enabling seamless communication across borders and bridging language barriers.

Microsoft Azure AI services augment HelloTalk’s AI learning tools and technical capabilities, allowing users to connect with the world through language and culture exchange.

7. Fraud detection and cybersecurity: Businesses face constant threats from fraudsters and cyberattacks.

By analyzing patterns and anomalies in large datasets, businesses can leverage generative models to detect and prevent fraud, safeguard sensitive information, and protect their digital assets.

Using federated learning techniques along with Azure Machine Learning and Azure confidential computing, Swift and Microsoft are building an anomaly detection model for transactional data—all without copying or moving data from secure locations.
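To illustrate the underlying idea (not Swift's actual model, which applies federated learning over data held in secure locations), here is a toy Python sketch that flags transactions far from the historical mean:

```python
# Toy anomaly-detection sketch: flag amounts more than `threshold` standard
# deviations from the mean of the observed transactions. Real systems use
# far richer models; this only illustrates the pattern-vs-anomaly idea.
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    """Return the amounts that deviate strongly from the historical mean."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) > threshold * sigma]

transactions = [102, 98, 110, 95, 105, 99, 101, 5000]
print(flag_anomalies(transactions))  # [5000]
```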

8. Predictive analytics and forecasting: Accurate predictions and forecasting are vital for effective decision-making and operational efficiency.

Generative AI models excel in predictive analytics and forecasting. By analyzing historical data and identifying patterns, businesses can leverage generative algorithms to make accurate predictions and informed decisions, optimizing supply chain management, inventory forecasting, and demand planning.
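As a minimal sketch of the "learn from history, predict forward" pattern behind demand planning, here is a simple moving-average forecast in Python; production forecasting models are of course far richer:

```python
# Toy forecasting sketch: predict the next period's demand as the mean of
# the most recent `window` observations.

def moving_average_forecast(history, window=3):
    """Predict the next value from the last `window` observations."""
    recent = history[-window:]
    return sum(recent) / len(recent)

monthly_units = [120, 130, 125, 140, 150, 160]
print(moving_average_forecast(monthly_units))  # (140 + 150 + 160) / 3 = 150.0
```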

Azure IoT helps Husky meet their system performance needs and maintain service levels for their customers. It scales quickly as they onboard new Advantage+Elite customers and reduces the time and resources spent on infrastructure maintenance.

9. Creative writing and content generation: Content generation can be time-consuming and resource-intensive.

Generative AI algorithms automate the content creation process, allowing businesses and content creators to quickly generate articles, blog posts, advertising copy, and other written materials, saving time for creators and ensuring a consistent flow of fresh, engaging content for audiences.

With Azure OpenAI Service, CarMax is creating content for its website much more efficiently, freeing up its editorial staff to focus on producing strategic, longer-form pieces that require more insight. Letting Azure OpenAI Service take care of data-heavy summarization tasks gives them time to be more creative and feel more fulfilled.

10. Medical research and diagnosis: The healthcare industry depends on fast, accurate diagnoses to improve patient outcomes.

Researchers can utilize generative models to analyze medical images, detect abnormalities, and aid in the development of new treatments. Additionally, generative AI algorithms can assist in diagnosing diseases by analyzing patient symptoms and medical records, potentially leading to more accurate and timely diagnoses.

At Cambridgeshire and Peterborough NHS Foundation Trust, a single patient’s case notes can run to 2,000 documents. In the past, information stored 1,600 documents back was effectively unfindable. Now, using Azure Cognitive Search, it takes as little as three seconds to find a keyword across those 2,000 documents.
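Keyword search at that speed typically rests on an inverted index: the index maps each term to the documents containing it, so a lookup never scans the documents themselves. The toy sketch below is not the Azure Cognitive Search implementation, and the document IDs and text are invented:

```python
from collections import defaultdict

def build_index(documents):
    """Map each word to the set of document IDs containing it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

notes = {
    1: "patient reports mild headache",
    2: "prescribed medication reviewed",
    3: "headache resolved after treatment",
}
index = build_index(notes)
print(sorted(index["headache"]))  # -> [1, 3]
```

Because the lookup is a dictionary access rather than a scan, query time stays roughly constant whether the corpus holds 20 documents or 2,000.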

Each of the 10 ways mentioned above addresses significant challenges and opportunities facing businesses today. Azure OpenAI Service empowers businesses to streamline processes, enhance customer experiences, drive innovation, and make data-driven decisions—resulting in improved efficiency, profitability, and competitiveness. In the case of generative AI, what’s good for business is also good for its customers. By leveraging the power of machine learning and generative algorithms, businesses can improve customer experiences while also gaining a competitive advantage in today’s rapidly evolving digital landscape.

Our commitment to responsible AI


Microsoft has a layered approach for generative models, guided by Microsoft’s responsible AI principles. In Azure OpenAI Service, an integrated safety system provides protection from undesirable inputs and outputs and monitors for misuse. In addition, Microsoft provides guidance and best practices for customers to responsibly build applications using these models and expects customers to comply with the Azure OpenAI code of conduct. With GPT-4, new research advances from OpenAI have enabled an additional layer of protection.

Source: microsoft.com

Tuesday, 13 June 2023

Implement digital transformation with the Azure Data Manager for Energy Developer tier


Microsoft launched Azure Data Manager for Energy, a fully managed OSDU™ Data Platform offering, to help energy companies foster and accelerate their digital innovation. Azure Data Manager for Energy helps geoscientists better find, secure, and scale the use cases of their most valuable subsurface data. Customers can now easily deploy a cluster of curated and managed resources to meet their unique data management needs. This enterprise-grade data platform breaks down data silos with the power, security, and scalability of an integrated and open cloud, built in alignment with the OSDU Technical Standard, which simplifies implementation and speeds time to market.

Since the launch of Azure Data Manager for Energy, our customers and partners have emphasized the importance of being able to create new applications and strategically migrate data in a cost-effective manner. Independent software vendors and energy companies at the cutting edge of innovation require a smaller configuration of Azure Data Manager for Energy that reduces the cost barriers to developing and testing new subsurface solutions on the OSDU Data Platform.

Today, we are pleased to announce the upcoming “Developer” tier for Azure Data Manager for Energy. Starting mid-June 2023, Azure Data Manager for Energy will be available in two tiers: Standard and Developer. The new tier is designed for software vendors and end users looking for more flexibility and speed in building out new applications and testing more OSDU Data Platform-backed solutions. The Developer tier will provide users with the same high bar of security features and application integration services as the Standard tier. With this tier, we aim to reduce the barriers to innovation and enable customers and partners to select the Azure Data Manager for Energy tier that meets their business needs.

Begin your OSDU Data Platform journey with the Developer tier


Start with the Developer tier and then expand and scale to the Standard tier of Azure Data Manager for Energy. If your organization is in the evaluation stage of the OSDU Data Platform or building out a pre-production environment for testing purposes, we recommend using the Developer tier. The Developer tier offers functionality similar to the Standard tier at a lower cost and with reduced resource capacity, so organizations can isolate and manage their test and production environments more cost-effectively.

With the Developer tier, you can start your Azure Data Manager for Energy journey with a small number of users to mitigate cost and risk during early phases of exploration, planning, and proof of concept. In addition, the tier offers you the flexibility to evaluate and plan implementation, build demos for business use cases, as well as define your deployment pipeline. You can also validate that applications are compatible with Azure Data Manager for Energy, validate security features such as customer-managed encryption keys, or plan your broader data migration strategy. Organizations can also leverage the Developer tier for data sensitivity classification for data they want to keep strictly confidential in a separate instance. Customers can also seamlessly test a new OSDU Data Platform release, validate data ingestion scenarios, incorporate new third-party or in-house applications, validate service updates, or test API functionality—all without disrupting the production platform.

Once you move further along in your digitization journey, you can leverage the Developer tier as your pre-production environment for testing and implementing large data changes, or for validating the impact of any service upgrade before moving to production. In addition, you can utilize the Developer tier as a secure and cost-effective way to manage access rights as you execute joint ventures, allowing you to keep your data and systems completely separated and reduce the likelihood of data pollution scenarios.

When to use the Standard tier 


Once an organization is ready to operationalize domain workflows (such as seismic and well log) or deploy tested predictive reservoir models to a production environment on the cloud, the existing Standard tier is recommended. The Standard tier has higher database throughput and offers a service-level agreement, multi-region backup, and disaster recovery resources to support production workloads. It is targeted at production scenarios that require high availability, reliability, scale, and Azure’s enterprise promises. You can leverage the Standard tier of Azure Data Manager for Energy for full-scale production use cases, such as running subsurface models or migrating seismic data across multiple applications. It supports all the use cases of the Developer tier with the addition of:

◉ Availability zones 
◉ Backup and disaster recovery 
◉ Service level agreement 
◉ Higher database throughput
◉ Higher data partition maximum relative to the Developer tier

Azure support will be available for both tiers with the highest priority, response times, and severity levels provided to the Standard tier.

Figure 1: Azure Data Manager for Energy tier comparison.

How to participate 


Once available, you will be able to easily create a Developer tier resource by going to the Azure Marketplace or the Azure portal create experience and selecting your desired tier. Get started creating new subsurface innovations with your implementation of the OSDU Data Platform and the Azure Data Manager for Energy Developer tier by reaching out to your account team.

Figure 2: Developer tier creation page.

Source: microsoft.com

Thursday, 8 June 2023

Azure Virtual WAN now supports full mesh secure hub connectivity

In May 2023, we announced the general availability of Routing intent and routing policies for all Virtual WAN customers. This feature is powered by the Virtual WAN routing infrastructure and enables Azure Firewall customers to set up policies for private and internet traffic. We are also extending the same routing capabilities to all Firewall solutions deployed within Azure Virtual WAN including Network Virtual Appliances and software-as-a-service (SaaS) solutions that provide Firewall capabilities.

Routing intent also unlocks two secured hub use cases: users can secure traffic between Virtual WAN hubs, and inspect traffic between different on-premises networks (branch, ExpressRoute, SD-WAN) that transits through Virtual WAN hubs.

Azure Virtual WAN (vWAN), networking-as-a-service, brings networking, security, and routing functionalities together to simplify networking in Azure. With ease of use and simplicity built in, vWAN is a one-stop shop to connect, protect, route traffic, and monitor your wide area network.

In this blog, we will first describe routing intent use cases, product experiences, and summarize with some additional considerations and resources for using routing intent with Virtual WAN.

Use cases for Virtual WAN


You can use Routing Intent to engineer traffic within Virtual WAN in multiple ways. Here are the main use cases:

Apply routing policies for Virtual Networks and on-premises


Customers implementing hub-and-spoke network architectures with large numbers of routes often find their networks hard to understand, maintain, and troubleshoot. In Virtual WAN, these routes can be simplified for traffic between Azure Virtual Networks and on-premises (ExpressRoute, VPN, and SD-WAN).

Virtual WAN makes this easier by allowing customers to configure simple and declarative private routing policies. It is assumed that private routing policies will be applied for all Azure Virtual Networks and on-premises networks connected to Virtual WAN. Further customizations for Virtual Network and on-premises prefixes are currently not supported. Private routing policies instruct Virtual WAN to program the underlying routing infrastructure to enable transit between two different on-premises networks (1) via a security solution deployed in the Virtual Hub. They also enable traffic transiting between two Azure Virtual Networks (2), or between an Azure Virtual Network and an on-premises endpoint (3), via a security solution deployed in the Virtual Hub. The same traffic use cases are supported for Azure Firewall, Network Virtual Appliances, and software-as-a-service solutions deployed in the hub.

Figure 1: Diagram of a Virtual Hub showing sample private traffic flows (between on-premises and Azure).

Apply routing policies for internet traffic


Virtual WAN lets you set up routing policies for internet traffic in order to advertise a default (0.0.0.0/0) route to your Azure Virtual Networks and on-premises. Internet traffic routing configurations allow you to configure Azure Virtual Networks and on-premises networks to send internet outbound traffic (1) to security appliances in the hub. You can also leverage Destination-Network Address Translation (DNAT) features of your security appliance if you want to provide external users access to applications in an Azure Virtual Network or on-premises (2).
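The effect of advertising a 0.0.0.0/0 default route can be illustrated with a longest-prefix-match lookup: any destination without a more specific route falls through to the security appliance in the hub. The route table and next-hop names below are hypothetical, chosen only to show the mechanism:

```python
import ipaddress

# Simplified route table: (prefix, next hop). The 0.0.0.0/0 default route
# models the internet routing policy pointing at the hub's firewall.
routes = [
    (ipaddress.ip_network("10.0.0.0/16"), "virtual-network"),
    (ipaddress.ip_network("192.168.0.0/24"), "on-premises"),
    (ipaddress.ip_network("0.0.0.0/0"), "hub-firewall"),
]

def next_hop(destination):
    """Longest-prefix match: the most specific matching route wins."""
    dest = ipaddress.ip_address(destination)
    matches = [(net, hop) for net, hop in routes if dest in net]
    return max(matches, key=lambda m: m[0].prefixlen)[1]

print(next_hop("10.0.4.7"))     # -> virtual-network
print(next_hop("203.0.113.9"))  # -> hub-firewall
```

Because 0.0.0.0/0 matches everything but has prefix length zero, it only wins when no narrower route applies, which is exactly how internet-bound traffic ends up at the hub firewall while private traffic keeps its specific paths.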

Figure 2: Diagram of a Virtual Hub showing internet outbound and inbound DNAT traffic flows.

Apply routing policies for inter-hub cross-region traffic


Virtual WAN automatically deploys all Virtual Hubs across your Virtual WAN in a full mesh, providing zero-touch any-to-any connectivity region-to-region and hub-to-hub using the Microsoft global backbone. Routing policies program Virtual WAN to inspect inter-hub and inter-region traffic between two Azure Virtual Networks (1), between two on-premises (2), and between Azure Virtual Networks and on-premises (3) connected to different hubs. Every packet entering or leaving the hub is routed to the security solution deployed in the Virtual Hub before being routed to its final destination.
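A full mesh means one hub-to-hub link per pair of hubs, so the link count grows as n*(n-1)/2 for n hubs. A quick sketch makes the arithmetic concrete (the hub names are illustrative):

```python
from itertools import combinations

hubs = ["eastus", "westeurope", "southeastasia", "uksouth"]

# Full mesh: one link for every unordered pair of hubs.
links = list(combinations(hubs, 2))
print(len(links))  # -> 6, i.e. 4 * 3 / 2
```

Virtual WAN builds and maintains this mesh automatically, which is the "zero-touch" part: adding a fifth hub would bring the total to 10 links without any manual peering.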

Figure 3: Diagram of inter-region and inter-hub traffic flows inspected by security solutions in the hub.

User experience for routing intent


To use routing intent, navigate to your Virtual WAN hub. Under Routing, select Routing Intent and routing policies.

Configure an Internet or Private Routing Policy to send traffic to a security solution deployed in the hub by selecting the next hop type (Azure Firewall, Network Virtual Appliance, or SaaS solution) and corresponding next hop resource.

Figure 4: Example configuration of routing intent with both Private and Internet routing policy in Virtual WAN Portal.

Azure Firewall customers can also configure routing intent using Azure Firewall Manager by enabling the ‘inter-hub’ setting.

Figure 5: Enabling Routing Intent through Azure Firewall Manager.

After configuring routing intent, you can view the effective routes of the security solution by navigating to your Virtual Hub, then select Routing, and click Effective Routes. The effective routes of the security solution provide additional visibility to troubleshoot how Virtual WAN routes traffic that has been inspected by the Virtual hub’s security solution.

Figure 6: View of getting the effective routes on a security solution deployed in the hub.

Before you get started with this feature, here are some key considerations:

1. The feature caters to users who consider Virtual Network and on-premises traffic as private traffic. Virtual WAN applies private routing policies to all Virtual Network and on-premises traffic.

2. Routing intent is mutually exclusive with custom routing and with static routes in the ‘defaultRouteTable’ pointing to a Network Virtual Appliance (NVA) deployed in a Virtual Network spoke connected to Virtual WAN. As a result, topologies that rely on custom route tables or on an NVA in a spoke are not compatible with routing intent.

3. Routing Intent advertises prefixes corresponding to all connections to Virtual WAN towards on-premises networks. Users may use Route Maps to summarize and aggregate routes and filter based on defined match conditions.
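Route summarization of the kind Route Maps perform can be sketched with the standard library: contiguous prefixes collapse into a single aggregate before being advertised toward on-premises. The prefixes here are example values only:

```python
import ipaddress

# Four contiguous /24 prefixes advertised by spoke Virtual Networks...
prefixes = [ipaddress.ip_network(f"10.1.{i}.0/24") for i in range(4)]

# ...summarize into a single /22 aggregate.
summarized = list(ipaddress.collapse_addresses(prefixes))
print(summarized)  # -> [IPv4Network('10.1.0.0/22')]
```

Advertising one /22 instead of four /24s keeps on-premises routing tables small, which matters when devices have limits on the number of routes they accept.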

Source: microsoft.com

Tuesday, 6 June 2023

Increase gaming performance with NGads V620-series virtual machines


Gaming customers across the world tend to look for the same critical components when choosing their playing environment: Performance, Affordability, and Timely Content. And for gaming in the cloud, there’s a fourth: Reliability.

With these clear guidelines in mind, we are excited to announce the public preview of our new NGads V620-series virtual machines (VMs). This VM series has GPU, CPU, and memory resources balanced to generate and stream high-quality graphics for a performant, interactive gaming experience hosted on Microsoft Azure. The new NGads instances give online gaming providers the power and stability that they need, at an affordable price.   

The NGads V620-series are GPU-enabled virtual machines powered by AMD Radeon PRO V620 GPU and AMD EPYC 7763 CPUs. The AMD Radeon PRO V620 GPUs have a maximum frame buffer of 32GB which can be divided up to 4 ways through hardware partitioning, or by providing multiple users with access to shared, session-based operating systems such as Windows Server 2022 or Windows 11 EMS. The AMD EPYC CPUs have a base clock speed of 2.45 GHz and a boost speed of 3.5 GHz. VMs are assigned full cores instead of threads, enabling full access to AMD’s powerful Zen 3 cores.

NGads instances come in four sizes, allowing customers to right-size their gaming environments for the performance and cost that best fits their business needs.

The two smallest instances rely on industry-standard SR-IOV technology to partition the GPUs into one-fourth and one-half instances, enabling customers to run workloads with no interference or security concerns between users sharing the same physical graphics card.

The VMs also feature the AMD Software Cloud Edition, which targets the same optimizations available in the consumer gaming version of the Adrenalin driver but is further tested and optimized for the cloud environment.

Instance Config            | vCPU (Physical Cores) | GPU Memory (GiB) | GPU Partition Size | Memory (GiB) | Azure Network (Gbps)
---------------------------|-----------------------|------------------|--------------------|--------------|---------------------
Standard_NG8ads_V620_v1    | 8                     | 8                | ¼ GPU              | 16           | 10
Standard_NG16ads_V620_v1   | 16                    | 16               | ½ GPU              | 32           | 20
Standard_NG32ads_V620_v1   | 32                    | 32               | 1x GPU             | 64           | 40
Standard_NG32adms_V620_v1  | 32                    | 32               | 1x GPU             | 176          | 40

The NGads V620-series VMs will support a new AMD Cloud Software driver that comes in two editions: A Gaming driver with regular updates to support the latest titles, as well as a Professional driver for accelerated Virtual Desktop environments, with Radeon PRO optimizations to support high-end workstation applications.

Microsoft Azure, do more with less


Deployment in Azure enables gaming and desktop providers to take advantage of the infrastructure investments put in place by Microsoft in data centers across the world. This gives our customers the ability to only pay for what they use. They can depend on an infrastructure framework that is constantly kept up to date with highly reliable uptime. Customers can innovate faster to differentiate their offerings and provide customers with a richer experience. As our customers’ business needs expand, they can benefit from the economies of scale available from Azure. In addition, customers can build a more complete and robust solution through integration with the broad range of cloud services for storage, networking, and application management available as part of the Azure offerings.

Flexible workloads, flexible costs


High-performance GPU-accelerated workloads have always ranged from workstation design apps to VDI and simulation rendering. Each of these has the potential to tax even powerful graphics boards. Gaming workloads bring the additional challenges of requiring very fast graphics remoting—the interactive transfer of graphics and user controls over the internet. Further, there is a wide variety of games, connection types, and resolutions available to the user.

The NGads V620-series helps resolve these challenges by providing support for a range of visualization applications so that gaming or desktop service providers can optimize for precisely the experiences expected by the end users. Service provider customers can choose the right-sized VM that will best serve their needs without over-allocating resources. As the needs of their offering change, the common software support across VMs allows service providers to shift to a VM size with either a higher or lower GPU partition, or to shift capacity to other regions of the world as their business footprint expands.

Performance powered by AMD GPU and CPU


The NGads V620-series combines AMD Radeon™ GPU and Epyc™ CPU technology to provide a powerful and well-balanced environment for hosting rich and highly-interactive cloud services. 

The AMD Radeon PRO V620 GPU is based on AMD’s RDNA™ 2 Architecture, AMD Software, and AMD Graphics Virtualization technology. 

Each AMD Radeon PRO V620 GPU is equipped with 32GB of GDDR6 dedicated memory, a 256-bit memory interface with up to 512GB/s bandwidth, and ECC support for data correction. To enhance the user experience, they are designed with hardware ray tracing using 72 Ray Accelerators, 4608 Stream Processors, and a peak Engine Clock of 2200 MHz.

The AMD software supports the DirectX® 12.0, OpenGL® 4.6, OpenCL™ 2.2, and Vulkan® 1.1 APIs for broad compatibility with gaming and graphics applications. This enables the NG-series VMs to support a very broad range of workloads, from cloud gaming and GPU-enhanced VDI to GPU-intensive Workstation-as-a-Service solutions.

The NGads V620-series uses GPU partitioning to virtualize the GPU and provide partitions of the full 32GB memory size (1x GPU), 16GB (one-half GPU), or 8GB (one-fourth GPU). Azure GPU partitioning is based on the PCIe-standard SR-IOV extension, which provides a highly predictable and secure method to host multiple independent user environments on the same hardware GPU board.
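The partition sizes follow directly from the 32GB frame buffer; a one-liner makes the arithmetic explicit:

```python
FULL_GPU_GIB = 32  # frame buffer of one AMD Radeon PRO V620

# Memory available at each partition fraction: full, half, quarter.
partitions = {fraction: int(FULL_GPU_GIB * fraction) for fraction in (1, 0.5, 0.25)}
print(partitions)  # -> {1: 32, 0.5: 16, 0.25: 8}
```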

The AMD EPYC 7763 CPU is built on the 7nm process technology, featuring AMD Zen 3 cores, Infinity Architecture, and the AMD Infinity Guard suite of security features. The AMD EPYC CPUs have a base clock speed of 2.45GHz and a boost clock speed of 3.5 GHz to allow the user to take advantage of a single powerful core when required by the application.

Source: microsoft.com

Saturday, 3 June 2023

Microsoft Build 2023: Innovation through Microsoft commercial marketplace


As we look forward to Microsoft Build 2023, I am inspired by the innovation coming from our ISV partners and SaaS providers building on the Microsoft Cloud.

In the past year, we’ve seen large-scale, generative AI models support the creation of new capabilities that expand our vision of the possible, improve productivity, and ignite creativity. The general availability of Azure OpenAI Service is helping developers apply these models to a variety of use cases such as natural language understanding, writing assistance, code generation, data reasoning, content summarization, and semantic search. With Azure’s enterprise-grade security and built-in responsible AI, the rate of innovation is growing exponentially.

Making new strides in AI


The Microsoft commercial marketplace makes it possible for customers to find, purchase, and deploy innovative applications and services to drive their business outcomes. At Microsoft Build 2023, we’re proud to highlight several partners with AI solutions available in the marketplace:

Orkes empowers developers to easily build reliable and secure AI applications, tools, and integrations on Azure with the Conductor open source microservices orchestration platform. With built-in elastic scaling and reliability, teams can more quickly bring applications to market.

Run:ai helps companies deliver AI faster and bridge the gap between data science and computing infrastructure by providing a high-performance compute virtualization layer for deep learning, which accelerates the training of neural network models and enables the development of large AI models to help organizations in every industry accelerate AI innovation.

Statsig allows any company to experiment like big tech at a fraction of the cost. With advanced feature management tools such as automated A/B testing and integrated product analytics, developers can use data insights to learn faster and build better products.

Explore security solutions with our partners


As AI is experiencing rapid growth, security has never been more important. Companies of all sizes and across every industry are increasing their investments in cybersecurity. Partners specializing in security solutions that run on the Microsoft Cloud help customers reduce costs, close coverage gaps, and prevent even the most sophisticated attacks.

At Microsoft Build 2023, we’re excited to feature select partners with security solutions offered in the marketplace:

◉ Anjuna is a multi-cloud confidential computing platform for complete data security and privacy, featuring a unique trusted execution environment that leverages hardware-level isolation to intrinsically secure data and code in the cloud so enterprises can run applications inside Azure Confidential Computing instances in minutes without code changes.

◉ Kovrr transforms cyber security data into actionable, financially quantified cyber risk mitigation recommendations to manage enterprise cyber risk exposure, inform which security controls to invest in, and provide insights into how to optimize cyber insurance and capital management strategies.

◉ Noname Security protects APIs from attacks in real-time while detecting vulnerabilities and misconfigurations before they are exploited, offering deeper visibility and security than API gateways, load balancers, and web application firewalls (WAFs) without requiring agents or network modifications.

Manage your cloud portfolio with the Microsoft commercial marketplace


The Microsoft commercial marketplace continues to grow and is becoming customers’ preferred method for managing their entire cloud portfolio.

Through the marketplace, customers can search across thousands of applications and services in a single catalog, creating a one-stop destination for all cloud needs including AI, security, data, infrastructure, and more. Solutions available on the marketplace are validated for compatibility with Microsoft applications, ensuring that customers can buy with confidence and deploy seamlessly on Azure.

For customers with enterprise agreements, purchases can be added directly to an Azure bill, simplifying the purchasing process and reducing the number of vendors to be paid separately. For organizations with a cloud consumption commitment, the entire purchase can count towards remaining commitment. Thousands of applications in the marketplace are eligible to count towards an Azure commitment, including the solutions highlighted above—Orkes, Run:ai, Statsig, Anjuna, Kovrr, and Noname Security. With the Microsoft commercial marketplace, customers can get the innovative solutions needed to stay ahead in a competitive market while maximizing the value of cloud investments.

Source: microsoft.com

Thursday, 1 June 2023

Build next-generation, AI-powered applications on Microsoft Azure


The potential of generative AI is much bigger than any of us can imagine today. From healthcare to manufacturing to retail to education, AI is transforming entire industries and fundamentally changing the way we live and work. At the heart of all that innovation are developers, pushing the boundaries of possibility and creating new business and societal value even faster than many thought possible. Trusted by organizations around the world with mission-critical application workloads, Azure is the place where developers can build with generative AI securely, responsibly, and with confidence.

Welcome to Microsoft Build 2023—the event where we celebrate the developer community. This year, we’ll dive deep into the latest technologies across application development and AI that are enabling the next wave of innovation. First, it’s about bringing you state-of-the-art, comprehensive AI capabilities and empowering you with the tools and resources to build with AI securely and responsibly. Second, it’s about giving you the best cloud-native app platform to harness the power of AI in your own business-critical apps. Third, it’s about the AI-assisted developer tooling to help you securely ship the code only you can build.

We’ve made announcements in all key areas to empower you and help your organizations lead in this new era of AI.

Bring your data to life with generative AI


Generative AI has quickly become the generation-defining technology shaping how we search and consume information every day, and it’s been wonderful to see customers across industries embrace Microsoft Azure OpenAI Service. In March, we announced the preview of OpenAI’s GPT-4 in Azure OpenAI Service, making it possible for developers to integrate custom AI-powered experiences directly into their own applications. Today, OpenAI’s GPT-4 is generally available in Azure OpenAI Service, and we’re building on that announcement with several new capabilities you can use to apply generative AI to your data and to orchestrate AI with your own systems.


We’re excited to share our new Azure AI Studio. With just a few clicks, developers can now ground powerful conversational AI models, such as OpenAI’s ChatGPT and GPT-4, on their own data. With Azure OpenAI Service on your data, coming to public preview, and Azure Cognitive Search, employees, customers, and partners can discover information buried in the volumes of data, text, and images using natural language-based app interfaces. Create richer experiences and help users find organization-specific insights, such as inventory levels or healthcare benefits, and more.
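Grounding a model "on your own data" boils down to retrieving relevant snippets and constraining the prompt to them. Below is a minimal sketch of that prompt-construction step, not the Azure AI Studio implementation; the retrieved document text is invented:

```python
def grounded_prompt(question, documents):
    """Build a prompt that instructs the model to answer only from
    the retrieved context, rather than from its general training data."""
    context = "\n".join(f"- {doc}" for doc in documents)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}"
    )

retrieved = ["Warehouse A holds 1,240 units of SKU-17."]
prompt = grounded_prompt("How many units of SKU-17 are in stock?", retrieved)
print(prompt)
```

In practice the `retrieved` list would come from a search service such as Azure Cognitive Search, and the prompt would be sent to a chat completion endpoint; the grounding pattern is the same either way.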

To further extend the capabilities of large language models, we are excited to announce that Azure Cognitive Search will power vectors in Azure (in private preview), with the ability to store, index, and deliver search applications over vector embeddings of organizational data including text, images, audio, video, and graphs. Furthermore, support for plugins with Azure OpenAI Service, in private preview, will simplify integrating external data sources and streamline the process of building and consuming APIs. Available plugins include plugins for Azure Cognitive Search, Azure SQL, Azure Cosmos DB, Microsoft Translator, and Bing Search. We are also enabling a Provisioned Throughput Model, which will soon be generally available in limited access to offer dedicated capacity.
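Vector search ranks documents by how similar their embeddings are to the query embedding, most commonly using cosine similarity. A small hand-rolled sketch follows; the three-dimensional embeddings are toy values (real embeddings have hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query, embeddings):
    """Return the document ID whose embedding is most similar to the query."""
    return max(embeddings, key=lambda doc_id: cosine_similarity(query, embeddings[doc_id]))

embeddings = {"doc-a": [0.9, 0.1, 0.0], "doc-b": [0.1, 0.9, 0.2]}
print(nearest([0.8, 0.2, 0.1], embeddings))  # -> doc-a
```

A vector store like the one described above does the same comparison at scale, using indexes that avoid scoring every document for every query.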

Customers are already benefitting from Azure OpenAI Service today, including DocuSign, Volvo, Ikea, Crayon, and 4,500 others.

We continue to innovate across our AI portfolio, including new capabilities in Azure Machine Learning, so developers and data scientists can use the power of generative AI with their data. Foundation models in Azure Machine Learning, now in preview, empower data scientists to fine-tune, evaluate, and deploy open-source models curated by Azure Machine Learning, models from Hugging Face Hub, as well as models from Azure OpenAI Service, all in a unified model catalog. This will provide data scientists with a comprehensive repository of popular models directly within the Azure Machine Learning registry.

We are also excited to announce the upcoming preview of Azure Machine Learning prompt flow that will provide a streamlined experience for prompting, evaluating, tuning, and operationalizing large language models. With prompt flow, you can quickly create prompt workflows that connect to various language models and data sources. This allows for building intelligent applications and assessing the quality of your workflows to choose the best prompt for your case. See all the announcements for Azure Machine Learning.

It’s great to see momentum for machine learning with customers like Swift, a member-owned cooperative that provides a secure global financial messaging network, which is using Azure Machine Learning to develop an anomaly detection model with federated learning techniques, enhancing global financial security without compromising data privacy. We cannot wait to see what our customers build next.

Run and scale AI-powered, intelligent apps on Azure


Azure’s cloud-native platform is the best place to run and scale applications while seamlessly embedding Azure’s native AI services. Whether you want fine-grained control over your infrastructure or a fully managed experience, Azure keeps the focus on developer productivity either way.

Azure Kubernetes Service (AKS) offers you complete control and the quickest way to start developing and deploying intelligent, cloud-native apps in Azure, datacenters, or at the edge with built-in code-to-cloud pipelines and guardrails. We’re excited to share some of the most highly anticipated innovations for AKS that support the scale and criticality of applications running on it.

To give enterprises more control over their environment, we are announcing long-term support for Kubernetes that will enable customers to stay on the same release for two years—twice as long as what’s possible today. We are also excited to share that starting today, Azure Linux is available as a container host operating system platform optimized for AKS. Additionally, we are now enabling Azure customers to access a vibrant ecosystem of first-party and third-party solutions with easy click-through deployments from Azure Marketplace. Lastly, confidential containers are coming soon to AKS, as a first-party supported offering. Aligned with Kata Confidential Containers, this feature enables teams to run their applications in a way that supports zero-trust operator deployments on AKS.

Azure lets you choose from a range of serverless execution environments to build, deploy, and scale dynamically on Azure without the need to manage infrastructure. Azure Container Apps is a fully managed service that enables microservices and containerized applications to run on a serverless platform. We announced, in preview, several new capabilities that simplify serverless application development. Developers can now run Azure Container Apps jobs on demand, on a schedule, or in response to events, executing ad hoc tasks asynchronously to completion. This new capability lets the smaller executables within a complex job run in parallel, making it easier to run unattended batch jobs alongside your core business logic. With these advancements to our container and serverless products, we are making it seamless and natural to build intelligent cloud-native apps on Azure.
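The pattern these jobs enable, running smaller independent tasks in parallel until each completes, can be sketched in plain Python with the standard library; in Azure Container Apps each task would be its own containerized job execution rather than a local thread.

```python
# Toy illustration of fanning a batch job out into smaller parallel tasks.
# Locally this uses threads; a container apps job would run each task as a
# separate containerized execution.
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk: list[int]) -> int:
    """Stand-in for one small executable within a larger batch job."""
    return sum(chunk)

chunks = [[1, 2], [3, 4], [5, 6]]

# Run the per-chunk tasks in parallel and collect their results in order.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(process_chunk, chunks))

total = sum(results)
```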

Integrated, AI-based tools to help developers thrive


Making it easier to build intelligent, AI-embedded apps on Azure is just one part of the innovation equation. The other, equally important part is about empowering developers to focus more time on strategic, meaningful work, which means less toil on tasks like debugging and infrastructure management. We’re making investments in GitHub Copilot, Microsoft Dev Box, and Azure Deployment Environments to simplify processes and increase developer velocity and scale.

GitHub Copilot is the world’s first at-scale AI developer tool, helping millions of developers code up to 55 percent faster. Today, we announced new Copilot experiences built into Visual Studio, eliminating wasted time when getting started with a new project. We’re also announcing several new capabilities for Microsoft Dev Box, including new starter developer images and deeper integration of Visual Studio in Microsoft Dev Box, which accelerate setup time and improve performance. Lastly, we’re announcing the general availability of Azure Deployment Environments and support for HashiCorp Terraform in addition to Azure Resource Manager.

Enable secure and trusted experiences in the era of AI


When it comes to building, deploying, and running intelligent applications, security cannot be an afterthought—developer-first tooling and workflow integration are critical. We’re investing in new features and capabilities to enable you to implement security earlier in your software development lifecycle, find and fix security issues before code is deployed, and pair with tools to deploy trusted containers to Azure.

We’re pleased to announce GitHub Advanced Security for Azure DevOps, in preview soon. This new solution brings the three core features of GitHub Advanced Security to the Azure DevOps platform, so you can integrate automated security checks into your workflow. It includes code scanning powered by CodeQL to detect vulnerabilities, secret scanning to prevent the inclusion of sensitive information in code repositories, and dependency scanning to identify vulnerabilities in open-source dependencies and provide update alerts.
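To illustrate the kind of check secret scanning performs, here is a toy pattern-based scanner. The patterns below are simplified stand-ins; real secret scanning uses provider-verified patterns and additional heuristics well beyond this sketch.

```python
# Toy secret scanner: flag strings that look like credentials before they
# reach a repository. The patterns are illustrative, not production rules.
import re

SECRET_PATTERNS = {
    "github_pat": re.compile(r"ghp_[A-Za-z0-9]{36}"),
    "generic_api_key": re.compile(
        r"(?i)api[_-]?key\s*=\s*['\"][A-Za-z0-9]{16,}['\"]"
    ),
}

def scan(text: str) -> list[str]:
    """Return the names of any secret patterns found in the text."""
    return [name for name, pat in SECRET_PATTERNS.items() if pat.search(text)]

findings = scan('api_key = "abcd1234abcd1234abcd"')
```

A scanner like this would typically run as a pre-commit or pipeline gate, failing the check whenever `scan` returns any findings.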

While security is at the top of the list for any developer, using AI responsibly is no less important. For almost seven years, we have invested in a cross-company program to ensure our AI systems are responsible by design. Our work on privacy and the General Data Protection Regulation (GDPR) has taught us that policies aren’t enough; we need tools and engineering systems that help make it easy to build with AI responsibly. We’re pleased to announce new products and features to help organizations improve accuracy, safety, fairness, and explainability across the AI development lifecycle.

Azure AI Content Safety, now in preview, enables developers to build safer online environments by detecting and assigning severity scores to unsafe images and text across languages, helping businesses prioritize what content moderators review. It can also be customized to address an organization’s regulations and policies. As part of Microsoft’s commitment to responsible AI, we’re integrating Azure AI Content Safety across our products, including Azure OpenAI Service and Azure Machine Learning, to help users evaluate and moderate content in prompts and generated content.
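Applications typically act on the per-category severity scores the service returns. The function below is an illustrative sketch of such a thresholding policy; the category names and numeric scale are assumptions, so consult the service documentation for the actual response schema.

```python
# Sketch: mapping per-category severity scores (as returned by a content
# safety service) to a coarse moderation decision. Categories and the
# numeric scale here are illustrative assumptions.
def moderation_action(severities: dict[str, int], block_at: int = 4) -> str:
    """Map category severity scores to a moderation decision."""
    worst = max(severities.values(), default=0)
    if worst >= block_at:
        return "block"    # reject or escalate to a content moderator
    if worst > 0:
        return "review"   # low severity: queue for human review
    return "allow"

action = moderation_action({"hate": 0, "violence": 5})
```

Because the threshold is a parameter, each organization can tune the policy to its own regulations, which mirrors the customization the service offers.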

Additionally, the responsible AI dashboard in Azure Machine Learning now supports text and image data in preview. This means users can more easily identify model errors, understand performance and fairness issues, and provide explanations for a wider range of machine learning model types, including text and image classification and object detection scenarios. In production, users can continue to monitor their model and production data for model and data drift, perform data integrity tests, and make interventions with the help of model monitoring, now in preview.
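As a toy illustration of the kind of data-drift check model monitoring automates, the sketch below flags a production feature sample whose mean moves well outside the training baseline. Production monitors use much richer statistics (for example, population stability index or distributional distances), so treat this purely as a conceptual sketch.

```python
# Toy data-drift check: flag drift when the production mean of a feature
# moves more than `threshold` baseline standard deviations from the
# baseline mean. Real monitors use richer distributional statistics.
from statistics import mean, stdev

def drifted(baseline: list[float], production: list[float],
            threshold: float = 2.0) -> bool:
    """Return True when the production sample has drifted from baseline."""
    shift = abs(mean(production) - mean(baseline))
    return shift > threshold * stdev(baseline)

baseline = [10.0, 11.0, 9.0, 10.5, 9.5]   # training-time feature values
stable = [10.2, 9.8, 10.1]                # production sample, no drift
shifted = [25.0, 26.0, 24.0]              # production sample, clear drift
```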

We are committed to helping developers and machine learning engineers apply AI responsibly, through shared learning, resources, and purpose-built tools and systems.

Let’s write this history, together


AI is a massive shift in computing. Whether AI is part of your daily workflow or at the heart of the next-generation, intelligent cloud apps you build, this community of developers is leading the shift.

We are excited to bring Microsoft Build to you, especially this year as we go deep into the latest AI technologies, connect you with experts from within and outside of Microsoft, and showcase real-world solutions powered by AI.

Source: microsoft.com