Tuesday 25 June 2024

Leverage AI to simplify CSRD reporting


Organizations around the world are navigating complex reporting frameworks to meet sustainability goals. For companies working in—or with—the European Union, the Corporate Sustainability Reporting Directive (CSRD) is a sweeping set of requirements for non-financial public disclosures on environmental, social, and governance (ESG) topics. CSRD rules began rolling out in 2024, requiring over 11,000 companies to disclose non-financial information, and that number is estimated to grow by an additional 50,000 companies that are incorporated, listed, or doing business in the European Union. As companies work to comply, allocating the necessary time and resources is a challenge. Microsoft is developing solutions to address the diverse reporting needs of our customers and investing in our partners to create a variety of options that organizations can adopt.

To address this growing need, Microsoft Cloud for Sustainability data solutions in Microsoft Fabric (preview) can help organizations take sustainability data in any format and organize and normalize it for sustainability regulatory reporting. This quantitative data reporting approach is complemented by a joint solution from Accenture and Avanade that leverages generative AI to provide qualitative insights. Together, these enable organizations to manage workflows associated with multiple sustainability reporting frameworks globally, including CSRD, the Global Reporting Initiative (GRI), and the International Sustainability Standards Board (ISSB). Organizations can draw on both quantitative data from Microsoft Cloud for Sustainability solutions and qualitative data from Accenture and Avanade’s generative AI-powered solution.

The solution integrates seamlessly with Microsoft Sustainability Manager, offering a comprehensive platform for managing sustainability key performance indicators (KPIs) across different frameworks. The solution’s collaborative features, generative AI-enhanced insights, and streamlined data integration can help organizations simplify compliance-related processes. Meanwhile, the improved richness of its sustainability reporting enables organizations to take more effective actions to achieve their ESG targets.

Navigate the complexities of reporting frameworks


All ESG reporting frameworks carry their own complexities. However, the CSRD has been top of mind in 2024 for organizations in and outside of the European Union as they begin to gather, analyze, and report the required data. This is no simple task—the CSRD encompasses 12 standards and 82 reporting requirements, which amounts to approximately 500 KPIs and over 10,000 underlying data points. In addition to managing this large array of ESG data, companies face other significant challenges associated with CSRD compliance and data management, including reporting on the entire value chain rather than only on their own organization. The evolving nature of sustainability criteria and metrics further complicates the reporting process.

Navigating a multitude of reporting frameworks and intricate regulations necessitates extensive data gathering and assimilation. The process of ESG reporting often engages various departments and resources within an organization, introducing its own set of risks and costs. The more manual a process is, the more opportunities there are for errors to occur, and the complexities of each reporting framework require time, diligence, and accuracy. A capable solution can simplify the process and help generate accurate reports with fewer resources.

Accenture and Avanade’s generative AI-powered solution allows users to select from different reporting frameworks and adapts to the specific requirements of the chosen framework, displaying relevant categories and reporting structures. Users can access the breadth and depth of their data and translate it into the necessary reporting frameworks. This flexibility is crucial for organizations that are subject to multiple reporting obligations or need to adhere to international standards beyond local mandates. 


Streamlining data management using Microsoft Azure OpenAI Service


Accenture and Avanade’s solution addresses the dual challenges of qualitative and quantitative data in sustainability reporting, utilizing Azure OpenAI to enhance its reporting capabilities, especially for qualitative data input. By using Azure OpenAI to power an AI assistant, the solution offers a unified platform for sustainability reporting that simplifies the process of compiling CSRD reports and makes it easier for users to interact with data and generate reports quickly.

◉ Qualitative data management: Leveraging Azure OpenAI integration, Accenture and Avanade’s solution assists in drafting responses to qualitative questions, such as detailing a company’s sustainability policies, practices, and goals. This AI assistant ensures that responses are not only in keeping with reporting standards but also reflective of best practices and forward-thinking sustainability strategies.
◉ Quantitative data management: The integration of Microsoft Sustainability Manager allows for the automatic import of calculated quantitative metrics. This means that insights or recommendations provided by Accenture and Avanade’s solution are informed by the data in the user’s ESG platform. The AI assistant enables identification of trends and patterns in both qualitative and quantitative data sources, aiding in a more holistic analysis (see the illustrative sketch after this list).
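
As a rough illustration of the kind of call such an AI assistant might make behind the scenes, the Python sketch below drafts a response to a qualitative CSRD question with Azure OpenAI. The endpoint, deployment name, question, and context are hypothetical placeholders, and this is a generic sketch rather than Accenture and Avanade’s actual implementation.

```python
# Minimal sketch: drafting a qualitative CSRD disclosure with Azure OpenAI.
# The endpoint, deployment name, question, and context are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

question = "Describe the company's policies for managing material water-related impacts."
context = "Internal policy excerpts and Sustainability Manager metrics would be supplied here."

response = client.chat.completions.create(
    model="gpt-4o",  # name of your Azure OpenAI deployment (placeholder)
    messages=[
        {"role": "system", "content": "You draft CSRD-aligned qualitative disclosures."},
        {"role": "user", "content": f"Question: {question}\n\nContext: {context}"},
    ],
)

print(response.choices[0].message.content)
```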

An enhanced collaborative workflow


One of the biggest pain points that companies have related to ESG reporting is managing an approval workflow with multiple process and approval steps. It’s important to have a trail of accountability, which, depending on organization size, can span several departments and users. Accenture and Avanade’s solution enables organizations to assign responsibility and accountability, thereby streamlining the preparation of the report and its associated approval processes.

Importantly, team members can work on the same sections of the report simultaneously and the platform maintains comprehensive audit trails of all changes made to the report. This transparency is vital for accountability, enabling team leaders to monitor progress and ensure that all contributions align. 


Source: microsoft.com

Saturday 22 June 2024

6 findings from IoT Signals report: Manufacturers prepare their shop floor for AI


Manufacturers are embracing AI to deliver a new level of automation, optimization, and innovation. To unlock the full potential of AI on the shop floor, organizations are testing and investigating technologies and paradigms that empower them to leverage their data more effectively.

Microsoft, in partnership with the market research firm IoT Analytics, conducted a global survey of manufacturers to gain insight into how they are seizing the AI opportunity while navigating key industry challenges. We asked manufacturers about their current priorities and future visions, their adoption of modern technologies and paradigms, and the benefits they expect from those technologies.

In this report, we share the key findings from the survey to show how manufacturing enterprises are preparing their shop floors for AI by making them secure, scalable, and automated, and how they are adopting advanced technologies such as centralized device management, software containerization at the edge, and unified industrial data operations to accelerate that process.

Six findings from manufacturers preparing their shop floor for AI


1. Scale matters the most in the era of AI


Scalability was the top concern, with 72% of survey respondents highlighting this paradigm as crucial for their factory’s future, followed by automation and serviceability. These paradigms ensure that factories can efficiently expand with demand, optimize with minimal manual decision making, and maintain high uptime through easy troubleshooting and maintenance.

What does scale look like for industrial environments? 

Manufacturers face the challenges of keeping up with the changing demands of the market, the regulations, and the competition. They also recognize the potential of AI to transform their operations, optimize their processes, and enhance their products. But they don’t have the luxury of spending months or years on deploying and scaling solutions across their plants. Manufacturers need a faster way to move, a smarter way to manage, and a more flexible way to adapt. That’s why we have introduced a new approach—the adaptive cloud approach. 

2. Cybersecurity and data management are top of mind right now 


Security risks and data handling difficulties pose serious problems, with 58% of respondents seeing cybersecurity as a severe issue and 49% seeing data management as a severe issue. These concerns are motivating customers to improve network security and ensure data is reliable and accessible for decision-making. 

What does security look like for industrial environments? 

Security and data protection are critical for the manufacturing sector, as it faces increasing regulatory standards and cyber threats. Manufacturers need to secure existing devices and, during device refreshes, choose devices that meet industry security standards, enable easier compliance with regulatory requirements, and provide protection against the latest security threats.

3. Device management is critical for security and data handling 


Device management’s value is evolving beyond updates and device health monitoring to also address security risks and data flow management. The survey data supported this trend, with 68% of respondents noting that the security monitoring aspect of device management was very or extremely important to their organization and 59% of respondents highlighting data management as the second most important aspect of device management. 

Why is centralized device management important? 

Centralized device management is vital for ensuring the performance and security of operations in a factory setting. It helps to keep devices secure and functioning optimally, which contributes to the overall efficiency and productivity of a manufacturing environment. Effective management also enables better oversight and control over the factory processes, improving operational reliability and supporting scalability and adaptability in a dynamic industrial landscape.

4. Containerized workloads are coming to the shop floor 


The adoption of containerized software on the shop floor is rising, with 85% of survey respondents already utilizing this technology. This shift towards containerization at the edge signifies a move to improve operational efficiency, system stability, and security. 55% of respondents indicated that containerized software could significantly or extremely mitigate reliability and uptime challenges, while 53% indicated it could do the same for cybersecurity challenges.

What is containerized software? 

Software containerization enables consistent and repeatable development and deployment of solutions across different environments, in the cloud and in the factory. Containerization of OT software is essential for the AI-powered factory of the future, as it enables seamless technology deployment in scalable, serviceable, and automated factories. Kubernetes automates the scaling and management of containerized applications, saving time and resources for manufacturers.
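
As a small, generic illustration of container orchestration on the shop floor, the sketch below uses the official Kubernetes Python client to scale a hypothetical containerized OT workload; the deployment name and namespace are assumptions made for this example, not details from the report.

```python
# Illustrative sketch: scaling a containerized OT workload with the Kubernetes
# Python client. The deployment name and namespace are hypothetical.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running on the cluster
apps = client.AppsV1Api()

# Scale the (hypothetical) "edge-inference" deployment to three replicas,
# for example ahead of a shift that needs more vision-inspection capacity.
apps.patch_namespaced_deployment_scale(
    name="edge-inference",
    namespace="shop-floor",
    body={"spec": {"replicas": 3}},
)

# List the pods in the namespace to confirm the rollout.
for pod in client.CoreV1Api().list_namespaced_pod("shop-floor").items:
    print(pod.metadata.name, pod.status.phase)
```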

5. Industrial data operations optimize OT data management


Companies want to combine information technology (IT) and operational technology (OT) systems for context-driven decision making; 52% of respondents indicated that having a combined IT and OT data platform was very or extremely important for their company. Industrial data operations enhance the integration of IT and OT data by improving data flow, quality, and value, which is why 87% of companies have already adopted industrial data operations technology in some form or are planning to do so.

What are industrial data operations? 

Industrial data operations delivers data in a reliable, real-time manner for optimizing factories and plants. Industrial data operations manages and unifies data from various sources, facilitates seamless integration of information, and ensures data is accessible and usable for decision-making purposes. Industrial data operations helps break down data silos and improve predictive insights through an exchange and integration between shop floor and cloud environments.
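
To make the shop-floor-to-cloud data flow more concrete, here is a simplified sketch using the paho-mqtt library to subscribe to raw machine telemetry, normalize it into a common schema, and republish it to a unified topic. The broker address, topic names, and field names are assumptions for illustration only.

```python
# Illustrative sketch: normalizing shop-floor telemetry and republishing it to
# a unified topic. Broker address, topics, and field names are hypothetical.
import json
import paho.mqtt.client as mqtt

RAW_TOPIC = "plant1/line3/press/raw"               # hypothetical raw machine topic
UNIFIED_TOPIC = "factory/unified/press/telemetry"  # hypothetical unified namespace


def on_message(mqtt_client, userdata, message):
    raw = json.loads(message.payload)
    # Map vendor-specific fields into a common schema before republishing.
    normalized = {
        "asset": raw.get("machine_id"),
        "temperature_c": raw.get("temp"),
        "timestamp": raw.get("ts"),
    }
    mqtt_client.publish(UNIFIED_TOPIC, json.dumps(normalized))


mqtt_client = mqtt.Client()  # paho-mqtt 1.x-style constructor; 2.x also takes a callback API version
mqtt_client.on_message = on_message
mqtt_client.connect("broker.local", 1883)  # hypothetical on-premises MQTT broker
mqtt_client.subscribe(RAW_TOPIC)
mqtt_client.loop_forever()
```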

6. Respondents are investing in underlying data architecture for AI 


According to the study, manufacturers plan to invest in AI-powered factories of the future within the next two years. On average, respondents expected their organizations to increase their investments in software for orchestrating edge AI by 11%. This investment shows that they recognize the need to overcome technical and skill gaps to fully exploit AI’s capabilities in future manufacturing processes. 

How to invest in underlying architecture for AI? 

Microsoft recommends adopting advanced technology frameworks such as centralized device management, software containerization at the edge, and unified industrial data operations to accelerate industrial transformation and prepare for AI. Azure’s adaptive cloud approach embraces all three advanced technology frameworks. 

Accelerate industrial transformation in manufacturing


The report offers a comprehensive survey of manufacturers’ priorities, challenges, and plans for adopting new technologies such as these in their factories to prepare for AI. It shows that manufacturers are looking for solutions that can help them secure, scale, and automate. Microsoft Azure is responding to these needs with its adaptive cloud approach, which offers a flexible and scalable platform for managing devices, applications, and integrated data across the edge and the cloud.

Source: microsoft.com

Thursday 20 June 2024

Improve cloud performance and reliability with a guided learning plan


Businesses have committed to the cloud for its scalability, agility, and security. As customers continue to deepen their investments in cloud and AI services and environments become more complex, the need for proper cloud management increases. Continuous improvement and careful management through all phases of your cloud journey helps avoid unexpected costs and inefficient resource allocation while improving security and reliability. Strategic optimization delivers the resiliency to efficiently and securely handle fluctuating workloads with ease, ensuring you manage your environment for optimal performance.

For cloud professionals looking to systematically upskill and validate their expertise, we’ve created a powerful learning resource called Plans on Microsoft Learn. These customized learning journeys provide a guided, structured approach to mastering specific technical domains and roles with specific learning objectives and milestones. Our official plan, “Improve Reliability, Security and Performance on Azure”, provides learning modules and resources on tools and best practices from Microsoft that can help your business elevate reliability, security, and performance of your cloud and AI investments.

What are Plans on Microsoft Learn?


Plans on Microsoft Learn are hand-crafted curricula that bundle together related Learn modules, learning paths, and certifications into milestones with a logical, end-to-end educational experience. Track progress and percent-completion of each milestone as you work through the plan. Each plan is meticulously designed by Microsoft technical and learning experts to build comprehensive skills for a particular job role or competency area.

Plans offer an efficient, curated approach for learners to navigate Microsoft’s extensive training library. Rather than having to sift through the catalog of individual resources, plans lay out an optimal sequence tailored to each topic and learning objective. They start with fundamentals and progressively advance to more specialized subjects through thoughtful progression.

Beyond just compiling content, plans incorporate hands-on activities, knowledge checks, certifications, and other engagement tools to reinforce practical skills. Up-to-date Microsoft Azure technical content is seamlessly woven in, allowing learners to receive the latest cloud best practices.

What are the benefits of Plans on Microsoft Learn?


While the self-guided flexibility of Microsoft Learn is incredibly empowering, following an official plan yields some distinct benefits:

  • Comprehensiveness. Plans provide complete coverage of all the concepts and skills required to truly master a domain, leaving no gaps.
  • Efficiency. The resources within plans are carefully curated, allowing learners to laser-focus their efforts on just what’s needed.
  • Structure. Clear start-to-finish learning paths prevent knowledge fragmentation and facilitate efficiently building specialized skillsets.
  • Hands-on. Built-in coding, labs, and other interactive components solidify skills through applied practice.
  • Validated expertise. Plans can incorporate certifications to formally validate and prove proficiency.
  • Latest skills. Leveraging Microsoft’s deep technical expertise, plans rapidly integrate the latest cloud service updates and best practices.

What will I learn in the “Improve Reliability, Security and Performance on Azure” Plan?


Among our official Learn Plans, the “Improve Reliability, Security, and Performance on Azure” Plan stands out for its immense business value and career impact potential. With cloud spend efficiency becoming an ever-growing priority, these optimization skills are invaluable to organizations seeking to maximize their ROI from Azure. This comprehensive curriculum is designed to equip learners with deep skills for confidently designing, implementing, and managing cost-optimized Azure architectures at scale.

The Plan kicks off with fundamental cloud concepts like subscription management and organizational structure. It covers core Azure services like virtual machines, storage, databases, and networking through an optimization lens.

Learners then progress to more advanced cost optimization strategies such as reservation model pricing, Microsoft Azure Hybrid Benefit, and the Microsoft Azure Consumption Commitment. Monitoring, analytics, and automation techniques are explored for proactively identifying inefficiencies and waste.

Throughout, learners get hands-on practice with cost management and optimization tools like Azure Advisor, Azure Pricing Calculator, and Cost Management + Billing. Real-world design scenarios challenge them to apply optimization best practices end-to-end.

For those seeking validation, this Skilling Plan aligns perfectly with the AZ-305: Designing Microsoft Azure Infrastructure Solutions expert-level certification exam.

Upon completing the Plan, learners will have cultivated a comprehensive, job-ready skillset for designing and implementing cost-optimized, high-scale Azure architectures. This turbo-charges their impact across roles like cloud architects, solution engineers, cloud administrators and more.

Who should engage with this Plan?


This Azure Skilling Plan is designed for a broad audience, including:

  • Cloud architects and engineers. Gain the skills to design and implement optimized Azure solutions from the ground up.
  • Developers. Learn how to build applications that are inherently cost-efficient and performant.
  • IT pros. Understand how to manage and optimize your existing Azure resources.
  • Anyone with a passion for the cloud. Whether you’re new to Azure or an experienced pro, this plan offers valuable insights and practical skills to level up your cloud game.

Source: microsoft.com

Tuesday 18 June 2024

Get the best value in your cloud journey with Azure pricing offers and resources


Cloud computing continues to transform the way businesses operate, innovate, and compete. And whether you’re just moving to the cloud or already have an established cloud footprint, you may have questions about how to pay for the services you need, estimate your costs, or optimize your spending. To help answer these questions, Azure provides a variety of resources and offers to help you get the best value at every stage of your cloud journey. 


This blog post will show you how to approach and think about pricing throughout your cloud adoption journey. We will also give an example of how a hypothetical digital media company would approach their Azure pricing needs as they transition from evaluating and planning to setting up and running their cloud solutions. After reading this post, you will know more about how to select the best Azure pricing option for your business objectives and cloud needs.

Find guidance and resources to navigate Azure pricing options

If you are new to Azure or cloud computing in general, you may want to learn the basics of how cloud services are priced and what options you have for paying for them. Azure offers a variety of pricing options to suit different needs and scenarios, from the free tier and pay-as-you-go to commitment-based offers and benefits. Here’s a brief overview of these options:

Free tier: You can get started with Azure for free, and access over 25 services for 12 months, plus $200 credit to use in your first 30 days. You can also use some services for free, such as Azure App Service, Azure Functions, and Azure DevOps, with certain limits and conditions. The free tier is a great way to explore Azure and learn how it works, without any upfront costs or commitments. 

Pay-as-you-go: You pay only for the services you use or consume, based on measured usage and the unit prices of each service. For example, you can pay for the number of virtual machine (VM) hours, the amount of storage space, or the volume of data transferred. Pay-as-you-go is a flexible and scalable option that lets you adjust your usage and costs according to your changing needs and demands.
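
To make the pay-as-you-go model concrete, here is a tiny back-of-the-envelope estimate in Python. All unit prices are hypothetical placeholders rather than actual Azure rates, which vary by region, service tier, and configuration.

```python
# Back-of-the-envelope pay-as-you-go estimate. All unit prices below are
# hypothetical placeholders, not actual Azure rates.
vm_hours = 730            # one VM running for a full month
vm_price_per_hour = 0.10  # hypothetical $/hour

storage_gb = 500
storage_price_per_gb = 0.02  # hypothetical $/GB-month

egress_gb = 100
egress_price_per_gb = 0.08   # hypothetical $/GB

monthly_cost = (
    vm_hours * vm_price_per_hour
    + storage_gb * storage_price_per_gb
    + egress_gb * egress_price_per_gb
)
print(f"Estimated monthly cost: ${monthly_cost:.2f}")  # $91.00 with these placeholder rates
```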

Estimate Azure project costs 

If you have a new project to migrate to or build in Azure, you need an accurate and realistic estimate of your project costs to make an informed decision about moving forward. To help with this decision, Azure provides several tools and resources, such as: 

TCO calculator: You can use the Total Cost of Ownership (TCO) calculator to estimate how much you can save by migrating your on-premises workloads to Azure. You can input your current infrastructure details, such as servers, storage, and network, and see a detailed comparison of the costs of running them on-premises versus on Azure. 

Azure Migrate: You can use Azure Migrate to assess and plan your migration to Azure. You can discover and evaluate your on-premises servers, databases, and applications, and get recommendations on the best Azure services and sizing options for them. You can also get estimated costs and savings for your migration scenario and track your progress and readiness. 

Azure Architecture Center: You can get guidance for architecting solutions on Azure using established patterns and practices such as OpenAI Chatbots, Windows VM Deployment, and Analytics end-to-end with Azure Synapse, with cost factors included.

Calculate costs of Azure products and services 

If you are ready to deploy specific Azure services and you want to budget for them, you may want to consider the different pricing options and offers that are available for each service. Azure provides resources and guidance on how to budget for specific Azure services, such as: 

Azure pricing calculator: Estimate your monthly costs based on your expected usage and configuration such as region or virtual machine series. 

Product pricing details pages: Find detailed pricing information for each Azure service on its pricing details page. You can see the pricing model, the unit prices, the service tiers, and the regional availability.

Azure savings plan for compute: An easy and flexible way to save up to 65% on select compute services, compared to pay-as-you-go prices. The savings plan unlocks lower prices on compute services when you commit to spend a fixed hourly amount for one or three years. You choose whether to pay all upfront or monthly at no extra cost. 

Azure reservations: Reserve Azure resources, such as VMs, SQL Database, or Cosmos DB, for one or three years and save up to 72% on your cloud costs. Improve budgeting and forecasting with a single upfront payment that makes it easy to calculate your investments. Or lower your upfront cash outflow with a monthly payment option at no additional cost. 

Azure Hybrid Benefit: Apply your existing Windows Server and SQL Server licenses with active Software Assurance or subscriptions to Azure Hybrid Benefit to achieve cost savings. Save up to 85% compared to standard pay-as-you-go rates and achieve the lowest cost of ownership when you combine Azure Hybrid Benefit, reservation savings, and Extended Security Updates. You can also apply your active Linux subscription to Azure Hybrid Benefit.

Manage and optimize your Azure investments 

If you are already using Azure and you want to optimize your spend for your current Azure workloads, you may want to review your usage and costs, and look for ways to enhance your investments. Azure provides several tools and resources to help you with this process, such as: 

Microsoft Cost Management: You can use Microsoft Cost Management with Copilot to monitor and analyze your Azure spending, and to create and manage budgets and alerts. You can see your current and forecasted costs, your cost trends and anomalies, and your cost breakdown by service, resource group, or subscription. You can also get recommendations on how to optimize your costs. 

Azure Advisor: You can use Azure Advisor to get personalized and actionable recommendations on how to improve the performance, security, reliability, and cost-effectiveness of your Azure resources. You can see the potential savings and benefits of each recommendation and apply them with a few clicks. 

FinOps on Azure: You can leverage FinOps best practices on Azure to empower your organization by fostering a culture of data-driven decision-making, accountability, and cross-team collaboration. This approach will help you maximize investments and accelerate business growth through improved organizational alignment.

An example of a company’s cloud journey and pricing needs 

To illustrate how a customer can choose the best pricing option and resources for their cloud journey, let’s look at an example. Contoso, a hypothetical digital media company, wants to migrate their infrastructure and build a new OpenAI Chatbot application in Azure. Here’s how they would think about their Azure pricing needs at each stage of their journey: 

Considering Azure: Contoso wants to understand how Azure pricing works. They use the free tier to try out some Azure services to test functionality. They also leverage the pay-as-you-go model to explore how some services are billed. 

Assess and plan Azure projects: Contoso needs to estimate their project costs. To compare the costs of running on-premises versus on Azure they input their on-premises server infrastructure in the TCO calculator. They also use the Azure Architecture Center to learn how to develop an OpenAI chatbot with best practices.

Deployment in Azure: Contoso is ready to migrate their environment and deploy their company’s chatbot app and wants to budget for the specific Azure services needed. They leverage the product-specific pricing pages and the pricing calculator to estimate their monthly costs based on their expected usage and configuration. They purchase reservations for their stable and predictable VM and Azure database usage. They already have on-premises Windows Server licenses, so they enroll in Software Assurance to apply the Azure Hybrid Benefit to those licenses when deploying their VMs, saving on operating costs.

Post-deployment optimization in Azure: After running their environment on Azure for a few months, Contoso wants to review and optimize their workloads. They use Azure Advisor to get personalized and actionable recommendations on how to enhance their cost-effectiveness. Leveraging these recommendations, they purchase an Azure savings plan for compute for dynamic compute workloads that may change regions or scope, and they right-size their VMs.

Source: microsoft.com

Saturday 15 June 2024

Azure OpenAI Service: Transforming legal practices with generative AI solutions


In today’s fast-paced environment, the ability to efficiently manage and analyze vast amounts of data is crucial, and the field of law is no exception. In fact, a 2023 research paper claimed that of all the industries affected by new AI, the legal field was the most exposed. AI is increasingly being leveraged to meet this need, enhancing the capabilities of legal professionals and improving outcomes for clients. A recent Thomson Reuters survey reported that 82% of law firm lawyers said they believe that ChatGPT and generative AI could be readily applied to legal work. Another research report, by economists at Goldman Sachs, estimated that 44% of legal work could be automated.

Over the past several years, the data landscape has exploded, presenting legal teams with the challenge of managing not only increasing volumes of data but also a variety of new data types. Traditional sources like emails and documents are now accompanied by more complex sources such as collaboration platforms, chat data, text messages, and video recordings. Given the potential relevance of all this information in legal matters, modern legal teams must be prepared to identify, collect, and analyze vast amounts of data—often under tight deadlines. A number of law firms and legal service providers are using AI to streamline processes, reduce risks, and increase efficiency. Notably, companies like Icertis, Relativity, and Clifford Chance are pioneering the integration of AI into their workflows, demonstrating the transformative power of Azure AI Services in the legal field.

Key benefits and applications at work


The following AI applications can help teams throughout the legal field manage contracts more efficiently, reduce risk, ensure compliance, and drive better legal business:

  • Enhanced document review: Uses natural language processing to analyze documents, providing relevant insights for legal cases.
  • Accelerated e-discovery: Quickly identifies, collects, and analyzes large volumes of data from various sources.
  • Improved efficiency: Reduces the time and resources needed for document review. 
  • Identification of key information: Uncovers critical terms and conditions buried within documents.
  • Risk management: Assists legal users in identifying problematic terms and ensuring compliance.
  • Cognitive translation: Implements AI-driven translation to improve communication across languages.
  • Accessible contracts: Natural language processing capabilities help users navigate and understand complex legal language.
  • Enhanced decision-making: Provides insights for more informed strategic decisions.

Below we look at three companies who have adopted Azure OpenAI Service to support their legal practices, illustrating the profound impact and potential of these technologies in reshaping the industry.

◉ Revolutionizing contract life cycle management with generative AI and Azure

Contracts can be crucial in business, but managing them often remains disjointed across departments, and may lead to inefficiencies and risks. Icertis, used by 30% of Fortune 100 companies, has partnered with Microsoft to enhance contract management using AI. Their platform, Icertis Contract Intelligence (ICI), now incorporates generative AI through ICI Copilots, which streamlines contract reviews and uncovers hidden terms, thereby reducing risks and increasing efficiency. Icertis natively interoperates with Microsoft 365, Dynamics 365, and Azure AI to empower users to create, view, manage, and collaborate on contracts in the tools and applications they use every day. This collaboration helps customers achieve considerable time savings and better risk management. The partnership emphasizes continuous innovation and customer value, enhancing contract management capabilities and solidifying both companies’ market positions.

◉ Relativity and Microsoft partner to deliver generative AI to litigators

In the realm of e-discovery, efficiently organizing and analyzing vast amounts of unstructured data is critical. Relativity, partnering with Microsoft, provides AI-powered solutions to address this challenge. Relativity leverages Microsoft Azure and GPT-4 to enhance document review processes. They developed Relativity aiR for Review on top of Azure OpenAI Service to deliver a streamlined experience directly in RelativityOne. Relativity aiR for Review uses natural language processing to analyze documents and assists with legal reviews and investigations. This partnership promotes security, interoperability, and global reach—enabling legal teams to manage growing data volumes and diverse data types effectively.

The collaboration focuses on responsible AI, aiming to deliver innovative, secure, and efficient tools for legal professionals. Relativity and Microsoft’s joint efforts aim to continue advancing AI capabilities in e-discovery and offer tools to assist with complex data challenges.

◉ Clifford Chance trailblazes Microsoft responsible AI to improve services for its legal clients

Clifford Chance, a leading British multinational law firm, leveraged advanced technologies like Azure, Azure OpenAI Service, and Microsoft Copilot for Microsoft 365 to enhance their legal services. Early implementations, such as cognitive translation, have quickly become some of their fastest-growing products, significantly improving their ability to handle sensitive, text-based information efficiently. Additionally, the firm benefits from intelligent recap and enhanced data security features through Microsoft Teams Premium, resulting in substantial cost savings and improved protection of client data. By integrating AI-powered solutions, Clifford Chance aims to free up their professionals’ time for strategic tasks and provide innovative, efficient services to their clients. The firm is excited about the potential of large language models (LLMs) and other AI tools to further transform their operations, making them more responsive and effective in a highly competitive and time-pressured environment.

Azure OpenAI Service: Impact


The integration of Azure AI Services is assisting legal professionals in managing data, conducting reviews, and offering services. Companies like Icertis, Relativity, and Clifford Chance are leveraging AI to boost operational efficiency and foster a more innovative and responsive legal system. As AI technologies continue to advance, their impact on the legal industry will grow, driving greater efficiencies and yielding better outcomes for all stakeholders.

Our commitment to responsible AI


Organizations across industries are leveraging Azure OpenAI Service and copilot services and capabilities to drive growth, increase productivity, and create value-added experiences. From advancing medical breakthroughs to streamlining manufacturing operations, our customers trust that their data is protected by robust privacy protections and data governance practices. As our customers continue to expand their use of our AI solutions, they can be confident that their valuable data is safeguarded by industry-leading data governance and privacy practices in the most trusted cloud on the market today.  

At Microsoft, we have a long-standing practice of protecting our customers’ information. Our approach to responsible AI is built on a foundation of privacy, and we remain dedicated to upholding core values of privacy, security, and safety in all our generative AI products and solutions.

Source: microsoft.com

Thursday 13 June 2024

Unlock new potential for your SAP workloads on Azure with these learning paths


SAP, a leading provider of enterprise resource planning (ERP) software, is a mission-critical consideration for companies migrating to Azure. Microsoft and SAP have a strong partnership centered on helping customers successfully migrate their SAP workloads to Azure, offering solutions that can be seamlessly integrated with Azure’s cloud infrastructure:

  • Accelerated cloud savings: As the leading cloud provider for SAP workloads, we manage your infrastructure so you can streamline your cloud spending.
  • AI intelligence built-in: Harness the power of AI-powered insights to make data-driven decisions that drive your business forward. 
  • Boost productivity and innovation: Integrated apps streamline your team’s workflow and automate repetitive business processes.
  • Enhanced protection: Our multi-layered cloud security ensures your SAP workloads run smoothly, backed by integrated Azure recovery services.

Together, SAP and Microsoft are not just providing tools and services but are also creating ecosystems that foster innovation and transformation, enabling businesses to stay competitive in a rapidly evolving digital landscape. To help you and your team better take advantage of these benefits, we’ve created an array of learning materials and interactive events—from self-guided courses to Virtual Training Days, certifications to conferences—that build your cloud expertise.

Chart your course to expertise with personalized Learning Paths 


Our Microsoft Learn Learning Paths are curated collections of free, online modules and resources designed to help you build specific skills or gain knowledge in a particular technology or subject area. We have a full range of paths related to migrating SAP workloads, including:

Explore Azure for SAP Workloads: Streamline your SAP operations and maximize ROI with our comprehensive Azure training. Empower your team to seamlessly migrate, manage, and optimize SAP workloads on Azure, leveraging its robust infrastructure and specialized tools. This comprehensive training will enhance your SAP performance, drive efficiency, and unlock innovation within your existing environment. 

Deploy SAP on Azure: To boost your SAP efficiency and reliability, your team will master the deployment of SAP solutions on Azure, including single-instance SAP HANA and SAP NetWeaver high availability configurations. This expertise will streamline migrations, optimize performance, and ensure smooth operations, ultimately reducing costs and maximizing the value of your SAP investment. 

Explore the foundations of IaaS for SAP on Azure: Unlock the power of Azure’s Infrastructure as a Service (IaaS) for your SAP workloads. Our comprehensive training equips your team with the expertise to design and deploy scalable, high-performing SAP solutions on Azure. By mastering essential Azure resources like virtual machines, virtual networks, storage accounts, and Microsoft Entra ID, you’ll optimize your infrastructure for efficiency and resilience, ultimately driving business growth. 

Explore SAP HANA on Azure (Large Instances): Optimize your SAP HANA investment as this program equips your team with the expertise to architect, deploy, and manage SAP HANA on Azure (Large Instances) effectively. By mastering key concepts like high availability, disaster recovery, security, and cost optimization, your business can ensure a resilient, secure, and cost-effective SAP HANA environment in the cloud. 

Plan to deploy the Microsoft Sentinel solution for SAP: This standalone learning module provides an overview of Microsoft Sentinel’s comprehensive security solution for SAP environments. It outlines the roles and responsibilities of the team members involved in designing, deploying, and implementing this solution.

Improve SAP seller productivity with Microsoft Teams and Power Platform: Streamline your sales and service processes with our SAP and Microsoft 365 integration training. Your team will master integrating Teams with SAP Sales and Service Core, enabling seamless collaboration. Additionally, they’ll learn to automate tasks using Power Automate, enhancing productivity and efficiency for your SAP sellers. 

Planning and deploying SAP on Azure: Unlock the power of Azure to optimize your SAP operations. Our comprehensive course teaches you to efficiently deploy, configure, and manage SAP workloads in the cloud. Gain expertise in migrating and operating SAP solutions, mastering essential Azure resources like VMs, storage, and networking. Enhance scalability, performance, and security while reducing costs.

Migrate and Modernize SAP in the Microsoft Cloud: Accelerate your SAP deployment on Azure with our one-day course for experienced SAP professionals. Gain hands-on expertise in setting up Azure infrastructure specifically for SAP, including virtual machines, networking, storage, and identity management. Learn best practices for backup, disaster recovery, and monitoring to ensure optimal performance and reliability of your SAP environment in the cloud. 

Learn from the pros with live, interactive Virtual Training Days 


Virtual Training Days are instructor-led classes designed to equip individuals and teams with in-demand skills related to cloud migration, AI, and other cutting-edge technologies. We offer Virtual Training Days to help you migrate SAP to Azure, optimizing your performance, reliability, and scalability while reducing costs. In this session, Migrate and Modernize SAP on the Microsoft Cloud, you’ll find out how to secure and monitor SAP workloads on Azure. Come explore how this move enhances productivity, fosters secure collaboration, and gives you AI-powered insights for greater efficiency. 

Showcase your accomplishments with Certifications 


Microsoft Certifications are the official badges that prove you’ve got the skills, whether it’s Azure, Microsoft 365, or other technologies. Getting certified isn’t just a pat on the back; it’s a ticket to career opportunities and industry recognition. Plus, it keeps you on top of the latest tech trends. So, if you want your IT game strong, Microsoft Certifications are the way to go.

Planning and Administering Microsoft Azure for SAP Workloads: To qualify for this certification, you need in-depth knowledge of SAP systems and industry standards for migration, integration, and long-term operation on Azure. Professionals in this career collaborate with cloud administrators, database administrators, and clients to implement solutions, recommending services and adjusting resources for optimal performance and resilience.

Connect and learn: Network with industry leaders at the SAP Sapphire conference 


SAP Sapphire 2024 was a series of global events that brought together SAP experts, partners, and customers to explore the latest innovations and strategies in business technology. With a focus on empowering businesses to turn their visions into reality, SAP Sapphire blended in-person and virtual experiences to share valuable insights, practical knowledge, and networking opportunities to help you drive digital transformation and achieve your business goals. 

Azure experts were on hand to share their expertise and discuss developments related to the cloud and AI. Both sessions are available on-demand for viewing:

Unlock innovation with AI on the Microsoft Cloud: We discussed how integrating AI with Azure and SAP solutions can help your organization learn valuable insights, automate processes, and optimize efficiency for your most critical business operations. This session showed participants how to make AI a reality for their organizations, boosting productivity and creating new opportunities for growth. 

Accenture’s RISE with SAP journey with Microsoft and generative AI: Discover how Accenture, a global IT services and consulting firm, successfully migrated to the RISE with SAP solution on Microsoft Cloud, leveraging Microsoft AI solutions for enhanced innovation and efficiency. This collaborative effort among Accenture, Microsoft, and SAP resulted in a transformed enterprise system, delivering greater insights, agility, and operational efficiency.

Source: microsoft.com

Tuesday 11 June 2024

Azure Databricks: Differentiated synergy


Companies have long collected data from various sources, leading to the development of data lakes for storing data at scale. However, data lakes lacked critical features such as data quality. The lakehouse architecture emerged to address the limitations of data warehouses and data lakes, and it has become a robust framework for enterprise data infrastructure, with Delta Lake gaining popularity as the storage layer. Databricks, a pioneer of the data lakehouse, which is an integral component of its Data Intelligence Platform, is available as a fully managed, first-party data and AI solution on Microsoft Azure as Azure Databricks, making Azure the optimal cloud for running Databricks workloads. This blog post discusses the key advantages of Azure Databricks in detail:

1. Seamless integration with Azure.
2. Regional availability and performance.
3. Security and compliance.
4. Unique partnership: Microsoft and Databricks.

1. Seamless integration with Azure 


Azure Databricks is a first-party service on Microsoft Azure, offering native integration with vital Azure services and workloads and allowing for rapid onboarding onto a Databricks workspace with just a few clicks.

Native integration—as a first party service 


◉ Microsoft Entra ID (formerly Azure Active Directory): Azure Databricks integrates with Microsoft Entra ID, enabling managed access control and authentication effortlessly. Engineering teams at Microsoft and Databricks have jointly built this integration natively into Azure Databricks, so customers don’t have to build it on their own.

◉ Azure Data Lake Storage (ADLS Gen2): Databricks can directly read and write data in ADLS Gen2, which has been collaboratively optimized for the fastest possible data access, enabling efficient data processing and analytics. The integration of Azure Databricks with Azure storage platforms such as Data Lake Storage and Blob Storage provides a more streamlined experience for data workloads (see the brief sketch after this list).

◉ Azure Monitor and Log Analytics: Azure Databricks clusters and jobs can be monitored using Azure Monitor, with insights gained through Log Analytics.

◉ Databricks extension to VS code: The Databricks extension for Visual Studio Code is specifically designed to work with Azure Databricks, providing a direct connection between the local development environment and Azure Databricks workspace.
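
As a brief illustration of the ADLS Gen2 integration described above, the PySpark sketch below reads and writes Delta data directly at abfss:// paths from an Azure Databricks notebook. The storage account, containers, and column names are hypothetical placeholders, and access is assumed to be configured through Unity Catalog external locations or cluster credentials.

```python
# Illustrative PySpark sketch for an Azure Databricks notebook, where `spark`
# is the preconfigured SparkSession. Storage account, containers, and columns
# are hypothetical placeholders.
path = "abfss://bronze@contosodatalake.dfs.core.windows.net/iot/telemetry"

# Read a Delta table directly from ADLS Gen2.
telemetry = spark.read.format("delta").load(path)

# Compute a simple aggregate per device and day.
daily = telemetry.groupBy("device_id", "event_date").avg("temperature")

# Write the curated result back to a different ADLS Gen2 container as Delta.
daily.write.format("delta").mode("overwrite").save(
    "abfss://silver@contosodatalake.dfs.core.windows.net/iot/daily_temperature"
)
```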

Integrated services that deliver value 


◉ Power BI: Power BI is a business analytics service that provides interactive visualizations with self-service business intelligence capabilities. Using Azure Databricks as a data source with Power BI brings the advantages of Azure Databricks performance and technology beyond data scientists and data engineers to all business users. Power BI Desktop can be connected to Azure Databricks clusters and Databricks SQL warehouses. Power BI’s strong enterprise semantic modeling and calculation capabilities allow defining calculations, hierarchies, and other business logic that’s meaningful to customers, and orchestrating the data flows into the model with the Azure Databricks lakehouse. It is possible to publish Power BI reports to the Power BI service and enable users to access the underlying Azure Databricks data using single sign-on (SSO), passing along the same Microsoft Entra ID credentials they use to access the report. With a Power BI Premium license, it is also possible to publish directly from Azure Databricks, allowing you to create Power BI datasets from tables and schemas in Unity Catalog directly from the Azure Databricks UI. Direct Lake mode is a unique feature, currently available with Power BI Premium and Microsoft Fabric F SKU (Fabric capacity) capacities, that works with Azure Databricks. It allows for the analysis of very large data volumes by loading parquet-formatted files directly from a data lake. This feature is particularly useful for analyzing very large models with less delay and models with frequent updates at the data source.

◉ Azure Data Factory (ADF): ADF provides the capability to natively ingest data to the Azure cloud from over 100 different data sources. It also provides graphical data orchestration and monitoring capabilities that are easy to build, configure, deploy, and monitor in production. ADF has native integration with Azure Databricks via the Azure Databricks linked service and can execute notebook, Java archive (JAR), and Python code activities, which enables organizations to build scalable data orchestration pipelines that ingest data from various sources and curate that data in the lakehouse.

◉ Azure OpenAI: Azure Databricks includes built-in tools to support ML workflows, including AI Functions, built-in Databricks SQL functions that allow you to access large language models (LLMs) directly from SQL (see the sketch after this list). With this capability, customers can quickly experiment with LLMs on their company’s data from within a familiar SQL interface. Once the correct LLM prompt has been developed, it can quickly be turned into a production pipeline using existing Databricks tools such as Delta Live Tables or scheduled jobs.

◉ Microsoft Purview: Microsoft Purview, Azure’s data governance solution, integrates with Azure Databricks Unity Catalog’s catalog, lineage, and policy application programming interfaces (APIs). This allows discovery and request-for-access within Microsoft Purview, while keeping Unity Catalog as the operational catalog on Azure Databricks. Microsoft Purview supports metadata sync with Azure Databricks Unity Catalog, covering metastore catalogs, schemas, and tables and views along with their columns. In addition, this integration enables discovery of lakehouse data and brings its metadata into the Data Map, which allows scanning the entire Unity Catalog metastore or only selected catalogs. The integration of data governance policies in Microsoft Purview and Databricks Unity Catalog enables a single-pane experience for data and analytics governance in Microsoft Purview.
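
As a rough sketch of the AI Functions pattern mentioned above, the snippet below calls ai_query from a Databricks notebook via spark.sql. The serving endpoint name, Unity Catalog table, and column are hypothetical, and the exact ai_query signature and availability can vary by Databricks runtime and workspace configuration, so treat this as illustrative rather than definitive.

```python
# Illustrative sketch of Databricks AI Functions from a notebook, where `spark`
# is the preconfigured SparkSession. The endpoint, table, and column names are
# hypothetical; verify the ai_query() signature against your workspace version.
summaries = spark.sql("""
    SELECT
      review_id,
      ai_query(
        'my-azure-openai-endpoint',          -- hypothetical model serving endpoint
        CONCAT('Summarize this customer review in one sentence: ', review_text)
      ) AS summary
    FROM main.reviews.customer_reviews       -- hypothetical Unity Catalog table
    LIMIT 10
""")
summaries.show(truncate=False)
```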

Best of both worlds with Azure Databricks and Microsoft Fabric 


Microsoft Fabric is a unified analytics platform that includes all the data and analytics tools that organizations need. It brings together experiences such as Data Engineering, Data Factory, Data Science, Data Warehouse, Real-Time Intelligence, and Power BI onto a shared SaaS foundation, all seamlessly integrated into a single service. Microsoft Fabric comes with OneLake, an open and governed, unified SaaS data lake that serves as a single place to store organizational data. Microsoft Fabric simplifies data access by creating shortcuts in OneLake to files, folders, and tables stored in its native open Delta-Parquet format. These shortcuts allow all Microsoft Fabric engines to operate on the data without data movement or copying and with no disruption to existing usage by the host engines.

For instance, creating a shortcut to Delta Lake tables generated by Azure Databricks enables customers to effortlessly serve lakehouse data to Power BI via Direct Lake mode. Power BI Premium, as a core component of Microsoft Fabric, offers Direct Lake mode to serve data directly from OneLake without querying an Azure Databricks lakehouse or warehouse endpoint. This eliminates the need for data duplication or import into a Power BI model and enables blazing-fast performance directly over data in OneLake, as an alternative to serving Power BI via ADLS Gen2. With access to both Azure Databricks and Microsoft Fabric built on the lakehouse architecture, Microsoft Azure customers can choose to work with either one or both of these powerful, open, governed data and AI solutions to get the most from their data, unlike on other public clouds. Azure Databricks and Microsoft Fabric together can simplify organizations’ overall data journey with deeper integration in the development pipeline.

2. Regional availability and performance 


Azure provides robust scalability and performance capabilities for Azure Databricks: 

  • Azure Compute optimization for Azure Databricks: Azure offers a variety of compute options, including GPU-enabled instances, which accelerate machine learning and deep learning workloads collaboratively optimized with Databricks engineering. Azure Databricks globally spins up more than 10 million virtual machines (VMs) a day. 
  • Availability: Azure Databricks is currently available in 43 Azure regions worldwide, and that number continues to grow.

3. Security and compliance 


All of Azure’s enterprise-grade security and compliance measures apply to Azure Databricks, helping it meet customer requirements:

  • Azure Security Center: Azure Security Center provides monitoring and protection of the Azure Databricks environment against threats. It automatically collects, analyzes, and integrates log data from a variety of Azure resources and presents a list of prioritized security alerts, along with the information needed to quickly investigate the problem and recommendations on how to remediate an attack. Azure Databricks also provides encryption features for additional control of data.
  • Azure Compliance Certifications: Azure holds industry-leading compliance certifications, ensuring Azure Databricks workloads meet regulatory standards. Azure Databricks is certified under PCI-DSS (Classic) and HIPAA (Databricks SQL Serverless, Model Serving).
  • Azure confidential computing (ACC) is only available on Azure. Using Azure confidential computing with Azure Databricks allows end-to-end data encryption. Azure offers hardware-based Trusted Execution Environments (TEEs), which provide a higher level of security by encrypting data in use, in addition to AMD-based Azure confidential virtual machines (VMs), which provide full VM encryption while minimizing performance impact.
  • Encryption: Azure Databricks supports customer-managed keys from Azure Key Vault and Azure Key Vault Managed HSM (Hardware Security Modules) natively. This feature provides an additional layer of security and control over encrypted data.

4. Unique partnership: Databricks and Microsoft


One of the standout attributes of Azure Databricks is the unique partnership between Databricks and Microsoft. Here’s why it’s special: 

  • Joint engineering: Databricks and Microsoft collaborate on product development, ensuring tight integration and optimized performance. This includes dedicated Microsoft resources in engineering for developing Azure Databricks resource providers, workspace, and Azure Infra integrations, as well as customer support escalation management in addition to growing engineering investments for Azure Databricks. 
  • Service operation and support: As a first-party offering, Azure Databricks is exclusively available in the Azure portal, simplifying deployment and management for customers. Azure Databricks is managed by Microsoft, with support coverage under Microsoft support contracts and subject to the same SLAs and security policies as other Azure services, ensuring quick resolution of support tickets in collaboration with Databricks support teams as needed.
  • Unified billing: Azure provides a unified billing experience, allowing customers to manage Azure Databricks costs transparently alongside other Azure services. 
  • Go-to-market and marketing: Co-marketing, GTM collaboration, and co-sell activities between the two organizations, including events, funding programs, marketing campaigns, joint customer testimonials, account planning, and much more, provide elevated customer care and support throughout the data journey.
  • Commercial: Large strategic enterprises generally prefer dealing directly with Microsoft for sales offers, technical support, and partner enablement for Azure Databricks. In addition to Databricks sales teams, Microsoft has a global footprint of dedicated sales, business development, and planning coverage for Azure Databricks, meeting the unique needs of all customers.

Let Azure Databricks help boost your productivity


Choosing the right data analytics platform is crucial. Azure Databricks, a powerful data analytics and AI platform, offers a well-integrated, managed, and secure environment for data professionals, resulting in increased productivity, cost savings, and ROI. With Azure's global presence, workload integration, security, compliance, and a unique partnership with Microsoft, Azure Databricks is a compelling choice for organizations seeking efficiency, innovation, and intelligence from their data estate. 

Source: microsoft.com

Saturday 8 June 2024

Raise the bar on AI-powered app development with Azure Database for PostgreSQL

Raise the bar on AI-powered app development with Azure Database for PostgreSQL

Known for its reliability and versatility, PostgreSQL is a popular and powerful open-source database system with a wide array of features. Microsoft Azure Database for PostgreSQL brings the might of PostgreSQL to the cloud, with all the scalability and convenience you expect. This fully managed service takes the hassle out of managing your PostgreSQL instances, allowing you to focus on what really matters: building amazing, AI-powered applications.

To better get you acquainted with how Azure Database for PostgreSQL empowers users to migrate their PostgreSQL databases and build intelligent apps, this blog will introduce a roster of new learning paths and events, including a pair of Cloud Skills Challenges. As if that’s not exciting enough, completing one of the challenges automatically enters you in a drawing for a great prize. So, let’s get going! 

Seamless database migration and app creation


Say goodbye to tedious maintenance tasks and hello to seamless deployments, automated patching, and built-in high availability. Azure Database for PostgreSQL is a fully managed service that simplifies the migration of existing PostgreSQL databases to the cloud. We handle the burdens of patching, backups, and scaling—allowing you to focus on your applications. 

Seamless compatibility with PostgreSQL minimizes code changes during the transition and caters to diverse needs and budgets. With migration tooling in Azure Database for PostgreSQL, transferring data and schemas to the cloud becomes a breeze. 

Beyond migration, Azure Database for PostgreSQL empowers the development of AI-powered applications. Its native support for the pgvector extension allows for efficient storage and querying of vector embeddings, essential for AI and machine learning tasks. The service seamlessly integrates with other Azure AI services, such as Azure Machine Learning, Azure OpenAI Service, Microsoft Azure AI Language, and Microsoft Azure AI Translator, providing developers with a rich toolkit for building intelligent applications.
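
To make this concrete, here is a minimal sketch of storing and querying embeddings with pgvector from Python. It assumes the vector extension has been allow-listed and enabled on your flexible server, that the psycopg2 package is installed, and that the server name, credentials, table, and toy vectors below are placeholders to replace with your own.

    # Minimal sketch: store and query vector embeddings with pgvector.
    # Assumes the "vector" extension is allow-listed on the server and that
    # the connection details below are placeholders for your own.
    import psycopg2

    conn = psycopg2.connect(
        host="<server-name>.postgres.database.azure.com",  # placeholder
        dbname="demo",
        user="<user>",
        password="<password>",
        sslmode="require",
    )
    conn.autocommit = True
    cur = conn.cursor()

    cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS documents (
            id bigserial PRIMARY KEY,
            content text,
            embedding vector(3)  -- use your model's dimension, e.g. 1536
        );
    """)

    # Insert a couple of toy embeddings (real ones come from an embedding model).
    cur.execute(
        "INSERT INTO documents (content, embedding) VALUES (%s, %s), (%s, %s);",
        ("doc one", "[0.1, 0.2, 0.3]", "doc two", "[0.9, 0.8, 0.7]"),
    )

    # Nearest-neighbor search using pgvector's Euclidean distance operator <->.
    cur.execute(
        "SELECT content FROM documents ORDER BY embedding <-> %s::vector LIMIT 1;",
        ("[0.1, 0.2, 0.25]",),
    )
    print(cur.fetchone()[0])

    cur.close()
    conn.close()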

Additionally, the service’s scalability ensures optimal performance as AI workloads grow, maintaining cost efficiency throughout the development process. Overall, Azure Database for PostgreSQL provides a comprehensive solution for both migrating to the cloud and building powerful AI applications. 

Here are some key features: 

  • High availability: Up to 99.99% uptime guaranteed with zone-redundant high availability, automated maintenance, patching, and updates.
  • Performance automation: Get analysis of your database workloads to identify opportunities to improve query performance with query store and index recommendations.
  • Security: Includes Microsoft Defender for open-source relational databases to protect your data, and Azure IP Advantage, which is designed to protect businesses and developers who build on Azure from intellectual property risks.
  • Azure AI extension: Generate and store vector embeddings, call Azure AI services, and build AI-powered apps directly within the database (a short sketch follows this list).
  • Migration support: Tools to migrate Oracle Database to Azure Database for PostgreSQL are available, making the transition smoother.
  • Cost-effective: Provides operational savings—up to 62% compared with on-premises—with comprehensive database monitoring and optimization tools, which can lead to a lower total cost of ownership. 
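
As a companion to the Azure AI extension feature above, here is a hedged sketch of generating an embedding directly inside the database with the azure_ai extension. The endpoint, key, and deployment name are placeholders, and the configuration keys and function signature shown (azure_ai.set_setting, azure_openai.create_embeddings) should be checked against the extension documentation for your server version.

    # Hedged sketch: generate an embedding in-database with the azure_ai extension.
    # Endpoint, key, deployment name, and exact function signatures are
    # placeholders/assumptions -- verify against the extension docs.
    import psycopg2

    conn = psycopg2.connect(
        "host=<server> dbname=demo user=<user> password=<pw> sslmode=require"
    )
    conn.autocommit = True
    cur = conn.cursor()

    cur.execute("CREATE EXTENSION IF NOT EXISTS azure_ai;")
    cur.execute("SELECT azure_ai.set_setting('azure_openai.endpoint', %s);",
                ("https://<your-openai-resource>.openai.azure.com/",))
    cur.execute("SELECT azure_ai.set_setting('azure_openai.subscription_key', %s);",
                ("<api-key>",))

    # Ask Azure OpenAI for an embedding without the text ever leaving SQL.
    cur.execute(
        "SELECT azure_openai.create_embeddings(%s, %s);",
        ("<embedding-deployment-name>", "Azure Database for PostgreSQL with azure_ai"),
    )
    print(cur.fetchone()[0])  # the embedding, as rendered by the driver

    cur.close()
    conn.close()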

Learn at your own pace with curated lessons 


Now that you’ve gotten a primer on Azure Database for PostgreSQL, the next step is engaging with our curated learning paths. The collected modules in these two courses include readings, exercises, and knowledge checks.

◉ Build AI Apps with Azure Database for PostgreSQL

Designed for developers interested in harnessing AI within their PostgreSQL applications on Azure, this learning path explores how the Azure AI extension for Azure Database for PostgreSQL can be leveraged to incorporate AI capabilities into your apps.

By completing this learning path, you'll gain a solid understanding of the Azure AI extension and its various functionalities. You'll evaluate different summarization techniques available through Azure AI services and the azure_ai extension, explore the differences between extractive, abstractive, and query-focused summarization, and apply generative AI summarization techniques to data within a PostgreSQL database. This hands-on experience will empower you to build intelligent applications that distill complex content into concise, informative summaries. 
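
To give a flavor of what those exercises cover, the sketch below calls an abstractive summarization function from the azure_ai extension's azure_cognitive schema over rows in a hypothetical documents table. Treat the function name, its parameter order, and the Language-service setting keys as assumptions to verify against the extension documentation; the connection details and table are placeholders.

    # Hedged sketch: in-database abstractive summarization via the azure_ai
    # extension. Function name/parameters and setting keys are assumptions to
    # verify against the docs; connection details and the table are placeholders.
    import psycopg2

    conn = psycopg2.connect(
        "host=<server> dbname=demo user=<user> password=<pw> sslmode=require"
    )
    conn.autocommit = True
    cur = conn.cursor()

    # Point the extension at an Azure AI Language resource (assumed setting keys).
    cur.execute("SELECT azure_ai.set_setting('azure_cognitive.endpoint', %s);",
                ("https://<language-resource>.cognitiveservices.azure.com/",))
    cur.execute("SELECT azure_ai.set_setting('azure_cognitive.subscription_key', %s);",
                ("<language-api-key>",))

    # Summarize each document's content (assumed leading parameters: text, language).
    cur.execute("""
        SELECT id, azure_cognitive.summarize_abstractive(content, 'en')
        FROM documents
        LIMIT 5;
    """)
    for doc_id, summary in cur.fetchall():
        print(doc_id, summary)

    cur.close()
    conn.close()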

◉ Configure and migrate to Azure Database for PostgreSQL

This learning path supplies you with the essential skills needed to effectively work with Azure Database for PostgreSQL. It begins with a foundational understanding of PostgreSQL architecture and core concepts, before delving into practical aspects such as connecting to the database, executing queries, and ensuring robust security measures.

You’ll also learn how to create and manage databases, schemas, and tables, and how to leverage stored procedures and functions for code reusability. With insights into how Azure Database for PostgreSQL implements ACID transactions and write-ahead logging for data integrity and durability, you’ll gain confidence in configuring, managing, and migrating existing PostgreSQL databases to Azure.
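
As a small illustration of the ACID behavior this learning path discusses, the following sketch runs two statements as a single atomic unit of work from Python; the connection string and the accounts table are placeholders.

    # Minimal sketch: an atomic, multi-statement transaction against
    # Azure Database for PostgreSQL. Connection string and table are placeholders.
    import psycopg2

    conn = psycopg2.connect(
        "host=<server> dbname=demo user=<user> password=<pw> sslmode=require"
    )

    try:
        with conn:  # commits on success, rolls back if an exception is raised
            with conn.cursor() as cur:
                cur.execute("UPDATE accounts SET balance = balance - 100 WHERE id = 1;")
                cur.execute("UPDATE accounts SET balance = balance + 100 WHERE id = 2;")
                # If either statement fails, neither change is persisted (atomicity);
                # once the block commits, write-ahead logging provides durability.
    except psycopg2.Error as exc:
        print(f"Transaction rolled back: {exc}")
    finally:
        conn.close()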

Complete timed challenges to win Azure prizes


To go along with these learning paths, we’ve also assembled a pair of corresponding Azure Database for PostgreSQL Cloud Skills Challenges. While learning paths are usually self-paced, solitary activities, Cloud Skills Challenges are part interactive learning sprint, part good-natured tournament between you and thousands of your peers around the globe. They’re immersive, gamified experiences blending hands-on exercises, tutorials, and assessments to ensure well-rounded learning. 

Connect with PostgreSQL experts at POSETTE 2024 conference


Hosted by Microsoft, POSETTE 2024 (formerly Citus Con) is an exciting developer event dedicated to all things PostgreSQL. The event is a unique opportunity to learn from experts, network with fellow Postgres enthusiasts, and delve into the latest innovations in database technology. 

As a key player in the PostgreSQL community, we’ll be showcasing our commitment to the open-source database system. Attendees can look forward to a session on the future of Azure Database for PostgreSQL, where our experts will share our vision for the service and its integration with other Azure offerings.

Running June 11 to 13, 2024, POSETTE—which stands for Postgres Open Source Ecosystem Talks, Training, and Education—is a free, virtual event featuring four unique livestreams. Registration is optional, and all scheduled talks will be available online to watch immediately after the event ends. Don’t miss out on this chance to connect with the Microsoft team and learn how we’re advancing PostgreSQL in the cloud.

Take the next step on your Azure Database for PostgreSQL journey


Whether you’re a seasoned developer or just starting out, PostgreSQL and Azure Database for PostgreSQL are a dream team for building modern, scalable, and AI-powered apps. By offering robust migration tools and seamless integration with AI and machine learning services, Azure Database for PostgreSQL helps users efficiently migrate to the cloud and build sophisticated AI applications.

Get started today with our pair of learning paths and their respective Cloud Skills Challenges to be entered into a drawing for cool Azure prizes, then check out the POSETTE 2024 livestreams to learn more about everything you can do with the world’s most advanced open-source database.

Source: microsoft.com

Thursday 6 June 2024

Announcing Advanced Container Networking Services for your Azure Kubernetes Service clusters

Announcing Advanced Container Networking Services for your Azure Kubernetes Service clusters

Following the successful open sourcing of Retina: A Cloud-Native Container Networking Observability Platform, Microsoft’s Azure Container Networking team is excited to announce a new offering called Advanced Container Networking Services. It’s a suite of services built on top of existing networking solutions for Azure Kubernetes Service (AKS) to address complex challenges around observability, security, and compliance. The first feature in this suite, Advanced Network Observability, is now available in Public Preview.

What is Advanced Container Networking Services?


Advanced Container Networking Services is a suite of services built to significantly enhance the operational capabilities of your Azure Kubernetes Service (AKS) clusters. The suite is comprehensive and is designed to address the multifaceted and intricate needs of modern containerized applications. With capabilities specifically tailored for observability, security, and compliance, customers can unlock a new approach to managing container networking.

Advanced Container Networking Services focuses on delivering a seamless and integrated experience that empowers you to maintain robust security postures, ensure comprehensive compliance, and gain deep insights into your network traffic and application performance. This ensures that your containerized applications are not only secure and compliant but also meet or exceed your performance and reliability goals, allowing you to confidently manage and scale your infrastructure.

What is Advanced Network Observability?


Advanced Network Observability is the inaugural feature of the Advanced Container Networking Services suite, bringing the power of Hubble’s control plane to both Cilium and non-Cilium Linux data planes. It unlocks Hubble metrics, the Hubble command line interface (CLI), and the Hubble user interface (UI) on your AKS clusters, providing deep insights into your containerized workloads. Advanced Network Observability empowers customers to precisely detect and root-cause network-related issues in a Kubernetes cluster.

This capability provides network flow information, as metrics or flow logs, at pod-level granularity by collecting data in real time from the Linux kernel using extended Berkeley Packet Filter (eBPF) technology. Along with network traffic flows, volumetric data, and dropped packets, it now brings Domain Name System (DNS) metrics and flow information with deep request and response insights.

  • eBPF-based observability powered by either Cilium or Retina.
  • Container Network Interface (CNI) agnostic experience.
  • Monitor network traffic in real time to identify bottlenecks and performance issues with Hubble metrics.
  • Trace packet flows across your cluster to understand and debug complex networking behaviors with on-demand Hubble command line interface (CLI) network flows.
  • Visualize network dependencies and interactions between services to ensure optimal configuration and performance with an unmanaged Hubble UI.
  • Generate detailed metrics and logs to meet compliance requirements and enhance security postures.

Architecture diagram of Hubble interfacing with Cilium/Retina.

Container Network Interface (CNI) agnostic Hubble


Advanced Network Observability extends the Hubble control plane beyond Cilium. In Cilium-based clusters, Cilium provides the eBPF events to Hubble. In non-Cilium-based clusters, Microsoft Retina serves as the data plane, surfacing deep insights to Hubble and providing a seamless, interactive experience for customers.

Visualizing Hubble metrics with Grafana


Advanced Network Observability supports two integration modes for visualization:

  • Azure Managed Prometheus and Grafana.
  • Bring your own (BYO) Prometheus and Grafana for advanced users comfortable with increased management overhead.

With the Azure-managed Prometheus and Grafana approach, Azure offers integrated services that simplify the setup and management of monitoring and visualization. Azure Monitor provides a managed instance of Prometheus, which collects and stores metrics from various sources including Hubble.
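
For the bring-your-own option mentioned above, Hubble metrics scraped into Prometheus can also be queried programmatically through the standard Prometheus HTTP API. The sketch below assumes a reachable Prometheus endpoint (placeholder URL) and uses hubble_flows_processed_total as an example metric name; substitute whatever names and labels your scrape configuration actually exposes.

    # Sketch: query Hubble metrics from a bring-your-own Prometheus server via
    # its HTTP API. The URL and the metric/label names are assumptions --
    # substitute whatever your scrape configuration exposes.
    import requests

    PROM_URL = "http://prometheus.example.internal:9090"  # placeholder

    resp = requests.get(
        f"{PROM_URL}/api/v1/query",
        params={"query": "sum by (verdict) (rate(hubble_flows_processed_total[5m]))"},
        timeout=10,
    )
    resp.raise_for_status()

    for series in resp.json()["data"]["result"]:
        verdict = series["metric"].get("verdict", "unknown")
        value = series["value"][1]
        print(f"{verdict}: {value} flows/sec")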

Querying network flows with Hubble CLI


With Advanced Network Observability, customers can use the Hubble command line interface (CLI) to query for all or filtered network flows across all nodes.

Customers will be able to identify dropped or forwarded flows from all nodes via a single pane of glass.
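
As a rough illustration, the sketch below shells out to the Hubble CLI (assuming the hubble binary is installed and already connected to the cluster’s Hubble relay, for example via a port-forward) and summarizes recently dropped flows. The exact JSON field names can vary by CLI version, so the parsing is deliberately defensive.

    # Sketch: pull recently dropped flows with the hubble CLI and print a summary.
    # Assumes `hubble` is installed and pointed at the cluster's Hubble relay.
    import json
    import subprocess

    result = subprocess.run(
        ["hubble", "observe", "--verdict", "DROPPED", "--last", "50", "-o", "json"],
        capture_output=True, text=True, check=True,
    )

    # `hubble observe -o json` emits one JSON object per line; depending on the
    # CLI version the flow may or may not be nested under a "flow" key.
    for line in result.stdout.splitlines():
        obj = json.loads(line)
        flow = obj.get("flow", obj)
        src = flow.get("source", {})
        dst = flow.get("destination", {})
        src_pod = src.get("pod_name") or src.get("podName") or "?"
        dst_pod = dst.get("pod_name") or dst.get("podName") or "?"
        print(f"dropped: {src_pod} -> {dst_pod}")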

Service dependency graph with Hubble UI


Customers can deploy Hubble UI onto clusters with Advanced Network Observability enabled to visualize service dependencies. Hubble UI provides an on-demand view of flows across the whole cluster and allows customers to select a given namespace and view network flows between different pods within the cluster, surfacing in-depth information about each flow.

Benefits


Advanced network visibility

Advanced Network Observability offers unparalleled network visibility by providing granular insights into network traffic at the pod level. This detailed visibility enables administrators to monitor traffic flows, detect anomalies, and gain a comprehensive understanding of network behavior within their Azure Kubernetes Service (AKS) clusters. By leveraging eBPF-based data collection from the Linux Kernel, Advanced Network Observability provides real-time metrics and logs that surface traffic volume, packet drops, and DNS metrics. This enhanced visibility ensures that network administrators can swiftly identify and address potential issues, thereby maintaining optimal network performance and security.

Cross-node network flow tracking

With Advanced Network Observability, customers can track network flows across multiple nodes within their Kubernetes clusters. This allows precise tracing of packet flows, making it possible to understand complex networking behaviors and interactions between different nodes. The Hubble CLI can query network flows, enabling users to filter and analyze specific traffic patterns. This cross-node tracking capability is invaluable for debugging network issues, as it surfaces the entire network flow within a single pane of glass, identifying both dropped and forwarded packets across all nodes.

Real-time performance monitoring

Advanced Network Observability provides customers with real-time performance monitoring capabilities. By integrating Hubble metrics powered by either Cilium or Retina, users can monitor network traffic in real time, identifying bottlenecks and performance issues as they occur. This immediate feedback loop is critical for maintaining high performance and ensuring that any degradation in network performance is promptly surfaced and addressed. The managed Hubble metrics and flow logs offer continuous, detailed insights into network operations, allowing for proactive management and rapid troubleshooting.

Multi-cluster historical analysis

Advanced Network Observability coupled with Azure Managed Prometheus and Grafana extends its benefits to multi-cluster environments, providing historical analysis capabilities that are essential for long-term network management and optimization. By storing and analyzing historical data across multiple clusters, administrators can identify trends, patterns, and recurring issues that may impact network performance and reliability going forward. This historical perspective is crucial for capacity planning, performance benchmarking, and compliance reporting. The ability to review and analyze past network data helps in understanding the evolution of network performance over time and informs future network design and configuration decisions.

Source: microsoft.com

Tuesday 4 June 2024

Microsoft and Broadcom to support license portability for VMware Cloud Foundation on Azure VMware Solution

Microsoft and Broadcom to support license portability for VMware Cloud Foundation on Azure VMware Solution

Microsoft and Broadcom have partnered closely for many years to support our mutual customers, and we continue to build and innovate together as customer needs change. Today, we are pleased to share that Microsoft and Broadcom are expanding our partnership with plans to support VMware Cloud Foundation subscriptions on Azure VMware Solution. Customers that own or purchase licenses for VMware Cloud Foundation will be able to use those licenses on Azure VMware Solution, as well as their own datacenters, giving them flexibility to meet changing business needs.

This provides an additional purchase option for Azure VMware Solution, which has been sold and operated by Microsoft since 2019. Customers can currently purchase the solution with VMware licenses included, and this option will continue to be available for customers that prefer to purchase their VMware licenses as part of their solution from Microsoft.

Azure VMware Solution delivers a fully managed VMware environment that is operated and supported by Microsoft. Customers can move VMware workloads to Azure “as is” with minimal to no refactoring. This streamlines migration and allows customers to continue using familiar skills while learning new Azure skills.  

By migrating to Azure VMware Solution, now available in 33 regions around the world, organizations can take advantage of Azure’s scalable and high-performance cloud infrastructure. Customers can deploy with business-critical capabilities such as backup, high availability, threat protection, and performance monitoring. Moreover, workloads running on Azure VMware Solution can be integrated with Azure’s portfolio of more than 200 cloud services to accelerate innovation, gain deeper insights from data with advanced AI services, and to modernize business applications. 

VMware Cloud Foundation delivers a private cloud platform that is ubiquitous, flexible, and integrated across cloud endpoints. By deploying VMware Cloud Foundation on Azure, customers benefit from a highly optimized cloud operating model that provides the scale and agility of public cloud with the security and performance of private cloud. Running VMware Cloud Foundation on Azure enables organizations to modernize IT infrastructure with demonstrable TCO, provide developers a self-service private cloud experience resulting in greater productivity, and achieve better cyber resiliency and security.

With improved license portability for customers with eligible VMware Cloud Foundation entitlements, customers will be able to purchase subscriptions of the new VMware Cloud Foundation software and have complete mobility between their on-premises environment and Azure VMware Solution. VMware customers that have already purchased and begun deploying the new VMware Cloud Foundation will be able to transfer the remaining value of an existing subscription to Azure VMware Solution. Additionally, customers will be able to move their VMware Cloud Foundation subscription between on-premises and Azure VMware Solution as their needs and requirements evolve over time. Customers will retain the rights to their software subscription when moving their VMware Cloud Foundation subscription to Azure VMware Solution. 

VMware Rapid Migration Plan: Reduce your migration time and cost


In addition to the new VMware license portability benefit, the VMware Rapid Migration Plan provides a comprehensive set of licensing benefits and programs to reduce the cost and time it takes for organizations to migrate to Azure VMware Solution.  

The plan includes:

  • Price protection: With reserved instances, customers can lock in pricing for one, three, or five years.  
  • Savings for Windows Server and SQL Server: Windows Server and SQL Server are common workloads on VMware environments. With Software Assurance for on-premises Windows Server and SQL Server licenses, organizations can qualify for the Azure Hybrid Benefit discount to use existing Windows Server and SQL Server licenses in Azure VMware Solution. Free Extended Security Updates are available for older versions that face end of support.
  • Migration support: Use Azure Migrate and Modernize to get resources, expert help, and funding from Microsoft and its partner ecosystem. 
  • Azure credits: Customers that purchase a new reserved instance for Azure VMware Solution can get additional Azure credits valid for Azure VMware Solution or other Azure services. 

Supporting your cloud journey with Microsoft and Broadcom


We are committed to continued partnership and innovation to support our mutual customers as they adapt to changing business needs. VMware Cloud Foundation license portability to Azure VMware Solution will be available later this year, so now is a great time to contact your account team or Microsoft partner to start planning your move.

Source: microsoft.com