Thursday 30 March 2023

Enhanced Azure Arc integration with Datadog simplifies hybrid and multicloud observability

Businesses today are managing complex, distributed environments and need a ubiquitous computing platform for all workloads that can meet them where they are. We’ve seen an increasing need for customers to not only deploy, manage, and operate across on-premises and one or more clouds, but also to have better visibility and insights across all IT investments spanning cloud to edge.

Today, we’re delivering improved observability and management with the general availability of our enhanced Microsoft Azure Arc integration with Datadog. Building on our established collaboration, we are natively integrating Datadog with Azure Arc to meet customers where they are and provide rich insights from Azure Arc–enabled resources directly into Datadog dashboards. Customers can monitor real-time data during cloud migrations and performance of applications running both in the public cloud and in hybrid or multicloud environments.

Benefits of Azure Arc integration with Datadog


With the Azure Arc integration with Datadog, customers can:

◉ Monitor the connection status and agent version of Azure Arc–enabled servers, wherever they are running.
◉ Automatically add Azure tags to associated hosts in Datadog for additional context.
◉ Identify which Azure Arc–enabled servers have the Datadog Agent installed.
◉ Deploy the Datadog Agent onto your Azure Arc–enabled servers as an extension (see the sketch after this list).
◉ Get unified billing for the Datadog service through Azure subscription invoicing.
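
To illustrate the extension-based deployment above, here is a minimal sketch using the Azure SDK for Python. The extension publisher, type, and settings keys are assumptions for illustration, not confirmed values, and all resource names are placeholders.

```python
# Hypothetical sketch: install the Datadog Agent extension on an
# Azure Arc–enabled Linux server via azure-mgmt-hybridcompute.
from azure.identity import DefaultAzureCredential
from azure.mgmt.hybridcompute import HybridComputeManagementClient

client = HybridComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.machine_extensions.begin_create_or_update(
    resource_group_name="my-rg",          # placeholder resource group
    machine_name="my-arc-server",         # placeholder Arc-enabled server
    extension_name="DatadogLinuxAgent",   # assumed extension name
    extension_parameters={
        "location": "eastus",
        "properties": {
            "publisher": "Datadog.Agent",  # assumed publisher
            "type": "DatadogLinuxAgent",   # assumed extension type
            "settings": {"site": "datadoghq.com"},
            "protectedSettings": {"api_key": "<datadog-api-key>"},
        },
    },
)
print(poller.result().name)
```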

Datadog is a cloud-scale monitoring and security platform for large-scale applications that aggregates data across your entire stack, with more than 600 integrations for centralized visibility and faster troubleshooting on dynamic architectures. This gives developers and operations teams observability into every layer of their applications on Azure, so they can diagnose performance issues quickly.

When Datadog first became an Azure Native ISV Service, it allowed customers to streamline the experience of purchasing, configuring, and managing Datadog directly inside the Azure portal. It reduced the learning curve for using Datadog to monitor the health and performance of applications in Azure and set customers up for a successful cloud migration or modernization.

For many customers, hybrid deployments are a durable and long-term strategy due to factors such as latency and compliance requirements, and we are committed to meeting customers wherever they are. With Azure Arc, we provide a consistent set of tools and services for customers to extend cloud technology across their distributed infrastructure. More than 12,000 customers are using Azure Arc, double the number a year ago. By partnering with organizations like Datadog, we are unlocking even more innovation and bringing Azure services into the tools our customers are already using.

Enhanced Azure Arc integration features


Features available with today’s general availability include:

Monitor the Arc connection status and agent version

Customers can easily identify any Azure Arc–enabled resources that are not in a connected state. You can also set up Datadog monitors to alert you immediately if the connection is unhealthy. Before this new integration, Azure Arc resources looked like any other virtual machine on-premises or in Azure. Now, you can access critical metadata to ensure your Azure Arc–enabled Windows and Linux servers, SQL Servers, and Kubernetes clusters are secured and connected. IT operators will be able to troubleshoot much faster if a resource is disconnected and can quickly restore connectivity to Azure Arc.


Datadog can also show which hosts are running an older version of the Azure Arc agent. It then becomes easy to update the agent using Azure Update Management, or to use Azure Automation to apply the latest Azure Arc agent updates whenever a new version is released.


Automatically add Azure tags for easy management and compliance tracking


A popular benefit of Azure Arc is using tags in Azure Resource Manager. Many organizations tag on-premises resources by cost center or datacenter server groups that are subject to specific regulations or requirements. Tags also create an audit trail to help trace the history of a particular resource and identify potential security issues when performing audits.

With the Azure Arc integration, Datadog can build rich visualizations and actionable alerts using the tags you have already created for Azure Arc–enabled resources. Now, when you perform patching or updates for Azure Arc–enabled servers, you get much richer insights to help validate software patches and troubleshoot application issues.

Easily identify which Azure Arc–enabled servers have the Datadog Agent

Azure Arc brings your hybrid and multicloud servers, Kubernetes clusters, and data services into a single dashboard for seamless management between environments. Aside from grouping resources with Azure Resource Manager, Azure Arc–enabled resources benefit from Azure role-based access control (RBAC), so different IT and developer teams can easily delegate access to their applications. A centralized IT monitoring team can now verify that Azure Arc–enabled resources have the Datadog Agent by cross-referencing these servers with agent data, providing a real-time view of which Arc resources are reporting to Datadog.


Source: microsoft.com

Wednesday 29 March 2023

Powerful Microsoft MB-910 Exam Preparation Strategies Everyone Should Know

For businesses to make money, they need to know how to communicate effectively with their customers and potential customers. To improve their communication strategies, businesses use software and applications to keep track of important information that could make a big difference in whether or not they make a sale. One such application is Microsoft Dynamics 365, a cloud-based suite of business tools that includes elements of both enterprise resource planning (ERP) and customer relationship management (CRM) systems. If you’re looking to establish yourself as a leader in the CRM field, earning the MB-910: Microsoft Dynamics 365 Fundamentals (CRM) certification can be a valuable asset.

Overview of Microsoft Dynamics 365 Fundamentals (CRM) MB-910 Exam

This certification is designed for people who want to prove that they have the expertise and practical experience working with the customer engagement applications of Microsoft Dynamics 365.

To increase their chances of success in obtaining this certification, candidates are recommended to have prior experience in IT, customer engagement, and business operations.

The Microsoft Dynamics 365 Fundamentals (CRM) exam, also known as MB-910, consists of 40 to 60 questions to be answered in 60 minutes. The questions come in different formats, such as case studies, short answer, multiple choice, mark for review, and drag and drop. To pass, you must score at least 700 out of a possible 1,000 points. The exam costs $99 USD, and additional taxes may apply. It is offered in English, Japanese, Chinese (Simplified), Korean, French, Spanish, Portuguese (Brazil), Russian, Arabic (Saudi Arabia), Chinese (Traditional), Italian, and German.

Target Audience

Individuals at different stages of their professional journey and in various roles can benefit from this fundamentals certification. Here are some examples:

  • IT professionals who aim to demonstrate a basic understanding of the software they use.
  • Business representatives and other individuals who have knowledge of Dynamics 365 and seek to validate their skills and expertise.
  • Developers who wish to showcase their understanding of customer relationship management and business operations.
  • Students, recent graduates, and career changers who aspire to use Dynamics 365 customer engagement features to advance their careers.
How to Prepare for the Microsoft MB-910 Exam?

    Preparation is a crucial element when it comes to passing an exam. Even if you have extensive experience, it is beneficial to familiarize yourself with the content, characteristics, and strategies of the specific test you intend to take. If you are planning to take the Microsoft Dynamics 365 Fundamentals (CRM) MB-910 exam, this article provides some helpful tips to help you get ready. Let’s take a closer look at them.

  • Review the study guide, comprehend the exam goals, and establish a study schedule to ensure that you do not procrastinate and leave everything to the last minute.
  • Next, select Microsoft’s learning path and delve further into the main areas and subtopics. Watch the instructor-led videos to gain a better understanding of the subject.
  • Once you feel self-assured about your preparation, proceed to the MB-910 sample questions. There are numerous practice tests available that serve as a platform for this Microsoft certification exam. Use them to evaluate your progress, identify areas of improvement, and attempt the test again until you are satisfied with your performance.
  • Review your score on practice tests. Once you score more than 90% in practice tests, register for the actual exam and add a significant accomplishment to your credentials.
Benefits of Taking the MB-910 Practice Test

    Taking MB-910 practice tests is like sowing a seed, which will eventually allow you to reap the benefits of achieving your desired score and obtaining the Microsoft Certified - Dynamics 365 Fundamentals (CRM) certification. When investing time and effort into something, it is natural to wonder about the benefits. So, what exactly can you gain from taking MB-910 practice tests?

    Apart from helping you pinpoint your areas of weakness and strength, taking MB-910 practice tests will give you a more accurate perception of the actual exam questions. This is, of course, on the condition that you regularly take one or two MB-910 practice tests since it’s impossible to become familiar with something after just one attempt unless you possess a photographic memory.

Another advantage of taking MB-910 practice tests is the ability to create a personalized and effective study schedule to achieve a realistic score. As previously stated, these practice tests will give you an idea of the format of the actual exam. As a result, you can approach the real exam more confidently because you are already familiar with its structure.

    Conclusion

    If you follow the exam preparation strategies we have provided for Microsoft MB-910, you will likely achieve a good result. You can make your preparation even more effective by taking practice tests and utilizing the feedback you receive to improve further. Keep in mind that learning the strategies is not enough; you need to take action and implement them. Taking action will demonstrate your commitment to passing the exam. So, take action and see how well these strategies work for you!

    Tuesday 28 March 2023

    Monitor Azure Virtual Network Manager changes with event logging

    Today, our customers establish and manage their Azure virtual networks at scale. As their number of network resources grows, the question of how to maintain connectivity and security among their scale of resources arises. This is where Microsoft Azure Virtual Network Manager comes in—your one-stop shop for managing the connectivity and security of your network resources at scale (currently in preview). And when customers use Azure Virtual Network Manager, they also need visibility into what kind of changes were made so that they can audit those events, analyze those changes over time, and debug issues along the way. This capability is now a reality—Azure Virtual Network Manager event logging is now in preview.

    Azure Virtual Network Manager (AVNM) uses Azure Monitor for telemetry collection and analysis like many other Azure services. AVNM now provides event logs that you can interact with through Azure Monitor’s Log Analytics tool in the Azure Portal, as well as through a storage account. You can also send these logs to an event hub or partner solution.

    With this preview announcement, Azure Virtual Network Manager will provide a log category for network group membership change. In the context of AVNM, network groups are defined by the user to contain virtual networks. The membership of a network group can be manually provided (such as by selecting VNetA, VNetB, and VNetC to be a part of this network group) as well as conditionally set through Azure Policy (such as by defining that any virtual network within a certain subscription that contains some string in its name will be added to this network group). The network group membership change log category tracks when a particular virtual network is added to or removed from a network group. This can be used to track network group membership changes over time, to capture a snapshot of a particular virtual network’s network group membership, and more.

    What attributes are part of this event log category?


    This network group membership change category emits one log per network group membership change. So, when a virtual network is added to or removed from a network group, a log is emitted correlating to that single addition or removal for that particular virtual network. If you’re looking at one of these logs from your storage account, you’ll see several attributes:

Attribute  Description
    time  Datetime when the event was logged.
    resourceId  Resource ID of the network manager. 
    location  Location of the virtual network resource. 
    operationName  Operation that resulted in the virtual network being added or removed. Always the “Microsoft.Network/virtualNetworks/networkGroupMembership/write” operation. 
    category  Category of this log. Always “NetworkGroupMembershipChange.” 
    resultType  Indicates successful or failed operation. 
    correlationId  GUID that can help relate or debug logs. 
    level  Always “Info.” 
    properties  Collection of properties of the log. 

    Within the properties attribute are several nested attributes:

Properties attribute  Description
Message  Basic success or failure message.
MembershipId  Default membership ID of the virtual network.
GroupMemberships  Collection of what network groups the virtual network belongs to. There may be multiple “NetworkGroupId” and “Sources” listed within this property since a virtual network can belong to multiple network groups simultaneously.
MemberResourceId  Resource ID of the virtual network that was added to or removed from a network group.

    Within the GroupMemberships attribute are several nested attributes:

GroupMemberships attribute  Description
NetworkGroupId  ID of a network group the virtual network belongs to.
Sources  Collection of how the virtual network is a member of the network group.


    Within the Sources attribute are several nested attributes:
     
Sources attribute  Description
Type  Denotes whether the virtual network was added manually (“StaticMembership”) or conditionally via Azure Policy (“Policy”).
StaticMemberId  If the “Type” value is “StaticMembership,” this property will appear.
PolicyAssignmentId  If the “Type” value is “Policy,” this property will appear. ID of the Azure Policy assignment that associates the Azure Policy definition to the network group.
PolicyDefinitionId  If the “Type” value is “Policy,” this property will appear. ID of the Azure Policy definition that contains the conditions for the network group’s membership.
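
To make the schema concrete, here is a hypothetical example of a single membership-change event assembled from the attributes above; every value is illustrative, not captured from a real log.

```python
# Illustrative NetworkGroupMembershipChange event (all values made up).
sample_log = {
    "time": "2023-03-28T17:50:00Z",
    "resourceId": ".../providers/Microsoft.Network/networkManagers/my-manager",
    "location": "eastus",
    "operationName": "Microsoft.Network/virtualNetworks/networkGroupMembership/write",
    "category": "NetworkGroupMembershipChange",
    "resultType": "Success",
    "correlationId": "00000000-0000-0000-0000-000000000000",
    "level": "Info",
    "properties": {
        "Message": "Membership change succeeded.",
        "MembershipId": "<default-membership-id>",
        "MemberResourceId": ".../virtualNetworks/VNetA",
        "GroupMemberships": [
            {
                "NetworkGroupId": ".../networkGroups/my-group",
                "Sources": [{"Type": "StaticMembership", "StaticMemberId": "<id>"}],
            }
        ],
    },
}

# Which network groups does VNetA belong to after this change?
for group in sample_log["properties"]["GroupMemberships"]:
    print(group["NetworkGroupId"])
```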


    How do I get started?


The first step you’ll need to take is to set up your Log Analytics workspace or your storage account, depending on how you want to consume these event logs. Note that if you’re using a storage account or event hub, it will need to be in the same region as the network manager you’re accessing logs from. If you’re using a Log Analytics workspace, it can be in any region. The network manager you’re accessing the logs of doesn’t need to belong to the same subscription as your Log Analytics workspace or storage account, but permissions may restrict your ability to access logs across subscriptions.

Note that at least one virtual network must be added to or removed from a network group in order to generate logs. A log is generated for this event a couple of minutes later.
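
If you prefer to script this setup rather than click through the portal, the diagnostic setting can be created with the Azure SDK for Python. This is a sketch under assumptions: a recent azure-mgmt-monitor release that accepts a plain JSON body, and placeholder resource IDs throughout.

```python
# Sketch: route NetworkGroupMembershipChange logs to a Log Analytics workspace.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

monitor_client = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

network_manager_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Network/networkManagers/<network-manager>"
)

monitor_client.diagnostic_settings.create_or_update(
    resource_uri=network_manager_id,
    name="avnm-event-logs",
    parameters={
        "workspaceId": "<log-analytics-workspace-resource-id>",
        "logs": [{"category": "NetworkGroupMembershipChange", "enabled": True}],
    },
)
```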

    Accessing Azure Virtual Network Manager’s event logs with Log Analytics

    The first step is to navigate to your desired network manager and select the Diagnostic settings blade under the Monitoring section. Then you can select Add diagnostic setting and select the option to send the logs to your Log Analytics workspace.


    Then you can navigate to your Log Analytics workspace directly through your network manager by selecting the Logs blade under the Monitoring section.


    Alternatively, you can also navigate to your Log Analytics workspace in the Azure Portal and select the Logs blade.


From either place, you can run your own queries on your network manager’s emitted logs for network group membership changes, or run our preloaded queries. The preloaded queries can fetch the most recent network group membership changes and failed network group membership changes.
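
The same kind of query can also be run programmatically. The sketch below assumes the azure-monitor-query package and that these logs surface in the AzureDiagnostics table, which is how resource logs commonly appear in Log Analytics; verify the table and column names against your own workspace.

```python
# Sketch: fetch the 20 most recent membership-change events.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
AzureDiagnostics
| where Category == "NetworkGroupMembershipChange"
| sort by TimeGenerated desc
| take 20
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query=query,
    timespan=timedelta(days=7),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```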

    Accessing Azure Virtual Network Manager’s event logs with a storage account

    The first step is to again navigate to your desired network manager and select the Diagnostic settings blade under the Monitoring section. Then you can select Add diagnostic setting and select the option to archive the logs to your storage account.

    Then you can navigate to your storage account and select the Storage browser blade.


    Select Blob containers. A blob container will be automatically generated once network group membership changes occur.


    Navigate down the blob container’s file path until you reach a JSON file for the datetime specified by that file path.


    Download the JSON file to view the raw logs for the file path’s datetime.
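
Downloading can also be automated. Here is a sketch assuming the azure-storage-blob package and the "insights-logs-<category>" container naming convention that diagnostic logs typically use; adjust the container name to whatever actually appears in your storage browser.

```python
# Sketch: print the operation name of every archived membership-change event.
import json

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
container = service.get_container_client(
    "insights-logs-networkgroupmembershipchange"  # assumed container name
)

for blob in container.list_blobs():
    raw = container.download_blob(blob.name).readall()
    for line in raw.decode("utf-8").splitlines():
        if line.strip():  # archived logs are typically one JSON object per line
            print(json.loads(line)["operationName"])
```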

    Source: microsoft.com

    Saturday 25 March 2023

    Protect against cyberattacks with the new Azure Firewall Basic

Cyberattacks continue to rise across businesses of all sizes as attackers adapt their techniques and increase the complexity of their operations. The risk of these attacks is significant for small and medium businesses (SMBs), as they usually don’t have the specialized knowledge or resources to protect against emerging threats and face more challenges when recovering from an attack. In a recent Microsoft survey, 70 percent of SMBs think cyberthreats are becoming more of a business risk, and nearly one in four SMBs stated that they had a security breach in the last year.

    SMBs need solutions that are tailored to their unique needs and challenges. Microsoft is committed to delivering security solutions to meet the needs of all our customers. We are excited to announce the general availability of Azure Firewall Basic, a new SKU of Azure Firewall built for SMBs.

Since public preview, we have seen wide adoption of Azure Firewall Basic. Customers cited the simplicity and ease of use of Azure Firewall as one of the key benefits for choosing Azure Firewall Basic. We have also added the capability to deploy Azure Firewall inside a virtual hub in addition to a virtual network. This gives businesses the flexibility to choose the deployment option that best meets their needs.

Deploying Azure Firewall in a virtual network is recommended for customers who plan to use a traditional hub-and-spoke network topology with a firewall in the hub, whereas deploying in a virtual hub is recommended for customers with large or global network deployments in Azure where global transit connectivity across Azure regions and on-premises locations is needed.

    Providing SMBs with a highly available Firewall at an affordable price point


Azure Firewall Basic brings the simplicity and security of Azure Firewall to SMBs at a cost-effective price point.

It offers Layer 3–Layer 7 filtering and alerts on malicious traffic using built-in Microsoft threat intelligence. As a cloud-native service, Azure Firewall Basic is simple to deploy with a few clicks and seamlessly integrates with other Azure services, including Microsoft Azure Firewall Manager, Azure Monitor, Azure Event Hubs, Microsoft Sentinel, and Microsoft Defender for Cloud.

    Key features of Azure Firewall Basic


    Comprehensive, cloud-native network firewall security

◉ Network and application traffic filtering—Centrally create allow or deny network filtering rules by source and destination IP address, port, and protocol. Azure Firewall is fully stateful, so it can distinguish legitimate packets for different types of connections. Rules are enforced and logged across multiple subscriptions and virtual networks.
    ◉ Threat intelligence to alert on malicious traffic—Enable threat intelligence-based filtering to alert on traffic from or to known malicious IP addresses and domains. The IP addresses and domains are sourced from the Microsoft threat intelligence feed.
    ◉ Built-in high availability—Azure Firewall Basic provides built-in high availability to ensure that your network traffic is always protected. Azure Firewall Basic can replicate your firewall instance across two availability zones, ensuring that your traffic is always filtered even if one of the zones goes down.

    Simple setup and easy to use

    ◉ Set up in just a few minutes—Use the Quickstart deployment Azure Resource Manager (ARM) templates to easily deploy Azure Firewall Basic directly to your Azure environment.
◉ Automate deployment (deploy as code)—Azure Firewall Basic provides native support for Infrastructure as Code (IaC). Teams can define declarative ARM templates that specify the infrastructure required to deploy solutions (a Python SDK sketch follows this list). Third-party platforms like Terraform also support IaC to manage automated infrastructure.
    ◉ Zero maintenance with automatic updates—Azure Firewall is automatically updated with the latest threat intelligence and security updates to ensure that it stays up-to-date and protected against the latest threats.
    ◉ Centralized management via Azure Firewall Manager—Azure Firewall Manager is a central management solution that allows you to manage multiple Azure Firewall instances and policies across your organization from a single location, ensuring that your security policies are consistent and up to date across your organization.
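
As referenced in the deploy-as-code item above, the firewall can also be created with the Azure SDK for Python instead of an ARM template. This is a sketch under assumptions: a recent azure-mgmt-network release that accepts a plain JSON body, and placeholder subnet and public IP resource IDs (Firewall Basic requires an AzureFirewallManagementSubnet alongside AzureFirewallSubnet).

```python
# Sketch: create an Azure Firewall Basic instance in an existing VNet.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.azure_firewalls.begin_create_or_update(
    "my-rg",
    "my-basic-firewall",
    {
        "location": "eastus",
        "properties": {
            "sku": {"name": "AZFW_VNet", "tier": "Basic"},
            "ipConfigurations": [
                {
                    "name": "fw-ipconfig",
                    "properties": {
                        "subnet": {"id": "<AzureFirewallSubnet-resource-id>"},
                        "publicIPAddress": {"id": "<public-ip-resource-id>"},
                    },
                }
            ],
            "managementIpConfiguration": {
                "name": "fw-mgmt-ipconfig",
                "properties": {
                    "subnet": {"id": "<AzureFirewallManagementSubnet-resource-id>"},
                    "publicIPAddress": {"id": "<management-public-ip-resource-id>"},
                },
            },
        },
    },
)
print(poller.result().provisioning_state)
```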

    Cost-effective

    Designed to deliver essential, cost-effective protection of your Azure resources within your virtual networks.


    Choose the right Azure Firewall SKU for your business


    Azure Firewall is offered in three SKUs to meet a wide range of use cases and needs:

1. Azure Firewall Premium is recommended for customers looking to secure highly sensitive applications, such as payment processing. In addition to all features of Azure Firewall Standard, it also supports advanced threat protection capabilities like malware and Transport Layer Security (TLS) inspection.
2. Azure Firewall Standard is recommended for customers who are looking for a Layer 3–Layer 7 firewall and require auto-scaling to handle peak traffic periods of up to 30 gigabits per second (Gbps). It supports enterprise features like threat intelligence, Domain Name System (DNS) proxy, custom DNS, and web categories.
    3. Azure Firewall Basic is recommended for SMB customers with throughput needs of less than 250 megabits per second (Mbps).

    Let’s take a closer look at the features across the three Azure Firewall SKUs.


    Azure Firewall Basic pricing

     
Azure Firewall Basic pricing includes deployment and data processing charges for both virtual network and virtual hub scenarios. Pricing and billing for Azure Firewall Basic with virtual hub will be effective starting May 1, 2023.

    Source: microsoft.com

    Tuesday 21 March 2023

    Azure Data Manager for Energy: Achieve interoperability with Petrel


    Microsoft Azure Data Manager for Energy is the first fully managed OSDU™ Data Platform built for the energy industry. This solution is the first step in unraveling the challenge of data—moving from disparate systems and disconnected applications to a holistic approach. The product’s ideation directly reflects the partnership between Microsoft and SLB, capitalizing on each organization’s unique expertise.

    As the energy industry works to achieve a sustainable low carbon future, organizations are taking advantage of the cloud to optimize existing assets and de-risk new ventures. Universally, data is at the core of their digital transformation strategies—yet only a small fraction of energy company data is properly tagged and labeled to be searchable. This leads engineers and geoscientists to spend significant time outside of their expertise trying to discover and analyze data. Azure Data Manager for Energy customers can seamlessly connect to an open ecosystem of interoperable applications from other Independent Software Vendors (ISVs) and the Microsoft ecosystem of productivity tools. Ultimately, the open Microsoft platform enables developers, data managers, and geoscientists alike to innovate the next generation of digital solutions for the energy industry.


    “We all benefit from making the world more open. As an industry, our shared goal is that openness in data will enable a fully liberated and connected data landscape. This is the natural next step towards data-driven workflows that integrate technologies seamlessly and leverage AI for diverse and creative solutions that take business performance to the next level.”—Trygve Randen, Director, Data & Subsurface at Digital & Integration, SLB.

The co-build partnership between Microsoft and SLB improves customers’ journey and performance by unlocking data through interoperable applications. The Delfi™ digital platform from SLB on Azure features a broad portfolio of applications, including the Petrel E&P Software Platform. Enhanced with AI, the Petrel platform enables workflows in Petrel to run with significantly faster compute times and includes access to new tools, increasing the flexibility and productivity of geoscientists and engineers.

    Microsoft and SLB rearchitected Petrel Data Services to allow Petrel Projects and data to be permanently stored in the customer’s instance. Petrel Data Services leverages core services found in OSDU™, such as partition and entitlement services. This change further aligns Petrel Data Services with the OSDU™ Technical Standard schemas and directly integrates with storage as the system of record. Now when geoscientists or engineers create new Petrel projects or domain data, each is liberated from Petrel into its respective Domain Data Management Service (DDMS) provided by OSDU™, like seismic or wellbore, in Azure Data Manager for Energy. These Petrel liberated projects or data become immediately discoverable in Petrel on Delfi™ Digital Platform or any third-party application developed in alignment with the emerging requirements of the OSDU™ Technical Standard such as INT’s IVAAP.

By splitting the Petrel and Project Explorer software as a service (SaaS) applications from the data infrastructure, data resides in Azure Data Manager for Energy without any dependencies on an external app to access that data. Users can access and manage Petrel-liberated projects and data in Azure Data Manager for Energy independent of any prerequisite application or license. Microsoft provides a secure, scalable infrastructure that governs data safely in the customer tenant, while SLB focuses on delivering continuous updates to Petrel and Project Explorer on the Delfi™ digital platform, which expedites feature delivery.

    Petrel and Project Explorer on Azure Data Manager for Energy


1. Search for and discover Petrel projects: Petrel Project Explorer shows all Petrel project versions liberated from all users and allows the viewing of data associated with each project based on corresponding data entitlements. This includes images of the windows that are created in the project, metadata (coordinate reference systems, time zone, and more), and all data stored in the project. Using Project Explorer allows you to preserve every single change throughout the lifetime of a Petrel project and every critical milestone required by regulations or for historical purposes. Data and decisions can be easily shared and connected to other cloud-native solutions on the Delfi™ digital platform, and automatic, full data lineage and project versioning is always available.

2. Connect Petrel to domain data: Petrel users can consume seismic and wellbore OSDU™ domain data directly from Azure Data Manager for Energy (see the sketch after this list). Furthermore, Petrel Data Services enables the development of diverse and creative solutions for the exploration and production value chain, including liberated data consumption in other applications like Microsoft Power BI for data analytics.

    3. Data liberation: Petrel Data Services empowers Petrel users to liberate Petrel Project data into Azure Data Manager for Energy where data and project content can be accessed without opening Petrel, providing simpler data overview and in-context access. Data liberation allows for direct consumption into other data analytics applications, generating new data insights into Petrel projects, breaking down data silos, and improving user and corporate data-driven workflows. It relieves users from Petrel project management and improves the focus on domain workflows for value creation.
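
As referenced in item 2 above, liberated data can be discovered through the platform's OSDU-style APIs. The sketch below uses the common OSDU search endpoint and a wellbore kind string as assumptions; check your instance's documentation for the exact paths and schemas.

```python
# Sketch: search for wellbore records in an Azure Data Manager for Energy instance.
import requests

ADME_HOST = "https://<instance>.energy.azure.com"  # hypothetical host name

response = requests.post(
    f"{ADME_HOST}/api/search/v2/query",            # common OSDU search path
    headers={
        "Authorization": "Bearer <azure-ad-access-token>",
        "data-partition-id": "<data-partition>",
        "Content-Type": "application/json",
    },
    json={
        "kind": "osdu:wks:master-data--Wellbore:1.0.0",  # assumed kind string
        "query": "*",
        "limit": 10,
    },
)
response.raise_for_status()
for record in response.json().get("results", []):
    print(record.get("id"))
```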

    Figure 1: Project Explorer on Azure Data Manager for Energy: View all Petrel projects within an organization in one place through an intuitive and performant User Interface (UI).

    Interoperable workflows that de-risk and drive efficiency


    Both traditional and new energy technical workflows are optimized when data and application interoperability are delivered. Geoscientists and engineers, therefore, want to incorporate as much diverse domain data as possible. Customers want to run more scenarios in different applications, compare results with their colleagues, and ultimately liberate the best data and the knowledge associated with it to a data platform for others to discover and consume. With Petrel and Petrel Data Services powered by Azure Data Manager for Energy, customers achieve this interoperability.

    Companies can liberate wellbore and seismic data for discovery in any application developed in alignment with the emerging requirements of the OSDU™ Technical Standard. As Petrel and Petrel Data Services use the standard schemas, all data is automatically preserved and indexed for search, discovery, and consumption. This extensibility model enables geoscientists and engineers as well as data managers to seamlessly access data in their own internal applications. SLB apps on Delfi™ Digital Platform such as Techlog, as well as Microsoft productivity tools including Power BI and an extensive ecosystem of partner apps are all available in this model. Additionally, developers can refocus their efforts on innovating and building new apps—taking advantage of Microsoft Power Platform to build low-code or no-code solutions. This creates the full data-driven loop and ultimately enables integrated workflows for any interoperable apps.

    Figure 2: Azure Data Manager for Energy Data Flow connects seamlessly to a broad ecosystem of interoperable applications across Delfi™, Azure Synapse, Microsoft Power Platform, and the ISV ecosystem.

    Thursday 16 March 2023

    Microsoft and Rockwell Automation collaborate on digital solutions for manufacturers


Industrial leaders have worked for years to improve the efficiency of manufacturing processes through digital innovation. But finding a balance between information technology (IT) and operational technology (OT) requirements in their facilities has been a challenge. Solutions designed to solve immediate operational problems come up short on scalability, security, and manageability. Conversely, solutions prioritizing scale and security do not always solve urgent operational problems and can be too complex or costly to justify.

    Microsoft and Rockwell Automation believe well-designed digital solutions can address operational, system, workforce, and scale needs at the same time. That’s why we have partnered to build digital solutions that deliver transformational outcomes with immediate value and long-term scale, resilience, and agility. By working together, we can leverage our respective strengths and expertise to develop solutions that meet the needs of manufacturers.

    Building solutions for both physical and digital operations


    While collaborating with our customers to design, build, operate, and maintain digital solutions, we've discovered a series of digital accelerators that, when applied correctly, make a critical difference in a solution’s success. These accelerators can bridge the gap between immediate business needs on the factory floor and enterprise-scale requirements by:

    ◉ Taking advantage of cloud-to-edge frameworks: Using the same programming models, tools, application program interfaces (APIs), and management systems for cloud and operational systems helps make sure that edge computing systems are interoperable with existing IT systems and infrastructure. This makes it easier to integrate an OT computing system into an organization's existing workflow and processes.

    Digital native best practices, including Microsoft Azure-to-edge development methods, help you to consistently build and operate modern cloud-based applications from the factory floor to the enterprise. These tools bring more consistency and flexibility into operations and can directly integrate with cloud services. They also allow your applications and dependencies to be packaged into lightweight containers, making them easily deployable on cloud and edge infrastructures.

    Cloud-to-edge native management tools unify processes and data management across a company’s cloud, on-site, and edge systems. This can simplify daily operations and make data more visible and actionable while improving system resilience and security.

    ◉ Leveraging cloud scalability for operational agility: Inherent cloud scalability is an accelerator for digital operations because it allows you to consistently deliver modern operational services to any geography and at any service level.

    With cloud service flexibility, adding new functionalities does not impact the performance of existing applications, and expanding the delivery of capabilities across regions is frictionless and independent of scale.

    Additionally, the cloud offers AI model development and training, as well as other services that were previously only available with on-premises computing. It can support model training and deployment to the edge and dramatically reduce data-transfer requirements. Once trained in the cloud, AI models can be deployed on the edge, where low latency and speed are critical in modern manufacturing processes.

    ◉ Integrating IT and OT systems and data: IT and OT systems integration is an accelerator for digital operations because it allows you to feed modern applications with critical data needed to build effective digital feedback loops for operations.

    An effective, well-designed digital feedback loop needs to enable data visibility and utilization across IT and OT systems to deliver better productivity and more uptime. Integrating data from both can help organizations to make more informed decisions by combining data from different sources, such as customer data from IT systems and production data from OT systems. It also helps organizations to improve productivity and uptime by enabling real-time monitoring and control of industrial processes. Rockwell Automation creates technology—from sensors to input and output control systems and operations software—that is vital in managing OT data at the edge and, with help from Microsoft platforms, integrating it with IT data.

    When building solutions that integrate OT and IT data, it’s critical to include data and system security with Zero Trust as a key design principle. Access to operational systems, from networks to controllers, is typically granted based on implied trust. But this excess trust can be used by threat actors to breach networks. A Zero Trust approach improves cybersecurity by assuming no implicit trust and strengthening perimeters around business-critical assets. This includes securing OT data in transit and at rest, which ensures data confidentiality and integrity and certifies user identity.

    How manufacturers are working with Microsoft and Rockwell Automation


    Microsoft and Rockwell Automation are jointly delivering edge-to-cloud products that help you manage and make sense of the data in your factory. The result is actionable insights that can improve your business operations.

    ◉ FactoryTalk InnovationSuite, powered by PTC, is a comprehensive set of tools that includes edge-to-enterprise analytics, machine learning, industrial internet of things (IIoT), and augmented reality. It is designed using the cloud-to-edge principle for building digital operations solutions, leverages cloud-native methods, takes advantage of IT and OT data integration, and is infinitely scalable when deployed on the Microsoft Azure cloud.

    ◉ FactoryTalk Hub is a cloud-native approach to digital manufacturing operations, featuring tools under Design Hub, Operations Hub, and Maintenance Hub. Each hub consists of multiple solutions, born in the cloud and scalable for quick-time-to-value for manufacturers, original equipment manufacturers (OEMs), and system integrators.

These solutions are helping Fonterra, a New Zealand-based dairy products co-operative owned by 9,000 farmers and families, which is responsible for approximately 30 percent of the world’s dairy exports. Across 10 plants, Rockwell Automation-based solutions are capturing overall equipment and site-level efficiency data, which is then analyzed, summarized, and displayed locally. The data is also delivered to Microsoft Azure-based systems, which turn the data into insightful production reports that can be securely accessed across the company.

    Source: microsoft.com

    Tuesday 14 March 2023

    Azure previews powerful and scalable virtual machine series to accelerate generative AI


    Delivering on the promise of advanced AI for our customers requires supercomputing infrastructure, services, and expertise to address the exponentially increasing size and complexity of the latest models. At Microsoft, we are meeting this challenge by applying a decade of experience in supercomputing and supporting the largest AI training workloads to create AI infrastructure capable of massive performance at scale. The Microsoft Azure cloud, and specifically our graphics processing unit (GPU) accelerated virtual machines (VMs), provide the foundation for many generative AI advancements from both Microsoft and our customers.

    “Co-designing supercomputers with Azure has been crucial for scaling our demanding AI training needs, making our research and alignment work on systems like ChatGPT possible.”—Greg Brockman, President and Co-Founder of OpenAI. 

    Azure's most powerful and massively scalable AI virtual machine series


Today, Microsoft is introducing the ND H100 v5 VM series, available on demand in sizes ranging from eight to thousands of NVIDIA H100 GPUs interconnected by NVIDIA Quantum-2 InfiniBand networking. Customers will see significantly faster performance for AI models than on our last-generation ND A100 v4 VMs, thanks to innovative technologies like:

    ◉ 8x NVIDIA H100 Tensor Core GPUs interconnected via next gen NVSwitch and NVLink 4.0
◉ 400 Gb/s NVIDIA Quantum-2 CX7 InfiniBand per GPU, for 3.2 Tb/s per VM (8 GPUs × 400 Gb/s) in a non-blocking fat-tree network
    ◉ NVSwitch and NVLink 4.0 with 3.6TB/s bisectional bandwidth between 8 local GPUs within each VM
    ◉ 4th Gen Intel Xeon Scalable processors
◉ PCIe Gen5 host-to-GPU interconnect with 64 GB/s bandwidth per GPU
    ◉ 16 Channels of 4800MHz DDR5 DIMMs
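
For readers who want to check where the series is offered once it reaches their subscription, SKU availability can be listed with the Azure SDK for Python. The exact size name below is an assumption for illustration.

```python
# Sketch: list regions that expose a given ND H100 v5 size to this subscription.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

client = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

SIZE = "Standard_ND96isr_H100_v5"  # hypothetical size name for this series
for sku in client.resource_skus.list():
    if sku.name == SIZE:
        print(sku.name, sku.locations)
```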

    Delivering exascale AI supercomputers to the cloud


    Generative AI applications are rapidly evolving and adding unique value across nearly every industry. From reinventing search with a new AI-powered Microsoft Bing and Edge to AI-powered assistance in Microsoft Dynamics 365, AI is rapidly becoming a pervasive component of software and how we interact with it, and our AI Infrastructure will be there to pave the way. With our experience of delivering multiple-ExaOP supercomputers to Azure customers around the world, customers can trust that they can achieve true supercomputer performance with our infrastructure. For Microsoft and organizations like Inflection, NVIDIA, and OpenAI that have committed to large-scale deployments, this offering will enable a new class of large-scale AI models.

    "Our focus on conversational AI requires us to develop and train some of the most complex large language models. Azure's AI infrastructure provides us with the necessary performance to efficiently process these models reliably at a huge scale. We are thrilled about the new VMs on Azure and the increased performance they will bring to our AI development efforts."—Mustafa Suleyman, CEO, Inflection.

    AI at scale is built into Azure’s DNA. Our initial investments in large language model research, like Turing, and engineering milestones such as building the first AI supercomputer in the cloud prepared us for the moment when generative artificial intelligence became possible. Azure services like Azure Machine Learning make our AI supercomputer accessible to customers for model training and Azure OpenAI Service enables customers to tap into the power of large-scale generative AI models. Scale has always been our north star to optimize Azure for AI. We’re now bringing supercomputing capabilities to startups and companies of all sizes, without requiring the capital for massive physical hardware or software investments.

    “NVIDIA and Microsoft Azure have collaborated through multiple generations of products to bring leading AI innovations to enterprises around the world. The NDv5 H100 virtual machines will help power a new era of generative AI applications and services.”—Ian Buck, Vice President of hyperscale and high-performance computing at NVIDIA. 

    Today we are announcing that ND H100 v5 is available for preview and will become a standard offering in the Azure portfolio, allowing anyone to unlock the potential of AI at Scale in the cloud.

    Source: azure.microsoft.com

    Saturday 11 March 2023

    3 Microsoft Azure AI product features that accelerate language learning


The Microsoft Azure Cognitive Speech Services platform is a comprehensive collection of technologies and services aimed at accelerating the incorporation of speech into applications and amplifying differentiation in the market as a result. Among the services available are Speech to Text, Text to Speech, Custom Neural Voice (CNV), Conversation Transcription Service, Speaker Recognition, Speech Translation, the Speech SDK, and the Speech Device Development Kit (DDK).

    AI for education is an emerging technology that has the potential to revolutionize the way we teach and learn languages. One of the most important aspects of language learning is the ability to pronounce words accurately, and this is where Azure Cognitive Speech Service's new Pronunciation Assessment feature comes in. Another key opportunity is the development of synthetic bilingual voices for language learning experiences with Custom Neural Voice, in addition to our speech-to-text capabilities.

    1. Pronunciation Assessment


    The new feature is designed to provide instant feedback to users on the accuracy, fluency, and prosody of their speech when learning a new language. The service utilizes Azure Neural Text-to-Speech and Transformer models, along with ordinal regression and a hierarchical structure, to improve the accuracy of word-level assessment. The service is currently available in more than 10 languages, including American English, British English, Australian English, French, Spanish, and Chinese, with additional languages in preview.

    The Pronunciation Assessment feature offers several benefits for educators, service providers, and students:

    ◉ For educators, it provides instant feedback, eliminates the need for time-consuming oral language assessments, and offers consistent and comprehensive assessments.

◉ For service providers, it offers strong real-time capabilities and a worldwide speech service, supporting growing global businesses.

    ◉ For students and learners, it provides a convenient way to practice and receive feedback, authoritative scoring to compare with native pronunciation, and helps to follow the exact text order for long sentences or full documents.

    Pronunciation Assessment is a powerful tool for language learning and teaching. By leveraging AI technologies such as TTS, Transformer, and Ordinal Regression, it provides instant and accurate feedback on speech pronunciation. With its wide range of supported languages and its ability to work with low-resource locales, it offers language learners of all backgrounds the opportunity to improve their language skills. With Pronunciation Assessment, educators can offer a more engaging and accessible learning experience, service providers can improve education customers' productivity, and students can practice more conveniently anywhere and anytime.
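
To make the feedback loop concrete, here is a minimal sketch using the Speech SDK for Python (azure-cognitiveservices-speech); the key, region, reference text, and audio file are placeholders.

```python
# Sketch: score a learner's recorded reading against a reference sentence.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="<key>", region="<region>")
audio_config = speechsdk.audio.AudioConfig(filename="learner.wav")

pron_config = speechsdk.PronunciationAssessmentConfig(
    reference_text="Today was a beautiful day.",
    grading_system=speechsdk.PronunciationAssessmentGradingSystem.HundredMark,
    granularity=speechsdk.PronunciationAssessmentGranularity.Phoneme,
)

recognizer = speechsdk.SpeechRecognizer(
    speech_config=speech_config, audio_config=audio_config
)
pron_config.apply_to(recognizer)

result = recognizer.recognize_once()
assessment = speechsdk.PronunciationAssessmentResult(result)
print("accuracy:", assessment.accuracy_score)
print("fluency:", assessment.fluency_score)
print("completeness:", assessment.completeness_score)
```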

At the Microsoft Reimagine Education event on February 9, 2023, we announced several new features to support student success. Pronunciation assessment is used in Reading Coach in Immersive Reader and in Speaker Progress in Microsoft Teams. It can be used inside and outside the classroom to save teachers time and improve students’ reading fluency outcomes, and it is accessible to all learners.

    2. Speech-to-Text


Teachers and language learners naturally mix their native language and the language they are learning during a lesson. Azure Speech to Text supports real-time language identification for multilingual language learning scenarios, helping human-to-human interaction with better understanding and readable context.

    The latest multilingual modeling technology and transfer learning techniques were used to develop new speech-to-text (STT) languages based on vast amounts of data. These models have been trained in acoustics and language knowledge across different languages, and can handle both dictation and conversation in a variety of language domains. The output includes Inverse Text Normalization (ITN), capitalization (when appropriate), and automatic punctuation to enhance readability. Developers can easily integrate these languages into their projects using either a real-time streaming application programming interface (API) or batch transcription. The benefits of using a unified model across all languages will be immediately apparent.
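
Here is a sketch of the multilingual scenario with the same Speech SDK; the candidate locales below are illustrative.

```python
# Sketch: recognize speech while auto-detecting which language was spoken.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="<key>", region="<region>")
auto_detect = speechsdk.languageconfig.AutoDetectSourceLanguageConfig(
    languages=["en-US", "es-ES"]  # illustrative native/learning language pair
)
audio_config = speechsdk.audio.AudioConfig(use_default_microphone=True)

recognizer = speechsdk.SpeechRecognizer(
    speech_config=speech_config,
    auto_detect_source_language_config=auto_detect,
    audio_config=audio_config,
)

result = recognizer.recognize_once()
detected = speechsdk.AutoDetectSourceLanguageResult(result).language
print(f"[{detected}] {result.text}")
```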

    3. Prebuilt and Custom Neural Voice (CNV)


Neural voice (Text to Speech) can read learning materials aloud natively and empower self-serve learning anytime, anywhere. Microsoft Azure AI provides more than 449 prebuilt neural voices across 147 languages and variants to enable scenarios such as AI teachers, content read-aloud capabilities, and more.
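
For example, read-aloud takes only a few lines with the Speech SDK for Python; the voice name below is one example from the prebuilt catalog.

```python
# Sketch: read a line of learning material aloud with a prebuilt neural voice.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="<key>", region="<region>")
speech_config.speech_synthesis_voice_name = "en-US-JennyNeural"

synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)
result = synthesizer.speak_text_async("Let's practice today's vocabulary.").get()
if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted:
    print("Audio played on the default speaker.")
```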

    Custom Neural Voice (CNV) is a feature offered by Azure AI that enables users to create a unique, customized, synthetic voice for their applications. This feature uses human speech samples as training data to generate a highly natural-sounding voice for a brand or characters. Education companies are using this technology to personalize language learning, by creating unique characters with distinct voices that match the culture and background of their target audience. For example, Duolingo used Custom Neural Voice to help bring nine new characters to life within the language learning platform, and Pearson used it to improve pronunciation assessment. CNV is based on neural text-to-speech technology and allows users to create synthetic voices that are rich in speaking styles, cross languages, and adaptable. The realistic and natural-sounding voice is great for representing brands and personifying machines for conversational interactions with users.

    Customer Inspiration


    As technology continues to advance, it's becoming increasingly clear that the future of education lies in the integration of AI. Azure AI is at the forefront of this revolution, providing education companies with powerful tools to improve the learning experience and drive student engagement and achievement. We are inspired by five customers in the education space:

    1. Pearson: The company wanted to use AI to deliver better services to students and empower teachers with highly accurate assessments, using Azure to develop AI-based services for language learners. They adopted new Microsoft algorithms and a leading-edge pronunciation assessment feature, which is a part of the Speech to Text capability.

    2. Beijing Hongdandan Visually Impaired Service Center: The organization is working with Microsoft and a team of volunteers to generate AI audio content, which will be used to improve resources for people who are blind or have low vision. They used Azure Custom Neural Voice, a text-to-speech tool that allows users to create custom voice fonts, to generate the audio content.

3. Duolingo: The language learning company is using Custom Neural Voice to personalize language learning by introducing a cast of characters within the platform. Duolingo went through hundreds of iterations of characters, aiming for them to reflect the cultures of its user base around the world while aligning visually with the app's longstanding main character. The company used Custom Neural Voice on Azure to bring nine new characters to life within the language learning platform.

    4. HelloTalk: The innovative mobile app provides an enjoyable and effortless way to learn a new language by connecting users with native speakers from around the world. With its intuitive language tools, including its Pronunciation Assessment feature, and community features, it enables users to practice and immerse themselves in the culture of their target language, improve their pronunciation, and make new friends in the process.

5. Berlitz: The global leadership and language training company provides language learning products that use Azure speech recognition and pronunciation assessment. Through these innovative tools, learners instantly receive detailed feedback on the accuracy and fluency of their speech in the new language. This gives Berlitz learners the flexibility to practice and perfect their pronunciation anywhere, anytime, before speaking with native speakers in English, German, Spanish, and more.

    The future impact of AI in education


    The integration of AI, specifically speech services, into the education sector is becoming increasingly important as it can greatly enhance the learning experience and improve the effectiveness of teaching. Speech services such as Azure Pronunciation Assessment and Custom Neural Voice provide personalization, automation, and analytics in education platforms, which can lead to better student engagement and achievement. These services also enable educators to provide instant feedback on speech accuracy, fluency, and completeness which helps language learners to improve their pronunciation and fluency. With the ability to assess pronunciation in real-time, AI-powered speech services can help make the language assessment more engaging and accessible to learners of all backgrounds. Additionally, these services can also help with personalization of the learning experience for each student by providing personalized feedback and recommendations based on individual student needs. The integration of AI into the education sector can help educators empower students, and help students achieve their full potential.

    Source: microsoft.com

    Thursday 9 March 2023

    Discover the latest innovations at Azure Open Source Day 2023


    Microsoft has always been a developer-first organization, and we strive to make tools and platforms that serve developers. Open Source on Azure gives developers the freedom to build next-generation intelligent applications where they want, and when they want. Thanks to the power of open source, developers can now innovate faster and more securely on Azure. Microsoft helps you write more secure code, respond quickly to vulnerabilities in your software supply chain, and adopt the latest best practices to secure your development environments—all with a commitment to open source and support for all languages and frameworks.

By combining the most beloved coding and collaboration tools—Linux, GitHub, and Visual Studio Code—with the Azure platform, the open-source software in the Azure ecosystem aids developer productivity with best-in-class tools for code-to-cloud development.

Azure Open Source Day highlights Microsoft’s commitment to open source and focuses on how open-source technologies can be used to build intelligent apps faster and with more flexibility.

    Opening panel: How open source and AI impact software development in the cloud


    We are kicking off our show with a panel of thought leaders from GitHub, HashiCorp, Microsoft, and Redis to discuss how open source continues to evolve software development, its impact on the software supply chain and security, and how new AI capabilities may shape the future.

    How Open Source on Azure builds an intelligent app


    Today, we are excited to showcase a brand-new, intelligent, cloud-native application that connects owners with their lost pets using fine-tuned machine learning. Instead of printing posters, owners train an advanced image classification model, fine-tuned on the photos in their camera roll. When a lost pet is found, anyone can instantly snap a photo, match it against the trained model, and be connected to the owner.

    We are leveraging open source throughout to make sure the application is built on the latest and greatest technologies.

    The app's frontend is a .NET Blazor app, with a Python backend. The frontend and backend communicate using the Distributed Application Runtime (Dapr), which provides application programming interfaces (APIs) that simplify microservice connectivity. The backend uses a pre-built vision model from Hugging Face, fine-tuned through Azure Machine Learning for model training and prediction. The whole app is deployed using Bicep templates and runs on Azure Kubernetes Service. Kubernetes Event-driven Autoscaling (KEDA) provides autoscaling based on the number of messages being sent through Dapr.
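
    As a rough illustration of the Dapr plumbing, here is a minimal sketch of a frontend-to-backend call going through the Dapr sidecar's HTTP service-invocation API; the app ID petmatcher and the classify method are hypothetical names, not the demo's actual identifiers.

        import requests

        DAPR_PORT = 3500  # default Dapr sidecar HTTP port
        url = f"http://localhost:{DAPR_PORT}/v1.0/invoke/petmatcher/method/classify"

        # Dapr routes the request to the service registered as "petmatcher",
        # handling service discovery and retries between the microservices.
        response = requests.post(url, json={"image_url": "https://example.com/pet.jpg"})
        print(response.json())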

    The app’s data layer was built with Azure Cosmos DB and takes advantage of the autoscale feature that matches database capacity with traffic demands. So as the app grows, the database automatically grows with it. With autoscale, the risk of rate-limiting and over-provisioning is eliminated, app performance is maintained, and the developer doesn’t have to monitor and manage database capacity manually. Developers using MySQL will soon enjoy similar benefits, with the general availability of IOPS Autoscaling in Azure Database for MySQL—flexible server, Business Critical tier coming next month. This feature will save time and money by resizing input/output operations per second (IOPS) depending on workload needs. We’ve also made it easier to develop low-code apps with MySQL data and visualize MySQL data with the integrations of Power Apps and Power BI Desktop with Azure Database for MySQL—flexible server, now in public preview and general availability respectively.
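
    As an illustration, here is a minimal sketch, assuming a recent version of the azure-cosmos Python SDK, of provisioning a container with autoscale throughput as the data layer above does; the account URL, key, and names are placeholders.

        from azure.cosmos import CosmosClient, PartitionKey, ThroughputProperties

        client = CosmosClient("https://<account>.documents.azure.com:443/", "<key>")
        db = client.create_database_if_not_exists("petsdb")

        # Autoscale throughput scales between 10 percent of the maximum and the
        # maximum (here 400-4,000 RU/s), so capacity follows traffic automatically.
        container = db.create_container_if_not_exists(
            id="sightings",
            partition_key=PartitionKey(path="/petId"),
            offer_throughput=ThroughputProperties(auto_scale_max_throughput=4000),
        )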

    Developers using Azure Cache for Redis Enterprise can now use the RedisJSON module on active, geo-replicated caches. Currently in preview, this feature simplifies development and minimizes downtime by enabling a single operation to read, write, and store JSON documents while simultaneously syncing data across all active regions.
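
    As a sketch of that single-operation JSON handling, assuming the redis-py client against an Azure Cache for Redis Enterprise endpoint with RedisJSON enabled; the host, key, and document are placeholders.

        import redis

        r = redis.Redis(host="<cache-name>.<region>.redisenterprise.cache.azure.net",
                        port=10000, password="<access-key>", ssl=True)

        # One operation writes the JSON document; with active geo-replication the
        # write is synced across all active regions.
        r.json().set("pet:42", "$", {"name": "Rex", "status": "lost"})
        print(r.json().get("pet:42", "$.status"))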

    PostgreSQL users can now apply enhanced security to their databases, with the general availability of Microsoft Azure Active Directory and customer-managed keys in Azure Database for PostgreSQL—flexible server. Database user identities and access, along with encryption key permissions and life cycles, can now be centrally managed to make it easier to build secure apps.
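
    A minimal sketch, assuming the azure-identity and psycopg2 packages, of what Azure Active Directory authentication to a flexible server can look like; the server, user, and database names are placeholders.

        import psycopg2
        from azure.identity import DefaultAzureCredential

        # Acquire an Azure AD access token scoped to Azure Database for PostgreSQL;
        # it is presented in place of a password.
        token = DefaultAzureCredential().get_token(
            "https://ossrdbms-aad.database.windows.net/.default")

        conn = psycopg2.connect(
            host="<server>.postgres.database.azure.com",
            user="<aad-user>",
            password=token.token,
            dbname="postgres",
            sslmode="require",
        )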

    Compute scaling in the demo is provided by Azure Virtual Machine Scale Sets (VMSS) Flex, which deploys GitHub Actions self-hosted runners on new Arm-based virtual machines. VMSS Flex allows you to easily manage and mix different virtual machine sizes and SKUs, including both Spot and standard virtual machines. Recent additions to the Azure portfolio include next-generation burstable Arm-based Bpsv2 virtual machines, which provide a low-cost option for workloads that typically run at a low to moderate baseline central processing unit (CPU) utilization, and Intel-based Dlsv5 virtual machines that can deliver up to 30 percent increased performance and better price-performance than the Fsv2 virtual machines. Both virtual machine series feature broad support for Linux distributions.
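
    As a rough sketch, assuming the azure-mgmt-compute package, of creating a VMSS Flex scale set like the one hosting the runners; the names, region, and fault-domain count are illustrative, and individual VMs of mixed sizes (for example, Arm-based Bpsv2) can then be created into the scale set.

        from azure.identity import DefaultAzureCredential
        from azure.mgmt.compute import ComputeManagementClient

        compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

        # Flexible orchestration lets the scale set mix VM sizes and SKUs,
        # including both Spot and standard virtual machines.
        poller = compute.virtual_machine_scale_sets.begin_create_or_update(
            "demo-rg",
            "runners-vmss",
            {
                "location": "eastus",
                "orchestration_mode": "Flexible",
                "platform_fault_domain_count": 1,
            },
        )
        print(poller.result().provisioning_state)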

    The app uses a pre-trained vision transformer model obtained from Hugging Face for image classification tasks. Developers and data scientists can now use foundation models in Azure Machine Learning to easily start their data science work and fine-tune and deploy foundation models from Hugging Face using Azure Machine Learning components and pipelines. This feature, currently in preview, provides organizations with a comprehensive repository of popular large AI models from Hugging Face through the built-in Azure Machine Learning registries, supporting tasks such as classification, summarization, question answering, and translation. It simplifies data pre-processing and the adaptation of model training scripts, freeing data scientists from the overhead of setting up and managing the underlying infrastructure and environment dependencies.
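
    For a sense of the starting point, here is a minimal sketch, assuming the Hugging Face transformers package, of loading a pre-trained vision transformer for image classification; the model ID is a common public checkpoint, not necessarily the one used in the demo.

        from transformers import pipeline

        # Downloads a pre-trained vision transformer (ViT) checkpoint and wraps it
        # in an image-classification pipeline.
        classifier = pipeline("image-classification",
                              model="google/vit-base-patch16-224")

        # Returns the top predicted labels with confidence scores for the image.
        print(classifier("https://example.com/pet.jpg"))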

    Unleashing the AI technology wave: Training large language models at scale


    AI is changing every industry and is top of mind for developers. Most companies have leveraged AI to improve efficiency and reduce costs. Large AI applications leveraging natural language processing (NLP), automatic speech recognition (ASR), and text-to-speech (TTS) are becoming prevalent, but what powers these applications is underlying infrastructure optimized for large AI workloads. As mentioned in the post announcing the general availability of Microsoft Azure OpenAI Service, Azure is the best place to build AI workloads. This session highlights the partnership between Microsoft and NVIDIA and how Azure’s AI infrastructure and Azure Machine Learning were built for speed.

    Azure NDm A100 v4-series virtual machines are Azure’s flagship graphics processing unit (GPU) offering and were used to run NVIDIA's new NeMo Megatron framework and test the limits of the series. Microsoft ran a 530B-parameter benchmark on 175 virtual machines, resulting in a training time per step as low as 55.7 seconds. The benchmark gauges compute efficiency and how it scales by measuring the time taken per step to train the model after a steady state is reached, with a mini-batch size of one. InfiniBand HDR provided superior communication between nodes without increased latency and was critical to this level of performance.

    The open source future with Web3


    Azure Open Source Day will conclude with a fascinating fireside chat between Kathleen Mitford, CVP of Azure Marketing, and Donovan Brown, Partner Program Manager for Azure Open Source Incubations, on the open source future with Web3. The open and decentralized nature of Web3 can be a natural fit with the open source philosophy, making this an exciting and developing space for new innovations.

    Web3 refers to the next evolution of the internet, one that may be more decentralized. It is built on blockchain, a distributed ledger technology that enables a secure and transparent way to transfer and store digital assets. Microsoft has first-party solutions and a rich partner ecosystem to help you build with Web3.

    DevOps best practices are just as important, if not more important, in the Web3 world as they are in the Web2 world. Azure has the key tools a developer needs, from Azure Static Web Apps, purpose-built for your app, to GitHub Actions, Azure DevOps, and Visual Studio Code. In the future, many organizations may build solutions with a combination of Web3 and Web2 working together. Go deeper into Web3 with Donovan Brown’s 10-part blog series on how to build a DevOps pipeline for the Ethereum blockchain.

    Also, with the power of AI, you can ask ChatGPT to create a "hello world" sample in any language and copy and paste the code into your project. When you go to modify the code, GitHub Copilot can help you make the changes using all the best practices. You can do all of this inside a GitHub Codespace configured with all your favorite tools, frameworks, and Visual Studio Code extensions installed. Then, you can use Azure DevOps or GitHub Actions to deploy the application to Azure. If you choose to build a Web3 application, Microsoft has the best tools and cloud services to support you.

    Source: microsoft.com

    Tuesday 7 March 2023

    Azure WAF guided investigation Notebook using Microsoft Sentinel for automated false positive tuning

    The SQL injection attack remains one of the critical attacks in the OWASP Top 10; it involves injecting a SQL query through an input data field into a web application that lacks input validation. According to the Microsoft Digital Defense Report 2022, 67 percent of web application exploits include SQL injections.
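
    As a brief illustration of the attack class, and of the standard application-side fix that complements a WAF, here is a minimal sketch; the table and inputs are hypothetical.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
        conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

        user_input = "' OR '1'='1"  # a classic injection payload

        # VULNERABLE: concatenating input into the SQL string lets the payload
        # rewrite the query, so this returns every row in the table.
        rows = conn.execute(
            f"SELECT * FROM users WHERE name = '{user_input}'").fetchall()

        # SAFE: a parameterized query treats the input strictly as data,
        # so this returns no rows.
        rows = conn.execute(
            "SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()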

    Azure Web Application Firewall (Azure WAF) provides centralized protection of your web applications from exploits and vulnerabilities. It protects against OWASP Top 10 attacks, bot attacks, application layer Distributed Denial of Service (DDoS) attacks, and other web attacks.


    Azure WAF detects SQL injection attacks and blocks them by default. In certain instances, a detection may be a false positive that requires investigation and the creation of Azure WAF exclusions. To complete a successful investigation, you need full context about the attack and a process that guides you through it.

    We are pleased to announce a new Azure WAF guided investigation to tune WAF policy Notebook in preview. It guides you through an investigation experience to understand Azure WAF incidents in Microsoft Sentinel, identify false positives, and automatically apply exclusions to WAF rules to address them. The Notebook lets you understand the WAF alert, pivot on key entities of the WAF event such as the request URI, client IP, and hostname, and correlate with Threat Intelligence feeds to get a holistic view of the attack surface.

    Azure WAF investigations powered by Microsoft Sentinel


    Azure WAF is deeply integrated with Microsoft Sentinel, Microsoft’s Security Information and Event Management (SIEM) solution. Using the existing Azure WAF data connector, WAF logs are ingested and then analyzed for a variety of web application attacks, and powerful visualizations pivoting on the full attack pattern are presented to you. The Notebook is built using Microsoft Threat Intelligence Center’s MSTICpy packages. With it, you can access rich historical contextual information using Microsoft Sentinel’s capabilities, such as incident generation, the entity graph, and threat intelligence correlation, in conjunction with Azure WAF’s SQL injection detections based on OWASP rules and Microsoft Threat Intelligence rules.
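
    As an illustration of the kind of log pivot the Notebook performs, here is a minimal sketch, assuming the azure-monitor-query package; the workspace ID is a placeholder, and the table and column names reflect a common Azure Front Door WAF log schema that may differ per deployment.

        from datetime import timedelta
        from azure.identity import DefaultAzureCredential
        from azure.monitor.query import LogsQueryClient

        client = LogsQueryClient(DefaultAzureCredential())

        # Pull the last 24 hours of blocked SQL injection events to pivot on
        # client IP and request URI.
        kql = """
        AzureDiagnostics
        | where Category == "FrontdoorWebApplicationFirewallLog"
        | where action_s == "Block" and ruleName_s contains "SQLI"
        | project TimeGenerated, clientIP_s, requestUri_s, ruleName_s
        """
        result = client.query_workspace("<workspace-id>", kql,
                                        timespan=timedelta(days=1))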

    Automated investigation and mitigation of web application attacks


    Our new Azure WAF guided investigation to tune WAF policy Notebook provides an automated, guided flow for triaging Microsoft Sentinel incidents triggered by Azure WAF SQL injection rules.

    The solution includes the following components:

    ◉ Azure WAF data connector in Microsoft Sentinel.
    ◉ Microsoft Sentinel incidents that are generated when a SQL injection attack is detected by the Microsoft Sentinel analytic rules.
    ◉ Azure WAF Notebook that helps investigate Azure WAF logs and automatically applies WAF exclusions to the WAF policy.

    A high-level diagram explaining the data flow is given below:


    Let us look at two use case scenarios for using this Notebook:

    Understanding the attack landscape when there is a true positive


    Using the Notebook, you can pivot on various attack artifacts such as IP, URL, or domain threat intelligence, and understand the entity graph. The Notebook retrieves the WAF SQLi rule that generated the detection and looks up related SQLi rule events within the pre-selected time range. Based on these details, if you decide that the SQL injection attack is valid, you can update the incident severity and priority. In this scenario, the web application remains protected by Azure WAF.


    Understanding the attack pattern and creating exclusions when there is a false positive


    Using the Notebook, you can pivot on various attack artifacts such as IP, URL, or domain threat intelligence, and understand the entity graph. The Notebook retrieves the WAF SQLi rule that generated the detection and looks up related rule events. It also retrieves raw WAF logs to understand the relationships between the request URI, client IP, and hostname entities, and lets you dynamically access the OWASP rule set on GitHub to understand the rule match pattern. Based on the investigation, if you decide the incident is a false positive, the process to automatically create granular exclusions is presented to you, and the exclusions are applied to the Azure WAF policy using Azure WAF APIs.
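
    For a sense of what that last step can look like, here is a minimal sketch of applying a narrowly scoped exclusion to a Front Door WAF policy through the Azure Resource Manager REST API; the resource IDs, api-version, and exclusion values are illustrative assumptions, not the Notebook's actual code.

        import requests
        from azure.identity import DefaultAzureCredential

        token = DefaultAzureCredential().get_token(
            "https://management.azure.com/.default").token
        headers = {"Authorization": f"Bearer {token}"}

        policy_id = ("/subscriptions/<sub>/resourceGroups/<rg>/providers/"
                     "Microsoft.Network/FrontDoorWebApplicationFirewallPolicies/<policy>")
        url = f"https://management.azure.com{policy_id}?api-version=2022-05-01"

        # Fetch the current policy, append a granular exclusion to the managed
        # rule set, and write the updated policy back.
        policy = requests.get(url, headers=headers).json()
        policy["properties"]["managedRules"]["managedRuleSets"][0].setdefault(
            "exclusions", []).append({
                "matchVariable": "QueryStringArgNames",
                "selectorMatchOperator": "Equals",
                "selector": "destination",  # hypothetical query parameter to exclude
            })
        requests.put(url, headers=headers, json=policy)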


    The following personas would benefit from this Notebook:

    Persona: Developer at SomeUnionFlight.com


    Understanding SQL injection detection logic

    Chris is a developer at SomeUnionFlight.com. His company hosts a website for users to search for flights and make flight reservations. They have placed the website behind WAF with Azure Front Door (AFD), where AFD accepts user requests to search the site. SomeUnionFlight.com has a SQL backend where flight information is stored. Chris notices that when users try to access the website, their access is blocked because the URL contains the keyword “Union,” which triggers the SQL injection rule. This detection is considered a false positive because the “Union” keyword refers to the website name, not a SQL injection attack. He would like an investigation experience that helps him analyze this detection using Microsoft Sentinel and determine whether it is a false positive. He would also like to automatically create exclusions for false positives on the URL without having to disable the entire rule.

    Persona: SecOps analyst at Contoso.com


    Understanding collateral attack vectors

    Ashley is a Security Operations analyst at Contoso.com. Her company has purchased both Azure WAF and Microsoft Sentinel. She is responsible for analyzing WAF logs and identifying attack patterns. She would like to understand whether the client IP or the request URI associated with the WAF rule that triggered the SQL injection detection are Indicators of Compromise (IoCs). By understanding related threat intelligence IoCs, she can prevent future attacks on her organization.

    Get started today


    SQL injection attacks are becoming more prevalent by the day, and Azure WAF protects web applications from these attacks. To enable a high-quality investigation experience for Azure WAF customers, we have created this new Azure WAF guided investigation Notebook, which enables you to quickly understand the full attack surface and take action on incidents.

    This new Azure WAF Notebook can be found in Microsoft Sentinel under Notebooks in the Threat Management section.


    Source: microsoft.com