Tuesday 28 February 2023

Agile teams align and get to market faster with Mural and Microsoft

During the last two years, navigating changing economic climates and a global pandemic that shifted the way we work, we’ve learned that teams can continue to collaborate productively and effectively in remote and hybrid settings. A recent Microsoft study shows that hybrid work works. The biggest risk of a hybrid workforce is not unproductive team members; it’s that people are burning out from too many meetings and a lack of clear, defined priorities.

But it takes intention and leadership to design a company culture where collaboration workflows let everyone contribute equally in retrospectives, standups, planning, and other agile projects where bold ideas and clear alignment are critical to staying competitive in the market. Now, more than ever, agile teams, and any team focused on planning and strategizing, are expected to seamlessly connect and collaborate regardless of where everyone is working.

Microsoft, in partnership with Mural’s leading collaborative intelligence system, is filling a critical need. Powered by Microsoft Azure and integrated with Microsoft Teams, Mural enables dynamic meetings and teamwork, turning ideas and insights into reality so that remote work no longer inhibits co-creation. Mural's latest integration with the Microsoft ecosystem connects Azure DevOps to an intuitive workspace for teams to collaborate visually. Mural centers around a digital whiteboard and includes pre-built frameworks that guide and structure collaborative work, including templates for agile ceremonies and LUMA design thinking practices. Facilitation tools built into the product, like voting and a timer, make it intuitive for teams to work together intentionally and make it easier for everyone to contribute. Mural’s new integration with Azure DevOps is the connective tissue for agile teams, no matter where they’re working.

Level up your team’s next collaboration session with Mural and Azure DevOps


With Mural’s new integration, Mural and Azure DevOps customers can now save time and jumpstart collaboration by easily importing work items as sticky notes onto a Mural canvas. Teams can use Mural’s visual canvas and templates to refine work items in a backlog, review a sprint plan, break down a feature into smaller chunks of work, have a retrospective, or even build a program board. Having work items on the canvas where collaboration is happening keeps workflows connected and teams focused without incurring a toggle tax by switching between applications.

93 percent of surveyed organizations that leverage agile experienced an increase in collaboration and productivity following the implementation of Mural.

Everything from user story mapping to retrospectives is more productive, inclusive, and engaging when you connect Mural and Azure DevOps.

How Mural and Azure DevOps work


Connect Azure DevOps to your Mural account to view your projects and easily import work items as sticky notes. Use filters or run your own query to find the work items you want to import.


Design your import in Mural by choosing which fields, including custom fields, to import as tags on your sticky notes. Stickies are automatically color-coded by work item type to visualize and organize next steps and prevent delays in decision making.


Be even more effective with Mural’s templates for hybrid agile ceremonies like program increment (PI) planning, sprint planning, sprint reviews, and more to level up collaboration and reduce time to market.


Partners in collaboration transformation


Mural is a Microsoft Partner of the Year award winner, which recognizes partners based on their commitment to customers, the impact of their solutions, and their exemplary use of Microsoft technologies. We are thrilled to collaborate with Mural, an ISV building the next generation of solutions with Azure technology, to help teams everywhere plan smarter, collaborate better, and ship faster. Integration with Azure DevOps improves agility and can optimize your existing cloud investments by helping decrease costs and save developer time. With comprehensive security and compliance built in, Mural is paving the way for organizations to adopt a DevOps culture. The Mural integration with Azure DevOps is truly a “better together” story, as both teams recognized early on a need for the integration among end users, DevOps practitioners, and developers. In the early stages of development, Microsoft and Mural not only helped test the integration but also incorporated it into the main development lifecycle of Azure DevOps, quickly realizing the full potential of Mural and Azure DevOps as an all-in-one platform.

“In a hybrid, modern workplace, organizations need to collaborate and brainstorm with virtual teams to develop innovative solutions for their businesses. We’re thrilled to partner with Mural and deliver its latest integration with Azure DevOps to enable teams everywhere to do their best work using a shared collaboration space across platforms.”—Casey McGee, VP of Global ISV Sales and Digital Natives, Microsoft

Source: microsoft.com

Saturday 25 February 2023

Exploring mTLS setup to send a client certificate to the backend and OCSP validation

In our previous blog, we discussed what mutual transport layer security (mTLS) is and what some of its use cases are. In this blog, I want to discuss two of those use cases: first, how to send a client certificate to the backend application server and validate the setup with the curl command; and second, how to set up OCSP validation and verify it with openssl commands.

Insert client certificate as HTTP header


In some cases, backend applications may need the client certificate that is received by Application Gateway. Client certificates can serve different purposes depending on the needs of the backend application. Some backend servers may need client certificate information for audit purposes, or may want to issue a token or cookie based on the client certificate. In that case, we may have to supply the client certificate to the backend. One way to solve this is by supplying the certificate in base64-encoded format within a nonstandard HTTP (Hypertext Transfer Protocol) header. Please note, for security purposes and to prevent header injection, the backend server must accept the custom header only from the trusted Application Gateway. Let’s first discuss how to send the client certificate to the backend application as a custom HTTP header. To achieve that, you can set up a rewrite rule that sends the client certificate as an HTTP header.

Below is the rewrite rule that you can create to send the client certificate to the backend as an HTTP header. Set up the rewrite action as shown below.

[Screenshot: the Create rewrite set pane, showing the values to populate for the rewrite rule.]
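
If you prefer to script the rewrite rule rather than configure it in the portal, below is a minimal Azure PowerShell sketch. The rule set, rule, and header names are illustrative, and {var_client_certificate} is the mutual authentication server variable for Application Gateway rewrite rules; confirm in the documentation that it applies to your SKU and scenario.

$AppGw = Get-AzApplicationGateway -Name "ApplicationGateway01" -ResourceGroupName "ResourceGroup01"

# Write the client certificate into a custom request header (illustrative header name)
$headerConfig = New-AzApplicationGatewayRewriteRuleHeaderConfiguration -HeaderName "X-Client-cert" -HeaderValue "{var_client_certificate}"
$actionSet = New-AzApplicationGatewayRewriteRuleActionSet -RequestHeaderConfiguration $headerConfig
$rule = New-AzApplicationGatewayRewriteRule -Name "ClientCertToHeader" -ActionSet $actionSet

# Attach the rule set to the gateway and commit the change
Add-AzApplicationGatewayRewriteRuleSet -ApplicationGateway $AppGw -Name "ClientCertRuleSet" -RewriteRule $rule
Set-AzApplicationGateway -ApplicationGateway $AppGw

Keep in mind that a rewrite rule set only takes effect once it is associated with a request routing rule.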

Once the rewrite rule is created, you can verify whether the backend server is receiving the client certificate in the HTTP header. To test the setup, the prerequisite is to have openssl and curl installed on your machine. You should also have access to the client certificate and client private key.

Verification steps to check the client certificate in the custom HTTP header:

Capture the client certificate output.

◉ more client.crt

[Screenshot: the client certificate output.]

Run the following command to send a request to Application Gateway:

◉ curl -vk https://<yourdomain.com> --key client.key --cert client.crt

On the backend server, you should see the header you created in the Application Gateway rewrite rule. To observe it, you will have to run a network capturing tool like tcpdump on the backend server.
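
As a minimal sketch, assuming the backend listens on port 80 on interface eth0 (adjust both for your environment), the following capture prints packet payloads in ASCII so the injected header is visible:

◉ sudo tcpdump -i eth0 -A -s 0 'tcp port 80'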

[Screenshot: the client certificate received by the backend.]

Above, you can see the X-Client-cert header that we created in the rewrite rule being received by the backend. This header carries the client certificate that we sent. The backend server can extract this value and use it based on the desired use case.
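
As an illustrative sketch, if you save the base64-encoded header value to a file named cert_b64.txt (a hypothetical file name), you can decode and inspect it as shown below; depending on whether the encoded bytes are the PEM or DER form of the certificate, you may need to add -inform DER to the openssl command:

◉ base64 -d cert_b64.txt > client_from_header.crt
◉ openssl x509 -in client_from_header.crt -noout -subject -issuer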

OCSP


Online Certificate Status Protocol (OCSP) is now supported by Application Gateway. Let’s discuss how to set up OCSP and validate the setup with openssl commands. With OCSP support, you can verify the status of the client certificate in real time. This can prevent man-in-the-middle attacks by ensuring that the certificate being presented is still valid and has not been compromised. You can get more details about OCSP in RFC 2560 and its successor, RFC 6960. The setup is easy. When a client initiates a connection to an Application Gateway configured with mutual TLS authentication, not only can the certificate chain and issuer’s distinguished name be validated, but the revocation status of the client certificate can also be checked with OCSP. During validation, the certificate presented by the client is looked up via the OCSP responder defined in its Authority Information Access (AIA) extension. If the client certificate has been revoked, the Application Gateway responds to the client with an HTTP 400 status code and reason. If the certificate is valid, the request continues to be processed by Application Gateway and is forwarded to the defined backend pool.

Please check this OCSP link to enable this capability. The PowerShell commands to set up OCSP are summarized below.

$AppGw = Get-AzApplicationGateway -Name "ApplicationGateway01" -ResourceGroupName "ResourceGroup01"

$profile = Get-AzApplicationGatewaySslProfile -Name "SslProfile01" -ApplicationGateway $AppGw

Set-AzApplicationGatewayClientAuthConfiguration -SslProfile $profile -VerifyClientCertIssuerDN -VerifyClientRevocation OCSP
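
Note that, following the standard Az PowerShell get-modify-set pattern, the change takes effect only once you commit the updated configuration back to the gateway:

Set-AzApplicationGateway -ApplicationGateway $AppGw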

Once you have set up OCSP, you can verify your client certificate against the OCSP endpoint using the openssl command.

◉ openssl ocsp -issuer <ca-bundle> -cert client.crt -text -url <http://FQDN>

ca-bundle: the certificate authority (CA) that issued the client certificate (uploaded per step 8 of the link in our previous blog)

client.crt: the client certificate

url: the OCSP endpoint URL. If you do not know the URL, you can find the OCSP endpoint of the client certificate by using the following command:

◉ openssl x509 -in client.crt -text | grep -i OCSP

OCSP - URI:http://ocsp.sectigo.com

[Screenshot: openssl ocsp output showing the status of the client certificate verification.]

You should see the following response if the certificate is valid:

Response verify OK

client.crt: good

After verifying your client certificate against the OCSP endpoint, you can verify the traffic by sending a request to the Application Gateway that has the OCSP check enabled.

◉ curl -vk https://<yourdomain.com> --key client.key --cert client.crt

If the certificate is not a valid client certificate, OCSP will respond with either “revoked” or “unknown”. Below is the error for an “unknown” certificate.

[Screenshot: the error returned for an “unknown” certificate.]

Tuesday 21 February 2023

7 reasons to join us at Azure Open Source Day

Are you interested in learning more about Azure and open-source technologies? Do you want to learn about the latest AI capabilities on Azure and how Microsoft is leveraging open-source technologies to drive innovation? If so, you won’t want to miss Azure Open Source Day on Tuesday, March 7, 2023, from 9:00 AM to 10:30 AM Pacific Time.


Azure Open Source Day is a great opportunity to learn more about Microsoft's role in the open-source community, its contributions, and vision. Microsoft has a long history of supporting and contributing to open-source projects, and it continues to be a leader in the community today. Learn how Microsoft is empowering developers to build innovative solutions using the best of cloud capabilities and open-source technologies.


Here are seven reasons why you should attend:

1. See app-building demos—Discover how to build intelligent applications that are fast, flexible, and scalable using containers, Azure Kubernetes Service (AKS), Azure managed databases, and Azure AI services. Azure provides a wide range of tools and services that can be used to build intelligent applications.

2. Learn from partners—See how to use the power of Azure to build intelligent apps fast and flexibly using the best of open-source technology. Hear about Microsoft and Nvidia’s partnership to allow developers to spin up a platform in a matter of minutes.

3. Discover new innovative technologies—Find out how to use Dapr—an open-source project developed by Microsoft and a growing community of contributors—to easily build, deploy, and scale microservices. Dapr helps you to focus on business logic while abstracting away the underlying infrastructure and platform.

4. Hear perspectives on open-source trends—Hear from Microsoft luminaries Brendan Burns and Sarah Novotny, and our partners (GitHub, HashiCorp, Redis, and Nvidia) about how open source can be used to drive technological progress and collaboration between companies.

5. Get proven support—Get a first look at how Microsoft is committed to supporting its customers with their technology needs whatever they may be, including Web3 scenarios and use cases. Microsoft's Azure cloud platform, developer tools, and identity and security services can help customers build and run Web3 applications.

6. Learn how to protect your data—Protect your business assets by building on a highly secure cloud platform designed to meet your open-source security and compliance needs.

7. Ask the experts—Post your questions during the live chat Q&A. Azure Open Source Day features a live chat where attendees can ask the experts their questions and get detailed answers.

Source: microsoft.com

Monday 20 February 2023

7 Best Tips for Microsoft PL-300 Exam Preparation

The PL-300 exam, also known as the Microsoft Power BI Data Analyst exam, is an essential certification exam for professionals looking to become a Microsoft Certified: Power BI Data Analyst Associate. It is one of the many exams offered by Microsoft under its new role-based certification paths, replacing the MCSA, MCSD, and MCSE certifications. Passing the PL-300 exam is the only requirement for this certification, which validates a professional’s knowledge and expertise in Power BI data analysis. This certification demonstrates that an individual can effectively use Power BI to analyze and visualize data and provide insights that drive business decisions. As the only Power BI-specific exam and certification path offered by Microsoft, the PL-300 exam is an excellent opportunity for professionals to establish themselves as experts in data analysis.

Information About Microsoft PL-300 Exam

Microsoft PL-300 is the essential exam to pass to obtain the Power BI Data Analyst Associate certification.

This exam gauges your skill to prepare, model, analyze, and visualize data with Power BI to offer value for an organization.

Microsoft PL-300 Exam Details

  • Certification Exam Name: Exam PL-300: Microsoft Power BI Data Analyst
  • Exam Cost: $165 in the US
  • Prerequisites: Applicants must have a fundamental understanding of data repositories and data processing both on-premises and in the cloud.
• Exam Duration: 100 minutes (1 hour 40 minutes)
  • Total Questions: 40-60 Multiple-choice Questions (with 1 case study)
  • Passing Score: 70%

Topics Covered in Microsoft PL-300 Exam

Here are the topics included in the exam, along with the weight each area carries in the final exam:

  • Prepare the data (15-20%)
  • Model the data (30-35%)
  • Visualize and analyze the data (25-30%)
  • Deploy and maintain assets (20-25%)

Who Should Pass the PL-300 Exam?

    The PL-300 exam is designed for Power BI subject matter experts who possess the necessary skills to model, transform, and visualize data within the tool. The exam evaluates candidates’ ability to apply their knowledge and transform business requirements into actionable insights and recommendations through various case studies. It is not recommended for those new to Power BI as it requires extensive hands-on experience with the tool to pass. Candidates must have experience in all areas of the tool and discern the best deployment methods for a given situation. The PL-300 exam is a valuable opportunity for professionals to bridge the gap between being a skilled Power BI user and an individual who can create and manage content in the most effective way for their organization. Passing this exam demonstrates a high level of expertise in Power BI and highlights a candidate’s ability to make informed data-driven decisions, making them an asset to any organization.

    How to Prepare for Microsoft PL-300 Certification Exam?

    1. Focus on Microsoft PL-300 Exam’s Key Components

    Microsoft provides abundant study resources to prepare for the Microsoft Power BI Certification exam. One can visit the official website and carefully review the syllabus, taking notes in a secure location and formulating a study strategy based on the syllabus. This approach assists in identifying a starting point for studying and what topics to focus on.

    2. Review the Microsoft Power BI Data Analyst PL-300 Certification Exam Content

    It is important to familiarize oneself with the exam syllabus and to comprehend essential details such as the exam’s duration, question types, and cost. It is recommended to review these details carefully as they provide insight into the course’s background and can assist in better preparation.

    3. Create a Study Routine for Exam PL-300

    Creating a study schedule aids in completing all topics within a designated timeframe, reviewing them, and taking necessary breaks. Structuring study time is crucial to prevent procrastination, minimize distractions, and avoid neglecting essential tasks.

    4. Pick a Suitable Learning Style

Self-study may not provide enough resources or the flexibility to resolve queries as they arise. Hence, enrolling in an e-learning platform that provides access to industry experts who can impart knowledge and offer top-notch study resources is recommended. In addition, e-learning study materials can be accessed at any time and from any location, with round-the-clock assistance to identify and address gaps in one’s knowledge and skills.

    If e-learning platforms are not preferred, one can opt for self-study by referring to Microsoft Power BI beginner tutorials to enhance their understanding of the tool. This approach can also facilitate testing knowledge through online mock tests for Power BI.

    5. Make Practicality a Priority in Your Use of Power BI

    Even if you thoroughly understand every detail and concept of Power BI, it won’t be helpful if you cannot apply that knowledge in real-world situations. However, there is a solution: Microsoft has created a self-paced training program called Dashboard in a Day (DIAD), which is regularly updated and maintained. The DIAD program provides a dataset for you to practice after completing the self-paced portion, allowing you to gain practical experience.

    6. Use Exam PL-300 Practice Test

    Incorporating practice tests into your study routine can be a highly effective strategy for increasing your likelihood of success. Repeating the practice test multiple times can boost your self-assurance and help you monitor your advancement. By including practice tests in your studying approach, you can develop a more profound familiarity with the subject matter and feel more ready when it’s time to take the actual test.

    7. Give Time for Rest and Revision

    Revising the material you have been studying is essential for retaining the concepts you’ve covered; otherwise, you risk forgetting everything. Nonetheless, it’s vital to prioritize your health by maintaining a proper diet and getting adequate rest if you want to study effectively. It’s impossible to retain information if your brain is exhausted from a lack of sleep or rest, so taking care of your physical well-being is crucial to optimize your learning.

    Importance of Passing the Microsoft PL-300 Exam

Obtaining a Microsoft Certified: Data Analyst Associate certification goes beyond adding a credential to a resume. Passing the PL-300 exam requires technical knowledge and an understanding of how best to utilize the Power BI environment for a given situation. These skills can transform report builders into architects of enterprise-wide data solutions. This certification is highly beneficial for individuals looking to begin a career or transition to a new role in data analytics using Power BI. It can also serve as a valuable way to showcase career growth to managers or clients. Being Microsoft certified demonstrates that an individual can be trusted to build, deploy, and manage a business intelligence solution for an organization. By obtaining this certification, individuals can establish themselves as experts in the field of data analytics, increase their marketability, and potentially open up new career opportunities.

    Related Read: Plan Your Career in Power BI with Microsoft PL-300 Exam

    Conclusion

    Successfully passing this Microsoft exam validates your ability to create reports, write DAX and Power Query code proficiently, administer the Power BI web environment, and apply analytical techniques in Power BI to establish a comprehensive data solution for an organization. Holding this certification assures you that you can undertake a data analyst role utilizing Power BI confidently. If you aspire to showcase your skills or embark on a new career path, obtaining the Microsoft Certified: Data Analyst Associate certification is an excellent move in the right direction.

    Saturday 18 February 2023

    Secure your application traffic with Application Gateway mTLS

I am happy to share that Azure Application Gateway now supports mutual transport layer security (mTLS) and online certificate status protocol (OCSP). This was one of the key requests from our customers, who were looking for more secure communication options for their cloud workloads. Here, I cover what mTLS is, how it works, when to consider it, and how to verify it in Application Gateway.

    What is mTLS?


Mutual transport layer security (mTLS) is a communication process where both parties verify and authenticate each other’s digital certificates prior to setting up an encrypted TLS connection. mTLS is an extension of the standard TLS protocol that provides an additional layer of security over TLS. With traditional TLS, the server is authenticated, but the client is not. This means that anyone can connect to the server and initiate a secure connection, even if the client or user is not authorized to do so. By using mTLS, you ensure that both the client and the server authenticate each other before the secure connection is established, so that no unauthorized access is possible on either side. mTLS works on the framework of zero trust: never trust, always verify. This framework ensures that no connection is trusted automatically.

    How does mTLS work?


    mTLS works by using a combination of secure digital certificates and private keys to authenticate both the client and the server. The client and the server each have their own digital certificate and private key, which are used to establish trust and a secure connection. The client verifies the server's certificate, and the server verifies the client's certificate—this ensures that both parties are who they claim to be.

    How are TLS and mTLS different?


TLS and mTLS protocols are used to encrypt network communication between client and server. In the TLS protocol, only the client verifies the validity of the server prior to establishing the encrypted communication; the server does not validate the client during the TLS handshake. mTLS, on the other hand, is a variation of TLS that adds a further layer of security by requiring mutual authentication between client and server. This means that both the client and server must present a valid certificate before the encrypted connection can be established, which makes mTLS more secure than TLS, as it validates the authenticity of both client and server.

    TLS call flow:

[Diagram: TLS handshake call flow.]

    mTLS call flow:

[Diagram: mTLS handshake call flow.]

    When to consider mTLS


◉ mTLS is useful where organizations follow a zero-trust approach, in which a server must verify the validity of the specific client or device that wants to use server information. For example, an organization may have a web application that employees or clients use to access very sensitive information, such as financial data, medical records, or personal information. By using mTLS, the organization can ensure that only authorized employees, clients, or devices are able to access the web application and the sensitive information it contains.

    ◉ Internet of Things (IoT) devices talk to each other with mTLS. Each IoT device presents its own certificate to each other to get authenticated.

◉ Most new applications are built on a microservices-based architecture. Microservices communicate with each other via application programming interfaces (APIs), and by using mTLS you can make sure that API communication is secure. mTLS also ensures that malicious APIs are not communicating with your APIs.

◉ To prevent various attacks, such as brute force or credential stuffing. If an attacker gets hold of a leaked password, or a bot tries to force its way in with random passwords, it will be of no use: without a valid client certificate, the attacker will not be able to complete the TLS handshake.

At a high level, you now understand what mTLS is and how it offers more secure communication by following the zero-trust security model. If you are new to Application Gateway and have never set up TLS in Application Gateway, follow the link to create an Application Gateway and backend servers. This tutorial uses self-signed certificates for demonstration purposes. For a production environment, use publicly trusted CA-signed certificates. Once end-to-end TLS is set up, you can follow this link to set up mTLS. To test this setup, the prerequisite is to have OpenSSL and curl installed on your machine. You should also have access to the client certificate and client private key.
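
If you need a throwaway client certificate and CA for such a test, a minimal openssl sketch could look like the following; the file names and subject names are illustrative, and the resulting ca.crt is what you would upload as the trusted client CA certificate per the mTLS setup link:

◉ openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.crt -days 365 -subj "/CN=TestRootCA"
◉ openssl req -newkey rsa:2048 -nodes -keyout client.key -out client.csr -subj "/CN=test-client"
◉ openssl x509 -req -in client.csr -CA ca.crt -CAkey ca.key -CAcreateserial -out client.crt -days 365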

Let’s dive into how to test mTLS on Application Gateway. In the command below, the client’s private key is used to create a signature for the Certificate Verify message. The private key does not leave the client device during the mTLS handshake.

    Verify your mTLS setup by using curl/openssl


    ◉ curl -vk https://<yourdomain.com> --key client.key --cert client.crt

<yourdomain.com> -> your domain address
client.key -> the client’s private key
client.crt -> the client certificate

[Screenshot: curl output of a successful mTLS handshake, ending in an HTTP 200 response.]

In the above output, we are verifying whether mTLS is correctly set up. If it is, the server will request the client certificate during the TLS handshake. Next, in the handshake, you should verify that the client has presented a client certificate along with the Certificate Verify message. Since the client certificate was valid, the handshake was successful, and the application responded with an HTTP 200 response.

    If the client certificate is not signed by the root CA file that was uploaded as per the link in step 8, the handshake will fail. Below is the response we will get if the client certificate is not valid.

[Screenshot: the response returned when the client certificate is not valid.]

    Alternatively, you can verify the mTLS connectivity with an OpenSSL command.

◉ openssl s_client -connect <IPaddress>:443 -key client.key -cert client.crt

[Screenshot: openssl s_client output showing the established connection.]

Once the SSL connection is established, type the request below, followed by a blank line:

    GET / HTTP/1.1

    Host: <IP of host>

[Screenshot: the HTTP 200 response returned over the mTLS connection.]

You should get response code 200. This validates that mutual authentication is successful.

    Source: microsoft.com

    Thursday 16 February 2023

    Microsoft Azure Security expands variant hunting capacity at a cloud tempo

    In the first blog in this series, we discussed our extensive investments in securing Microsoft Azure, including more than 8500 security experts focused on securing our products and services, our industry-leading bug bounty program, our 20-year commitment to the Security Development Lifecycle (SDL), and our sponsorship of key Open-Source Software security initiatives. We also introduced some of the updates we are making in response to the changing threat landscape including improvements to our response processes, investments in Secure Multitenancy, and the expansion of our variant hunting efforts to include a global, dedicated team focused on Azure. In this blog, we’ll focus on variant hunting as part of our larger overall security program.

    Variant hunting is an inductive learning technique, going from the specific to the general. Using newly discovered vulnerabilities as a jumping-off point, skilled security researchers look for additional and similar vulnerabilities, generalize the learnings into patterns, and then partner with engineering, governance, and policy teams to develop holistic and sustainable defenses. Variant hunting also looks at positive patterns, trying to learn from success as well as failure, but through the lens of real vulnerabilities and attacks, asking the question, “why did this attack fail here, when it succeeded there?”

    In addition to detailed technical lessons, variant hunting also seeks to understand the frequency at which certain bugs occur, the contributing causes that permitted them to escape SDL controls, the architectural and design paradigms that mitigate or exacerbate them, and even the organizational dynamics and incentives that promote or inhibit them. It is popular to do root cause analysis, looking for the single thing that led to the vulnerability, but variant hunting seeks to find all of the contributing causes.

    While rigorous compliance programs like the Microsoft SDL define an overarching scope and repeatable processes, variant hunting provides the agility to respond to changes in the environment more quickly. In the short term, variant hunting augments the SDL program by delivering proactive and reactive changes faster for cloud services, while in the long term, it provides a critical feedback loop necessary for continuous improvement. 

    Leveraging lessons to identify anti-patterns and enhance security


    Starting with lessons from internal security findings, red team operations, penetration tests, incidents, and external MSRC reports, the variant hunting team tries to extract the anti-patterns that can lead to vulnerabilities. In order to be actionable, anti-patterns must be scoped at a level of abstraction more specific than, for example, “validate your input” but less specific than “there’s a bug on line 57.” 

    Having distilled an appropriate level of abstraction, variant hunting researchers look for instances of the anti-pattern and perform a deeper assessment of the service, called a “vertical” variant hunt. In parallel, the researcher investigates the anti-pattern’s prevalence across other products and services, conducting a “horizontal” variant hunt using a combination of static analysis tools, dynamic analysis tools, and skilled review.

    Insights derived from vertical and horizontal variant hunting inform architecture and product updates needed to eliminate the anti-pattern broadly. Results include improvements to processes and procedures, changes to security tooling, architectural changes, and, ultimately, improvements to SDL standards where the lessons rapidly become part of the routine engineering system.

    For example, one of the static analysis tools used in Azure is CodeQL. When a newly identified vulnerability does not have a corresponding query in CodeQL the variant hunting team works with other stakeholders to create one. New “specimens”—that is, custom-built code samples that purposely exhibit the vulnerability—are produced and incorporated into a durable test corpus to ensure learnings are preserved even when the immediate investigation has ended. These improvements provide a stronger security safety net, helping to identify security risks earlier in the process and reducing the re-introduction of known anti-patterns into our products and services.


    Azure Security's layered approach to protecting against server-side threats


    Earlier in this series, we highlighted security improvements in Azure Automation, Azure Data Factory, and Azure Open Management Infrastructure that arose from our variant hunting efforts. We would call those efforts “vertical” variant hunting.

    Our work on Server-Side Request Forgery (SSRF) is an example of “horizontal” variant hunting. The impact and prevalence of SSRF bugs have been increasing across the industry for some time. In 2021 OWASP added SSRF to its top 10 list based on feedback from the Top 10 community survey—it was the top requested item to include. Around the same time, we launched a number of initiatives, including:

    ◉ Externally, Azure Security recognized the importance of identifying and hardening against SSRF vulnerabilities and ran the Azure SSRF Research Challenge in the fall of 2021.
    ◉ Internally, we ran a multi-team, multi-division effort to better address SSRF vulnerabilities using a layered approach.
◉ Findings from the Azure SSRF Research Challenge were incorporated to create new detections using CodeQL rules to identify more SSRF bugs.
    ◉ Internal research drove investment in new libraries for parsing URLs to prevent SSRF bugs and new dynamic analysis tools to help validate suspected SSRF vulnerabilities.
    ◉ New training has been created to enhance prevention of SSRF vulnerabilities from the start.
    ◉ Targeted investments by product engineering and security research contributed to the creation of new Azure SDK libraries for Azure Key Vault that will help prevent SSRF vulnerabilities in applications that accept user-provided URIs for a customer-owned Azure Key Vault or Azure Managed HSM.

    This investment in new technology to reduce the prevalence of SSRF vulnerabilities helps ensure the security of Azure applications for our customers. By identifying and addressing these vulnerabilities, we are able to provide a more secure platform for our customers on which to build and run their applications.

    In summary, Azure has been a leader in the development and implementation of variant hunting as a method for identifying and addressing potential security threats. We have hired and deployed a global team focused exclusively on variant hunting, working closely with the rest of the security experts at Microsoft. This work has resulted in more than 800 distinct security improvements to Azure services since July 2022. We encourage security organizations all over the world to adopt or expand variant hunting as part of your continuous learning efforts to further improve security.

    Source: microsoft.com

    Saturday 11 February 2023

    Automate your attack response with Azure DDoS Protection solution for Microsoft Sentinel

DDoS attacks are best known for their ability to take down applications and websites by overwhelming servers and infrastructure with large amounts of traffic. However, cybercriminals also use DDoS attacks to exfiltrate data, extort, or act politically or ideologically. One of the most devastating features of DDoS attacks is their unique ability to disrupt and create chaos in targeted organizations or systems. This plays well for bad actors who leverage DDoS as a smokescreen for more sophisticated attacks, such as data theft, and demonstrates the increasingly sophisticated tactics cybercriminals use to intertwine multiple attack vectors to achieve their goals.

Azure offers several network security products that help organizations protect their applications: Azure DDoS Protection, Azure Firewall, and Azure Web Application Firewall (WAF). Customers deploy and configure each of these services separately to enhance the security posture of their protected environment and application in Azure. Each product has a unique set of capabilities to address specific attack vectors, but the greatest benefit comes from combining them: together, these three products provide more comprehensive protection. Indeed, to combat modern attack campaigns, one should use a suite of products and correlate security signals across them to detect and block multi-vector attacks.

    We are announcing a new Azure DDoS Protection Solution for Microsoft Sentinel. It allows customers to identify bad actors from Azure’s DDoS security signals and block possible new attack vectors in other security products, such as Azure Firewall.

    Using Microsoft Sentinel as the glue for attack remediation


    Each of Azure’s network security services is fully integrated with Microsoft Sentinel, a cloud-native security information and event management (SIEM) solution. However, the real power of Sentinel is in collecting security signals from these separate security services and analyzing them to create a centralized view of the attack landscape. Sentinel correlates events and creates incidents when anomalies are detected. It then automates the response to mitigate sophisticated attacks.

In our example case, when cybercriminals use DDoS attacks as a smokescreen for data theft, Sentinel detects the DDoS attack and uses the information it gathers on attack sources to prevent the next phases of the adversary lifecycle. By using remediation capabilities in Azure Firewall, and other network security services in the future, the attacking DDoS sources are blocked. This cross-product detection and remediation magnifies the security posture of the organization, with Sentinel as the orchestrator.

    Automated detection and remediation of sophisticated attacks


    Our new Azure DDoS Protection Solution for Sentinel provides a single consumable solution package that allows customers to achieve this level of automated detection and remediation. The solution includes the following components:

    1. Azure DDoS Protection data connector and workbook.

    2. Alert rules that help retrieve the source DDoS attackers. These are new rules we created specifically for this solution. These rules may be utilized by customers to achieve other objectives for their security strategy.

3. A Remediation IP Playbook that automatically creates remediation rules in Azure Firewall to block the source DDoS attackers. Although we document and demonstrate how to use Azure Firewall for remediation, any third-party firewall that has a Sentinel playbook can be used for remediation. This provides the flexibility for customers to use this new DDoS solution with any firewall.

The solution is initially released for Azure Firewall (or any third-party firewall), and we plan to enhance it to support Azure WAF soon.

    Let’s see a couple of use cases for this cross-product attack remediation.

    Use case #1: remediation with Azure Firewall

Let’s consider an organization that uses Azure DDoS Protection and Azure Firewall, and the attack scenario in the following figure:

[Figure: a DDoS smokescreen attack against virtual network resources, followed by scanning and phishing attempts.]

An adversary controls a compromised bot. They start with a DDoS smokescreen attack, targeting the resources in the organization’s virtual network. They then plan to access the network resources through scanning and phishing attempts until they’re able to gain access to sensitive data.

Azure DDoS Protection detects the smokescreen attack and mitigates this volumetric network flood. In parallel, it starts sending log signals to Sentinel. Next, Sentinel retrieves the attacking IP addresses from the logs and deploys remediation rules in Azure Firewall. These rules will prevent any non-DDoS attack from reaching the resources in the virtual network, even after the DDoS attack ends and DDoS mitigation ceases.

    Use case #2: remediation with Azure WAF (coming soon)

Now, let’s consider another organization that runs a web application in Azure. It uses Azure DDoS Protection and Azure WAF to protect its web application. The adversary’s objective in this case is to attack the web application and exfiltrate sensitive data, starting with a DDoS smokescreen attack and then launching web attacks on the application.

[Figure: a DDoS smokescreen attack followed by web attacks on the application.]

When the Azure DDoS Protection service detects the volumetric smokescreen attack, it starts mitigating it and sends logs to Sentinel. Sentinel retrieves the attack sources and applies remediation in Azure WAF to block future web attacks on the application.

    Get started with Azure DDoS protection today


    As attackers employ advanced multi-vector attack techniques during the adversary lifecycle, it’s important to harness security services as much as possible to automatically orchestrate attack detection and mitigation.

    For this reason, we created the new Azure DDoS Protection solution for Microsoft Sentinel that helps organizations to protect their resources and applications better against these advanced attacks. We will continue to enhance this solution and add more security services and use cases.

    Source: microsoft.com

    Thursday 9 February 2023

    Azure Red Hat OpenShift for Microsoft Azure Government—now generally available


    Today we’re pleased to announce the general availability of Azure Red Hat OpenShift on Microsoft Azure Government. With this release, we are combining world-class Azure infrastructure with a leading enterprise Kubernetes platform as a jointly operated and supported service for Azure Government customers to run their containerized workloads in production.

    Azure Red Hat OpenShift (ARO) on Azure Government enables compliance with strict government regulations and certifications, such as FedRAMP and CJIS, which makes it a secure and compliant option for running containerized workloads at production scale. Agencies can take advantage of the stringent security and compliance features of Azure Government and leverage the flexibility and scalability of OpenShift.

    Azure Red Hat OpenShift for Azure Government includes key IT security and regulatory certifications, including:

    ◉ FedRAMP High Authorization.
    ◉ International Traffic in Arms Regulations (ITAR).
    ◉ Defense Federal Acquisition Regulation Supplement (DFARS).
◉ Internal Revenue Service (IRS) Publication 1075.
    ◉ Criminal Justice Information Services (CJIS).

    As a managed service, Azure Red Hat OpenShift also offers several benefits for agencies looking to innovate, including:

    1. Scalability: OpenShift provides automatic scaling, self-healing, and rolling updates, which help to ensure that applications can handle increased loads and recover from failures quickly.

    2. Flexibility: OpenShift allows developers to use their preferred languages, frameworks, and tools to build and deploy containerized applications, making it easy to work with existing applications and technologies.

    3. Enterprise-grade management: OpenShift provides a centralized management console, role-based access control, and built-in monitoring and logging, making it easy for IT teams to manage and troubleshoot containerized applications.

    4. Interoperability: Azure Red Hat OpenShift runs on top of Azure and integrates with other Azure services, such as Azure Database for PostgreSQL, Azure Cosmos DB, and Azure Virtual Network, making it easy to build and deploy applications that leverage the full range of Azure services.

    5. Support: Azure Red Hat OpenShift is a jointly managed product, which means that it is supported by both Microsoft and Red Hat, providing customers with access to the expertise and resources of both companies.

    Launched in 2019, Azure Red Hat OpenShift was the first codeveloped, jointly operated Red Hat OpenShift service on the public cloud, offering a powerful on-ramp to the hybrid cloud by extending the same enterprise-grade Kubernetes used in private datacenters to the scale of Microsoft Azure.

    Source: microsoft.com

    Saturday 4 February 2023

    Azure high-performance computing powers energy industry innovation


    The rising demand for energy


Global energy demand has rapidly increased over the last few years and looks set to continue accelerating at this pace. With a booming middle class, economic growth, digitization, urbanization, and increased mobility of populations, energy suppliers are in a race to leverage new technologies that can more optimally and sustainably generate, store, and transport energy to consumers.

With the impact of climate change adding urgency to minimizing energy waste and optimizing power production, leaders in the renewable energy and oil and gas industries are accelerating sector-wide innovation initiatives that can drive differentiated impact and outcomes at scale.

    As the population of developing countries continues to expand, the energy needs of billions of additional people in rural and especially urban areas will need to be catered to. McKinsey estimates that global energy consumption will triple by 2050, with oil and gas accounting for 65 percent of power consumption by then.

    In addition, supplies of conventional oil and gas are also expected to decline in the not-too-distant future, shrinking in concentration to mostly the Middle East (oil) and countries like Russia, Iran, and Qatar (gas). As a result, the transition to more sustainable sources of power is leading global energy producers to leverage next-generation technologies to transform their solutions while simultaneously optimizing their operations.

    New innovators in the renewable energy industry are also adopting next-generation technologies such as artificial intelligence (AI), advanced analytics, 3-D imaging, and the internet of things (IoT), supported by high-performance computing (HPC) capabilities, to maximize energy production and ensure a smoother transition to a more sustainable path.

    Optimizing operational excellence in the energy industry


    Instead of investing in complex, costly, and time-intensive on-premises resources, global energy leaders are leveraging the power of cloud capabilities such as Azure HPC + AI, to simulate highly complex, large-scale models and visualize seismic imaging and modeling, resulting in huge economic gains.

One of the key innovations enabling this strategic advantage is the dynamic scaling capability of Azure HPC + AI. Powered by GPU-enabled, optimized virtual machines that are ideal for running remote visualization, and augmented with deep learning and predictive analytics, it gives customers on-demand intelligent computing to solve complex problems and drive tangible business outcomes.

    Energy multinational bp, for example, believes technology innovation is the key to making a successful transition to net zero. The company chose to create digital twins to find opportunities for optimization and carbon reduction.

    Drawing on over 250 billion data signals from an IoT network spanning bp's global operating assets, the company identified various opportunities to scale the digital twin solution to its entire operating base and reduce emissions by as much as 500,000 tons of CO2 equivalent every year.

    Going green—Energy industry innovation abounds


    The green energy sector is also grabbing hold of the opportunity presented by these exponential technologies to speed up the journey toward a more sustainable energy ecosystem.

    Italian energy infrastructure operator Snam is harnessing Azure AI and a full stack of Azure IoT services to reduce carbon emissions and help meet its net-zero targets. Energy efficiency is top of the company's agenda. Snam aims to cut methane emissions by 55 percent by 2025, reach net zero by 2040, and exclusively transport decarbonized gas by 2050.

    With any leakage in its operations posing a threat to field workers, maintenance staff, and people living near their network—not to mention the environment—Snam deployed an IoT network for real-time monitoring and to enhance its data collection and processing capabilities.

    For wind energy solutions provider Vestas Wind Systems, a combination of Azure HPC and partner Minds.ai's machine learning platform, DeepSim, helped its wind farms mitigate the wake effect, generate more energy, and build a sustainable energy future.

Drawing on Azure HBv3 virtual machines with third-generation AMD EPYC™ processors, Vestas can scale up and run millions of complex simulations that inform how controllers adjust turbines to optimize energy production.

    The computing power offered by the AMD-based Azure HBv3 nodes allows Vestas to drive efficiencies that have the potential to unlock significantly more power and higher profits for wind farm operators by minimizing the estimated 10 percent of wind energy that is lost to wake effects.

    Key takeaways


    As the energy industry eyes a period of unprecedented growth and change, the role of technology will become ever more profound.

    Leveraging powerful Microsoft Cloud capabilities such as HPC, AI, advanced analytics, big data, and IoT, the integrated advanced technology capabilities that have previously been the reserve of only a handful of the largest companies are now truly available to anyone.

    Supported by these powerful next-generation technologies, energy companies can unlock greater efficiency, innovation, and growth to achieve gains across their operations and drive the world towards a brighter energy future.

    Source: microsoft.com

    Thursday 2 February 2023

Azure Native NGINXaaS makes traffic management secure and simple—now generally available

Continuing Microsoft Azure’s commitment to empowering our ISV partners and customers to adopt and modernize their applications of choice and run them in the cloud, we are excited to announce the general availability (GA) of the NGINXaaS offering on Azure.

In facilitating the cloud transformation journey for cloud architects, developers, IT professionals, and business decision makers who are all working towards their digital transformations, we are expanding on more than a decade of partnership with F5, the company behind NGINX, to provide a deeper integration of NGINX into the Azure ecosystem.

NGINX provides load balancing, traffic management, and security tools for users to configure and manage the incredibly complex traffic patterns of their architectures across cloud and on-premises environments.

“We are excited to expand our Azure ecosystem with the general availability of NGINX for Azure. This strategic partnership with F5 immediately brings together the power of Azure and NGINX’s application delivery expertise to give our developers and customers more native options on Azure.”—Julia Liuson, President, Microsoft Developer Division.

    Do more with less


    Based on inputs from customers in the Open Source world and other users of the NGINX offering, we worked with F5 to simplify the infrastructure management and provide a seamless experience by integrating the deployment, billing, and support of the NGINX solution on the Azure cloud platform, available via the Azure Marketplace.

    By taking the management burden away from the user as part of the managed offering, the customer can now focus on the core elements of their business while the custodians of the NGINX and Azure offering bring our strengths to provide a fully managed, secure, and reliable NGINX offering on Azure.

The deep integration into the Azure control plane also provides another layer of optimization by ensuring that the latest relevant features of the Azure platform are automatically available to this service.

    Deploying and managing load balancer and traffic manager on Azure


The service integrates the NGINX offering into the Azure control plane. Through this integration, customers can provision a new NGINX service and configure their Azure resources to seamlessly extend workloads to the cloud and deliver secure and high-performance applications using the familiar and trusted load balancing solution. This gives the user consistency in performance and security across their ecosystem via a one-click deployment. In addition, customers can manage all of the advanced traffic management features they demand, including JSON Web Token authentication and integrated security, to name a few.

    Lift and shift from existing deployments


    The integrated offering makes it very easy to migrate application delivery from on-premises to Azure cloud. Enterprises and users can now lift and shift apps to Azure cloud seamlessly by bringing their own or existing configurations of NGINX and deploying them from the Azure portal or Marketplace. Users can then configure advanced traffic management and security, leverage state-of-the-art monitoring capabilities, and port custom configurations.
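
As a simple illustration of the kind of configuration that can be ported as-is, below is a minimal NGINX reverse-proxy setup; the upstream name and addresses are hypothetical:

upstream app_backend {
    server 10.0.1.4:8080;
    server 10.0.1.5:8080;
}

server {
    listen 80;
    location / {
        # Forward incoming traffic to the load-balanced upstream group
        proxy_pass http://app_backend;
    }
}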

    Unified experience


Build end-to-end traffic management solutions with a unified experience. This service gives the user consistency in performance and security across their portfolio of on-premises and Azure cloud apps by using the same load balancing solution and configurations everywhere via one-click deployment.

    Secure deployments


The ability to control traffic via virtual networks is a critical consideration for our customers. With this integration, users can seamlessly manage configurations between their own virtual network and the NGINX cloud virtual network via a custom solution leveraging service injection. This is further complemented with unified billing for the NGINX service through Azure subscription invoicing.

    Getting started with Azure Native NGINX Service:


Discovery and procurement: Azure customers can find the service listed on Azure Marketplace, review the different purchasing plans offered, and purchase it directly with single billing enabled.


Provisioning the NGINX resource: Within a few clicks, you can deploy the NGINX service in your desired subscription and datacenter region with your preferred plan.


In the Azure portal experience: Configure the NGINX networking components.


    Configuring logs and metrics: Customers can determine which Azure resource logs and metrics are sent to the NGINX resource.


    Source: microsoft.com