Showing posts with label Azure Storage. Show all posts

Thursday, 13 June 2024

Unlock new potential for your SAP workloads on Azure with these learning paths


SAP, a leading provider of enterprise resource planning (ERP) software, is a mission-critical consideration for companies migrating to Azure. Microsoft and SAP have a strong partnership centered on helping customers successfully migrate their SAP workloads to Azure, offering solutions that can be seamlessly integrated with Azure’s cloud infrastructure:

  • Accelerated cloud savings: As the leading cloud provider for SAP workloads, we manage your infrastructure while you streamline your cloud spending. 
  • AI intelligence built-in: Harness the power of AI-powered insights to make data-driven decisions that drive your business forward. 
  • Boost productivity and innovation: Integrated apps streamline your team’s workflow and automate repetitive business processes.
  • Enhanced protection: Our multi-layered cloud security ensures your SAP workloads run smoothly, backed by integrated Azure recovery services.

Together, SAP and Microsoft are not just providing tools and services but are also creating ecosystems that foster innovation and transformation, enabling businesses to stay competitive in a rapidly evolving digital landscape. To help you and your team better take advantage of these benefits, we’ve created an array of learning materials and interactive events—from self-guided courses to Virtual Training Days, certifications to conferences—that build your cloud expertise.

Chart your course to expertise with personalized Learning Paths 


Our Microsoft Learn Learning Paths are curated collections of free, online modules and resources designed to help you build specific skills or gain knowledge in a particular technology or subject area. We have a full range of paths related to migrating SAP workloads, including:

Explore Azure for SAP Workloads: Streamline your SAP operations and maximize ROI with our comprehensive Azure training. Empower your team to seamlessly migrate, manage, and optimize SAP workloads on Azure, leveraging its robust infrastructure and specialized tools. This training will enhance your SAP performance, drive efficiency, and unlock innovation within your existing environment. 

Deploy SAP on Azure: To boost your SAP efficiency and reliability, your team will master the deployment of SAP solutions on Azure, including single-instance SAP HANA and SAP NetWeaver high availability configurations. This expertise will streamline migrations, optimize performance, and ensure smooth operations, ultimately reducing costs and maximizing the value of your SAP investment. 

Explore the foundations of IaaS for SAP on Azure: Unlock the power of Azure’s Infrastructure as a Service (IaaS) for your SAP workloads. Our comprehensive training equips your team with the expertise to design and deploy scalable, high-performing SAP solutions on Azure. By mastering essential Azure resources like virtual machines, virtual networks, storage accounts, and Microsoft Entra ID, you’ll optimize your infrastructure for efficiency and resilience, ultimately driving business growth. 

Explore SAP HANA on Azure (Large Instances): Optimize your SAP HANA investment as this program equips your team with the expertise to architect, deploy, and manage SAP HANA on Azure (Large Instances) effectively. By mastering key concepts like high availability, disaster recovery, security, and cost optimization, your business can ensure a resilient, secure, and cost-effective SAP HANA environment in the cloud. 

Plan to deploy the Microsoft Sentinel solution for SAP: This standalone learning module provides an overview of Microsoft Sentinel’s comprehensive security solution for SAP environments. It outlines the roles and responsibilities of the team members involved in designing, deploying, and implementing this solution.

Improve SAP seller productivity with Microsoft Teams and Power Platform: Streamline your sales and service processes with our SAP and Microsoft 365 integration training. Your team will master integrating Teams with SAP Sales and Service Core, enabling seamless collaboration. Additionally, they’ll learn to automate tasks using Power Automate, enhancing productivity and efficiency for your SAP sellers. 

Planning and deploying SAP on Azure: Unlock the power of Azure to optimize your SAP operations. Our comprehensive course teaches you to efficiently deploy, configure, and manage SAP workloads in the cloud. Gain expertise in migrating and operating SAP solutions, mastering essential Azure resources like VMs, storage, and networking. Enhance scalability, performance, and security while reducing costs.

Migrate and Modernize SAP in the Microsoft Cloud: Accelerate your SAP deployment on Azure with our one-day course for experienced SAP professionals. Gain hands-on expertise in setting up Azure infrastructure specifically for SAP, including virtual machines, networking, storage, and identity management. Learn best practices for backup, disaster recovery, and monitoring to ensure optimal performance and reliability of your SAP environment in the cloud. 

Learn from the pros with live, interactive Virtual Training Days 


Virtual Training Days are instructor-led classes designed to equip individuals and teams with in-demand skills related to cloud migration, AI, and other cutting-edge technologies. We offer Virtual Training Days to help you migrate SAP to Azure, optimizing your performance, reliability, and scalability while reducing costs. In this session, Migrate and Modernize SAP on the Microsoft Cloud, you’ll find out how to secure and monitor SAP workloads on Azure. Come explore how this move enhances productivity, fosters secure collaboration, and gives you AI-powered insights for greater efficiency. 

Showcase your accomplishments with Certifications 


Microsoft Certifications are the official badges that prove you’ve got the skills, whether it’s Azure, Microsoft 365, or other technologies. Getting certified isn’t just a pat on the back; it’s a ticket to career opportunities and industry recognition. Plus, it keeps you on top of the latest tech trends. So, if you want your IT game strong, Microsoft Certifications are the way to go.

Planning and Administering Microsoft Azure for SAP Workloads: To qualify for this certification, you need in-depth knowledge of SAP systems and industry standards for migration, integration, and long-term operation on Azure. Professionals in this career collaborate with cloud administrators, database administrators, and clients to implement solutions, recommending services and adjusting resources for optimal performance and resilience.

Connect and learn: Network with industry leaders at the SAP Sapphire conference 


SAP Sapphire 2024 was a series of global events that brought together SAP experts, partners, and customers to explore the latest innovations and strategies in business technology. With a focus on empowering businesses to turn their visions into reality, SAP Sapphire blended in-person and virtual experiences to share valuable insights, practical knowledge, and networking opportunities to help you drive digital transformation and achieve your business goals. 

Azure experts were on hand to share their expertise and discuss developments related to the cloud and AI. Both sessions are available on-demand for viewing:

Unlock innovation with AI on the Microsoft Cloud: We discussed how integrating AI with Azure and SAP solutions can help your organization uncover valuable insights, automate processes, and optimize efficiency for your most critical business operations. This session showed participants how to make AI a reality for their organizations, boosting productivity and creating new opportunities for growth. 

Accenture’s RISE with SAP journey with Microsoft and generative AI: Discover how Accenture, a global IT services and consulting firm, successfully migrated to the RISE with SAP solution on Microsoft Cloud, leveraging Microsoft AI solutions for enhanced innovation and efficiency. This collaborative effort among Accenture, Microsoft, and SAP resulted in a transformed enterprise system, delivering greater insights, agility, and operational efficiency.

Source: microsoft.com

Thursday, 8 February 2024

Reflecting on 2023—Azure Storage


The beginning of a new year often prompts reflection along with planning for the future. At the forefront of our priorities lies the commitment to enhance the Azure platform and its ecosystem, fuel groundbreaking AI solutions, and facilitate seamless migrations to the cloud. We achieve this through purpose-built storage solutions tailored to meet the unique demands of your workloads.

2023 was an emblematic year of growth for storage and data services with AI at the top of mind for many customers. Azure customers’ overall data estate continued to grow, powered by the emergence of new workloads and access patterns. Notable Azure Storage figures include:

  • The storage platform now processes more than 1 quadrillion (that’s 1000 trillion!) transactions a month with over 100 exabytes of data read and written each month. Both numbers were sharply higher compared to the beginning of 2023.
  • Premium SSD v2 disks, our new general-purpose block storage tailored for SQL, NoSQL databases, and SAP, grew capacity by 100 times.
  • The total transactions of Premium Files, our file offering for Azure Virtual Desktop (AVD) and SAP, grew by more than 100% year over year (YoY).

In 2023, advancements and investments were made in five strategic areas aligning with customer workload patterns, cloud computing trends, and the evolution of AI.

Focused innovations for new workloads


We’ve made advancements in offering end-to-end solutions with unique storage capabilities, tailored to help customers bring new workloads to Azure, without the need to retrofit them.

As customers modernize applications with Kubernetes, we have observed their workloads shifting from stateless to stateful. Azure Container Storage is custom designed to meet the needs of hosting stateful workloads where storage is tightly integrated with containers, and offers outstanding data management, lifecycle management, price-performance, and scale. Azure Container Storage is the industry’s first platform-managed container-native storage service in the public cloud. It offers a path to manage persistent volumes and the backend storage options completely via the Kubernetes-native experience.

Microsoft’s Azure Elastic SAN is a fully managed storage area network (SAN) service that simplifies deploying, scaling, managing, and configuring a SAN in the cloud. It is designed to address the service-management challenges customers face when migrating a large-scale SAN appliance to the cloud. We are the first in the market to provide a fully managed SAN offering in the cloud, with built-in cloud scale and high-availability options. It adopts a SAN-like hierarchy: resources are provisioned at the top level, shared dynamically across workloads, and governed by security policies at the workload level. In addition, it enables customers to bring new data-intensive workloads to Azure VMware Solution, using Elastic SAN to serve their storage needs. Elastic SAN and Azure Container Storage, both in public preview, showcase a paradigm shift by simplifying service management, purpose-built for cloud-native storage scenarios. General availability is planned in the coming months.

AI is leading a fresh wave of innovation powered by data. Azure Blob Storage stands at the forefront of this data explosion, providing excellent performance and high bandwidth access for all AI training needs. Azure Premium Block Blob, generally available since 2019, provides extremely low latency at competitive pricing for AI and machine learning (ML), IoT and streaming analytics, and interactive workloads. We also released Azure Managed Lustre in July which offers a fully managed distributed parallel file system tailored for high performance computing (HPC) and AI training workloads.

OneLake (the storage layer for Microsoft Fabric) is powered by the scale, performance, and rich capabilities of Azure Data Lake Storage (ADLS). ADLS has been a game changer for analytics, with differentiated storage features including native file system capabilities that enable atomic metadata operations, resulting in a significant performance boost, especially when used with Azure Premium Block Blob. Multi-protocol access is another unique capability that allows customers to work with their data using a variety of access protocols (including REST, SFTP, and NFS v3), eliminating data silos, costly data movement, and duplication.

Optimizations for mission critical workloads


We continue to add differentiated capabilities to enable enterprises to optimize their mission critical workloads hosted on Azure.  

With the general availability of Premium SSD v2, we have been actively onboarding solutions focused on mission critical data. Gigaom highlighted the superior performance and reduced total cost of ownership of SQL Server 2019 running on Azure Virtual Machines (VMs) with Premium SSD v2. Similar benefits apply to other database workloads like PostgreSQL. We will continue enhancing Premium SSD v2 capabilities and expand regional coverage in the upcoming year.

Azure Blob Storage powers data-intensive mission critical workloads globally, including some of the largest AI model training and inferencing. Cloud-native companies host exabytes of consumer-generated data and rely on the platform’s scale-out capabilities. Major retailers store trillions of user activities to power recommendation engines and inventory systems. The Cold storage tier, released in 2023, enables cost-efficient retention of this data.

Enterprises migrate Windows Server workloads to the cloud for modernization, supported by enhanced Azure Files and Azure File Sync capabilities. The preview of Microsoft Entra ID (formerly Azure Active Directory) support for the Azure Files REST API enables modern applications to use identity-based authentication and authorization for accessing SMB file shares. Azure Files offers comprehensive authentication support, including on-premises AD and Microsoft Entra ID for both Kerberos and REST. Unique to Azure, we also offer Microsoft Entra ID Kerberos authentication, enabling seamless migration from on-premises Windows servers to Azure: your data and permissions stay intact, and complex domain-join setups in the cloud are eliminated. Enterprises migrating mission-critical applications such as SAP and Oracle rely on Azure NetApp Files (ANF) to meet their performance and data protection requirements. In strategic partnership with NetApp, we introduced the public preview of ANF large volumes of up to 500 TiB, addressing workloads that need larger capacity under a single namespace, such as HPC in electronic design automation and oil and gas applications, and the public preview of Cool Access, improving the TCO for infrequently accessed data.

Efficient data migration at scale is critical for successful cloud workload migration, and we offer a comprehensive suite of migration solutions. Azure Storage Mover, a fully managed data migration service, improves the overall migration experience for large data sets. Azure File Sync allows hybrid synchronization between on-premises Windows Servers and Azure Files. AzCopy, a command-line utility, enables quick script-based migration jobs, while Azure Data Box facilitates offline data transfer.

Expanding partner ecosystem


We transition customer workloads to Azure in collaboration with our partners, who complement our platform capabilities with innovative solutions. We have fostered numerous successful strategic partnerships and are delighted to see the partner community investing in exclusive Azure solutions.

Qumulo announced the next version of Azure Native Qumulo Scalable File Service, exclusive to Azure and built to leverage Azure Blob Storage’s scale and cost efficiency. This service supports exabytes of data under a single namespace, surpassing other public cloud file solutions, and facilitates high-scale enterprise network attached storage (NAS) data migration to Azure.

Commvault unveiled a purpose-built cyber resilience solution, exclusively on Azure. It empowers users to predict threats faster, achieve clean recoveries, and accelerate response times. It integrates seamlessly with Microsoft Azure OpenAI Service.

Since the launch of Pure Storage Cloud Block Storage offering on Azure in 2021, we have partnered closely with Pure Storage. They use unique Azure Storage features such as shared disks and were early adopters of Ultra and Premium SSD v2 disk storage. Our collaboration has expanded to containers through Portworx and the Azure VMware Solution.

Besides storage and backup partners, we collaborate with Atempo, Data Dynamics, Komprise, and Cirrus Data to deliver storage migration solutions, providing access to industry-leading file and block solutions to migrate petabytes of NAS and storage area network (SAN) data to Azure at no additional charge.

Industry contributions


We actively participate in industry organizations such as the Storage Networking Industry Association (SNIA) and the Open Compute Project (OCP), collaborating with industry leaders to share insights, influence standards, and contribute to the development of innovative solutions and best practices.

At the 2023 Storage Developer Conference, we presented Massively Scalable Storage for Stateful Containers on Azure, emphasizing the synergy between Azure Container Storage and Elastic SAN. This combination offers unified volume management across diverse storage backends and delivers a highly available block storage solution designed to scale to millions of input/output operations per second (IOPS) with fast pod attach and detach experience.

At the Flash Memory Summit (FMS) conference, we charted the course for the future of flash as an HDD-displacement technology, in alignment with Azure Storage’s long-range plans.

We are committed to sustainability, striving for net carbon-zero goals in the next decade. Through the OCP Sustainability Project, we provide an open framework for the datacenter industry to adopt best practices for reusability and circularity and lead in the steering committee to drive progress.

Unparalleled commitment to quality


Our primary focus is on delivering a robust foundation for our storage services, adapting to evolving workloads like the emergence of AI and ML. This ongoing commitment covers reliability, durability, scalability, and performance for our storage solutions, while maintaining a low total cost of ownership (TCO) for Azure customers.

We assure customers of high data durability, backed by substantial investments in infrastructure, hardware, software, and streamlined processes. Our Zone-redundant Storage (ZRS) ensures data durability of at least 99.9999999999% (12 9’s) over a given year—the highest standard among the major cloud service providers (CSPs). Notably, we are the sole cloud provider among the major CSPs maintaining a 0% Annual Failure Rate (AFR) with our block storage offerings since launch.

Our unique ZRS offering provides a simple way to establish highly available solutions across three zones at a fraction of the cost. You can deploy a cross-zonal AKS cluster with persistent volumes hosted on ZRS disks to ensure data durability and availability during zonal outages. Clustered applications like SQL failover cluster instances (FCI) leveraging Windows Server Failover Cluster (WSFC) can also benefit from ZRS, with high resiliency provided out of the box.

In a concerted effort to elevate the customer support experience, we’ve integrated with Azure Copilot, empowering support engineers to troubleshoot more efficiently while delivering quality responses quickly. This has resulted in a noteworthy increase in customer satisfaction scores.

In closing, 2023 has been a year of profound learning and substantial progress. Explore our feature releases in Azure Updates. Our commitment is to empower you with Azure Storage innovations and enable the seamless execution of a diverse range of workloads. We invite you to continue trusting us as we anticipate exciting developments in 2024!

Source: microsoft.com

Saturday, 12 August 2023

Efficiently store data with Azure Blob Storage Cold Tier — now generally available


We are excited to announce the general availability of Azure Blob Storage Cold Tier in all public and Azure Government regions except Poland Central and Qatar Central. Azure Blob Storage Cold Tier is an online tier specifically designed for efficiently storing data that is infrequently accessed or modified, all while ensuring immediate availability.

Azure Blob Storage is optimized for storing massive amounts of unstructured data. With blob access tiers, you can store your data in the most cost-effective way, based on how frequently it will be accessed and how long it will be retained. Azure Blob Storage now includes a new cold online access tier option, further reducing costs.

Across diverse industries, Azure customers are harnessing the power of blob storage to address a wide range of needs. With the introduction of the new tier, customers can now experience remarkable benefits in scenarios such as backing up media content, preserving medical images, and securing critical application data for seamless business continuity and robust disaster recovery.

Cost effectiveness with cold tier


Cold tier is the most cost-effective Azure Blob Storage offering to store infrequently accessed data with long term retention requirements, while maintaining instant access. Blob access tiers maximize cost savings based on data access patterns. When your data isn’t needed for 30 days, we recommend tiering the data from hot to cool to save up to 46 percent (using the East US 2 pricing as an example) due to lower prices on capacity. When your data is even cooler, for example, if you don’t require access for 90 days or longer, cold tier results in more savings. When compared to cool tier, cold tier can save you an additional 64 percent on capacity costs (using the East US 2 pricing as an example; hot to cold tier savings are 80 percent).

Prices for read operations are higher on cooler access tiers, and read patterns and file size distribution affect cost-effectiveness. We recommend calculating the total cost based on both operation and capacity costs. The chart below shows how total cost differs between the cool and cold tiers based on how long you keep the data in each tier.

[Chart: estimated total cost of the cool and cold tiers as a function of data retention period]

In the above scenario, the total cost estimation assumes 10 TiB data in total, 10 MiB blob size on average, reading once every month, and reading 10 percent of the total data each time. If you keep data for 30 days, the total cost is lower on the cool tier than the cold tier. If you keep data for 60 days or longer, cold tier is a more cost-effective option.
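The crossover in that scenario can be sketched with a simple cost model. The prices below are illustrative placeholders, not actual Azure rates (though the capacity ratio reflects the 64 percent cool-to-cold savings cited above, and the 30-day and 90-day early-deletion minimums for cool and cold are part of the tier design); check the Azure pricing page before drawing real conclusions.

```python
GIB = 1024  # MiB per GiB

# Illustrative per-tier prices: $/GiB-month capacity, $/10k read operations,
# $/GiB retrieved, and the early-deletion minimum in days.
TIERS = {
    "cool": dict(capacity=0.0100, read_ops=0.01, retrieval=0.01, min_days=30),
    "cold": dict(capacity=0.0036, read_ops=0.10, retrieval=0.03, min_days=90),
}

def total_cost(tier, data_gib, blob_mib, read_fraction, retention_days):
    """Cost of keeping data_gib GiB for retention_days, reading
    read_fraction of the data once per month."""
    p = TIERS[tier]
    months = retention_days / 30
    # Early-deletion minimum: capacity is billed for at least min_days.
    billed_months = max(months, p["min_days"] / 30)
    capacity = p["capacity"] * data_gib * billed_months
    blobs_read = (data_gib * GIB / blob_mib) * read_fraction
    per_read_event = (p["read_ops"] * blobs_read / 10_000
                      + p["retrieval"] * data_gib * read_fraction)
    return capacity + per_read_event * months

# Scenario from the post: 10 TiB total, 10 MiB average blob size,
# 10 percent of the data read once a month.
for days in (30, 60, 120):
    cool = total_cost("cool", 10 * 1024, 10, 0.10, days)
    cold = total_cost("cold", 10 * 1024, 10, 0.10, days)
    print(f"{days:3d} days: cool=${cool:8.2f}  cold=${cold:8.2f}")
```

With these placeholder prices the model reproduces the post's conclusion: cool wins at 30 days (the cold tier's 90-day minimum dominates), while cold wins from 60 days onward.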

Seamless experience with cold tier


Cold tier is as easy to use as the hot and cool tiers. REST APIs, SDKs, the Azure portal, Azure PowerShell, the Azure CLI, and Azure Storage Explorer have all been extended to support cold tier. You can use the latest version of these clients to write, manage, and read your data directly from cold tier, with read latency in milliseconds.

Lifecycle management policies also support automatically tiering blobs to the cold tier based on conditions including modified time, creation time, and last access time.
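As a sketch, a lifecycle rule that moves block blobs to the cold tier 90 days after their last modification might look like the following. The rule name and the `backups/` prefix are hypothetical, and the JSON shape should be checked against the current lifecycle policy schema before applying it to a storage account:

```python
import json

# Hypothetical lifecycle policy: tier block blobs under "backups/" to cold
# once they have not been modified for 90 days.
policy = {
    "rules": [
        {
            "enabled": True,
            "name": "move-stale-data-to-cold",  # illustrative name
            "type": "Lifecycle",
            "definition": {
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["backups/"],  # illustrative prefix
                },
                "actions": {
                    "baseBlob": {
                        "tierToCold": {"daysAfterModificationGreaterThan": 90}
                    }
                },
            },
        }
    ]
}

# Serialize to the JSON document you would supply via the portal or CLI.
print(json.dumps(policy, indent=2))
```

A policy like this runs server-side, so no client-side tiering scripts are needed once it is in place.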

The cold tier extends its support to both Blob Storage and Azure Data Lake Storage Gen2. Locally redundant storage (LRS), Geo-redundant storage (GRS), Read-access Geo-redundant storage (RA-GRS), Zone-redundant storage (ZRS), Geo-zone-redundant storage (GZRS), and Read-access geo-zone-redundant storage (RA-GZRS) are all supported based on regional redundancy availability.

Empowering customers and partners to maximize savings


Customers across industries can use cold tier to improve the cost efficiency of object storage on Azure without compromising read latency. Since we launched the preview for cold tier in May 2023, our customers and partners have used this capability on Azure to store data that is infrequently accessed or modified.

Source: microsoft.com

Thursday, 22 June 2023

Azure Data Manager for Energy and LogScope: Enabling data integration within minutes

Revolutionizing access to OSDU Log Data with LogScope


Harvey Rock Physics (HRP) has revolutionized the way geoscientists and petrophysicists interact with well data by integrating their LogScope software platform, built on the Microsoft Cloud. This innovative solution provides seamless integration of borehole data into petrophysical and geological models, enabling users to view, filter, and deliver data to the OSDU Data Platform, such as Azure Data Manager for Energy, with ease.

Founded in 2011, HRP is a leader in petrophysical technologies, specializing in mobile solutions. The LogScope software is built on a modern code base that allows HRP to rapidly create modules to meet client needs. In this case, HRP created a module that allows efficient entry and graphical viewing of data in the OSDU Data Platform.

The OSDU Data Platform provides flexible storage that is application-independent, allowing users to store data and use multiple applications in complex workflows. LogScope provides seamless interrogation through a map view that enables borehole-related data to be spatially visualized and selected for further inspection in a single well or cross-section mode.

LogScope is designed to be readily accessible from the Microsoft Store, with versions available for Windows tablet computers as well as Apple iPad and iPhone products. The application provides tight integration of borehole data into petrophysical and geological models.


Search and ingest data in minutes


Download the application from the Microsoft Store or Apple App Store, install it on your device, enter your connection parameters, and you are up and running in minutes. You can search and ingest data from your mobile device. New-generation end users expect installing and using applications to be as easy as downloading any app on their phones or tablets; legacy applications, coming from the old world of IT, offer a very different experience. With LogScope, we are making the deployment and usage of earth science applications as easy as any phone or tablet client.

One of the major advantages of this new tool is the ability to upload or ingest data directly from raw data file formats—typically LAS or DLIS—or directly from data inside LogScope. Large data volumes can be intuitively and quickly uploaded to the OSDU Data Platform.


Overall, HRP’s LogScope software platform, powered by the Microsoft Cloud, provides an efficient and user-friendly way for geoscientists and petrophysicists to interact with well data. The innovative solution provides seamless integration of borehole data into petrophysical and geological models, enabling users to view, filter, and deliver data to the OSDU Data Platform with ease.

How to work with HRP solutions on Azure Data Manager for Energy


Azure Data Manager for Energy is an enterprise-grade, fully managed OSDU Data Platform for the energy industry that is efficient, standardized, easy to deploy, and scalable for data management: ingesting, aggregating, storing, searching, and retrieving data. The platform provides the scale, security, privacy, and compliance expected by our enterprise customers. It also offers out-of-the-box compatibility with RoQC applications, accelerating time-to-market and enabling customers to run their domain workflows with ease and minimal effort on data contained in Azure Data Manager for Energy.

Source: microsoft.com

Saturday, 20 May 2023

Transforming containerized applications with Azure Container Storage—now in preview


The container is the new virtual machine (VM). Whether you are a CTO, an enterprise architect, leading DevOps teams, or building applications, you have already embarked on the journey to containerize your applications or are raring to go, to maximize the benefits of scale, flexibility, and cost. With Kubernetes at the helm, containers have rapidly become a hotbed of innovation and a critical area of transformation for enterprises and startups alike. After the initial focus on stateless containers, running high-scale stateful workloads on containers has now become the norm. To run business-critical, enterprise-grade applications on Kubernetes in the cloud, customers need highly scalable, cost-efficient, and performant storage built for containers, with intrinsic container support. Today, we are excited to announce the preview of Azure Container Storage, the industry’s first platform-managed container-native storage service in the public cloud, providing end-to-end storage management and orchestration for stateful applications to run efficiently at scale on Azure.

Why Azure Container Storage?


With the rapid adoption of Kubernetes, we see a surge of production workloads, both cloud-first and app modernization, that need container-native persistent storage for databases (such as MySQL), big data (such as Elasticsearch), messaging applications (such as Kafka), and continuous integration and continuous delivery (CI/CD) systems (such as Jenkins). To run these stateful applications, customers need operational simplicity to deploy and scale storage tightly coupled with the containerized applications. Today, however, customers must choose between using VM-centric cloud storage options retrofitted to containers, or deploying and self-managing open-source container storage solutions in the cloud, leading to huge operational overhead, scaling bottlenecks, and high cost.

To provide customers with a seamless end-to-end experience, container native storage needs to enable:

◉ Seamless volume mobility across the cluster to maximize pod availability without bottlenecks on volume attaches and deletes.
◉ Rapid scaling of a large number of volumes as application pods scale up or scale out as needed.
◉ Optimal price-performance for any volume sizes, especially small volumes that require higher input/output operations per second (IOPS).
◉ Simple and consistent volume management experience across backing storage types to match workload requirements, such as extremely low latency ephemeral disks versus persistent or scalable remote storage.

Azure Container Storage addresses these requirements by enabling customers to focus their attention on running workloads and applications rather than managing storage. Azure Container Storage is our first step towards providing a transformative storage experience. As a critical addition to Azure’s suite of container services, it will help organizations of all sizes to streamline their containerization efforts and improve their overall storage management capabilities.

Leveraging Azure Container Storage



Azure Container Storage is a purpose-built, software-defined storage solution that delivers a consistent control plane across multiple backing storage options to meet the needs of stateful container applications. This fully managed service provides a volume management layer for stateful container applications, enabling storage orchestration, data management, Kubernetes-aware data protection, and rule-based performance scaling.

Aligning with open-source container-native storage approaches, Azure Container Storage runs microservices-based storage controllers in Kubernetes to abstract the storage management layer from pods and backing storage, enabling portability across Kubernetes nodes and the ability to mount different storage options, as shown in the figure.

Azure Container Storage components include:

◉ A Storage Pool, which is a collection of storage resources grouped and presented as a unified storage entity for your AKS cluster.
◉ A data services layer, responsible for replication, encryption, and other add-on functionality absent in the underlying storage provider.
◉ A protocol layer, which exposes provisioned volumes via NVMe-oF protocol to application pods.

With this approach, Azure Container Storage offers several differentiated experiences to customers on Azure, including:

Lowering the total cost of ownership (TCO) by providing the ability to scale IOPS on smaller volume sizes, supporting containerized applications that have dynamic and fluctuating input/output (IO) requirements. This is enabled by shared provisioning of capacity and performance on a storage pool, which multiple volumes can draw on. With shared provisioning, customers can maximize performance across application containers while keeping TCO down: instead of allocating capacity and IOPS per persistent volume (PV), which commonly leads to overprovisioning, customers can create PVs that dynamically share resources from a storage pool.
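The arithmetic behind shared provisioning can be sketched as follows. The numbers are purely hypothetical, chosen to illustrate the idea rather than reflect any Azure pricing or performance limit:

```python
# Hypothetical illustration of shared provisioning vs. per-PV provisioning.
# All figures are made up for the sake of the arithmetic; they are not
# Azure limits, SKUs, or prices.

def provisioned_iops_per_pv(pv_count: int, iops_per_pv: int) -> int:
    """Per-PV provisioning: every volume reserves its own IOPS budget."""
    return pv_count * iops_per_pv

def provisioned_iops_shared(pool_iops: int) -> int:
    """Shared provisioning: all volumes draw from one pool-level budget."""
    return pool_iops

# Ten small volumes that each burst to 500 IOPS, but rarely at the same time.
dedicated = provisioned_iops_per_pv(pv_count=10, iops_per_pv=500)  # 5000 reserved
shared = provisioned_iops_shared(pool_iops=2000)                   # one shared pool

print(dedicated, shared)  # 5000 2000
print(f"over-provisioning avoided: {dedicated - shared} IOPS")
```

Because bursty volumes rarely peak together, a single pool-level budget can cover them with far fewer reserved IOPS than ten dedicated allocations would require.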

Rapid scale-out of stateful pods, achieved by using remote network protocols like NVMe-oF and iSCSI to mount PVs, enabling effortless scaling on AKS across compute and storage. This is especially beneficial for container deployments that start small and iteratively add resources. Responsiveness is key to ensuring that applications are not starved or disrupted, either during initialization or while scaling in production. Application resiliency also matters: pod respawns across the cluster require rapid PV movement. Leveraging remote network protocols allows us to couple tightly with the pod lifecycle to support highly resilient, high-scale stateful applications on AKS.

A simplified, consistent volume management interface backed by local and remote storage options, enabling customers to allocate and use storage via the Kubernetes control plane. This means customers can leverage ephemeral disks, Azure Disks, and Azure Elastic SAN through a unified management interface to meet workload needs. For instance, ephemeral storage may be preferable for Cassandra to achieve the lowest latency, while Azure Disks is suitable for PostgreSQL or other database solutions. This unified experience simplifies the management of persistent volumes while delivering a comprehensive solution for the broad range of performance requirements of containerized workloads.
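Allocating storage via the Kubernetes control plane means an application requests a volume with an ordinary PersistentVolumeClaim against a storage class that Azure Container Storage backs. A minimal sketch follows; the storage class name and its backing configuration here are assumptions for illustration, not the service's exact API surface, so check the Azure Container Storage documentation for the real class names and parameters:

```yaml
# Illustrative only: "acstor-azuredisk" is an assumed storage class name
# representing a pool backed by Azure Disks.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: postgres-data
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: acstor-azuredisk   # assumed Azure Container Storage class
  resources:
    requests:
      storage: 32Gi
```

A pod then references `postgres-data` in its volume spec, and the storage controllers handle provisioning and attachment from the underlying storage pool.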

Fully integrated day-2 experiences, including data protection, cross-cluster recovery, and observability, providing operational simplicity for customers who today must write custom scripts or stitch together disparate tools. Customers can orchestrate Kubernetes-aware backup of persistent volumes, integrated with AKS, to streamline the end-to-end experience of running stateful container workloads on Azure.

Source: microsoft.com

Saturday, 10 December 2022

Azure Storage Mover–A managed migration service for Azure Storage

File storage is a critical part of any organization’s on-premises IT infrastructure. As organizations migrate more of their applications and user shares to the cloud, they often face challenges in migrating the associated file data. Having the right tools and services is essential to successful migrations.

Across workloads, there can be a wide range of file sizes, counts, types, and access patterns. In addition to supporting a variety of file data, migration services must minimize downtime, especially on mission-critical file shares.

In February of 2022, we launched the Azure file migration program that provides no-cost migration to our customers, via a choice of storage migration partners.

Today, we are adding another choice for file migration with the preview launch of Azure Storage Mover, which is a fully managed, hybrid migration service that makes migrating files and folders into Azure a breeze.

The key capabilities of the Azure Storage Mover preview are:

NFS share to Azure blob container


With this preview release, we focus on the migration of an on-premises network file system (NFS) share to an Azure blob container. Storage Mover will support many additional source and target combinations over the coming months.


Cloud-driven migrations


Managing copy jobs at scale without a coordinating service can be time-consuming and error-prone. Individual jobs have to be monitored and any errors resolved, and it's hard to maintain the comprehensive oversight needed to ensure a complete and successful migration of your data.

With Azure Storage Mover, you can express your migration plan in Azure and, when you are ready, conveniently start and track migrations right from the Azure portal, PowerShell, or the CLI. This lets you use Azure Storage Mover for a one-time migration project or for any repeated data movement needs.

Azure Storage Mover is a hybrid service with migration agents that you’ll deploy close to your source storage. All agents can be managed from the same place in Azure, even if they are deployed across the globe.


Scale and performance


Many aspects contribute to a high-performance migration service. Fast data movement through the Azure Storage REST protocol and a clear separation of the management path from the data path are among the most important. Each agent will send your files and folders directly to the target storage in Azure.

Sending the data directly to the target optimizes migration performance because the data doesn't need to pass through a cloud service or through a different Azure region from the one where the target storage is deployed. This optimization is key, for example, for migrations across geographically diverse branch offices that will likely target Azure Storage in their own region.


What’s next for Storage Mover?


There are many steps in a cloud migration that need to happen before the first byte can be copied. A deep understanding of your data estate is essential to a balanced cloud solution design for your workloads.

When we combine that with a strategy to minimize downtime and to manage and monitor migration jobs at scale, we've arrived at our vision for the Storage Mover service. The roadmap for this vision includes:

◉ Support for more sources and Azure Storage targets.
◉ More options to tailor a migration to your needs.
◉ Automatically loading possible sources into the service. That’s more than just convenience; it enables large-scale migrations and reduces mistakes from manual input.
◉ Deep insights about selected sources for a sound cloud solution design.
◉ Provisioning target storage automatically based on your migration plan.
◉ Running post-migration tasks such as data validation, enabling data protection, and completing migration of the rest of the workload.


Source: microsoft.com

Thursday, 10 March 2022

The anatomy of a datacenter—how Microsoft's datacenter hardware powers the Microsoft Cloud


Leading hardware engineering at a company known for its vast portfolio of software applications and systems is not as strange as it sounds, as the Microsoft Cloud depends on hardware as the foundation of trust, reliability, capacity, and performance, to make it possible for Microsoft and our customers to achieve more. The underlying infrastructure that powers our 60 plus datacenter regions across 140 countries consists of hardware and systems that sit within the physical buildings of datacenters—enabling millions of customers to execute critical and advanced workloads, such as AI and quantum computing, as well as unleashing future innovations.

Datacenter hardware development is imperative to the evolution of the Microsoft Cloud

As the Microsoft Cloud offers services and products to meet the world’s ever-growing computing demands, it is critical that we continuously design and advance hardware systems and infrastructure to deliver greater performance, higher efficiency, and more resiliency to customers—all with security and sustainability in mind. Today, our hardware engineering efforts and investments focus heavily on roadmap and lifecycle planning, sourcing and provisioning of servers, and innovating to deliver next-generation infrastructure for datacenters. In our new Hardware Innovation blog series, I’ll be sharing some of the hardware development and investments that are driving the most impact for the Microsoft Cloud and making Azure the trusted cloud that delivers innovative, reliable, and sustainable hybrid cloud solutions. But first, let’s look “under the hood” of a Microsoft datacenter:

From server to cloud: the end-to-end cloud hardware lifecycle

Our hardware planning starts with what customers want: capacity, differentiated services, cost savings, and ultimately the ability to solve harder problems with the help of the Microsoft Cloud. We integrate key considerations such as customer feedback, operational analysis, technology vetting, and evaluation of disruptive innovations into our strategy and roadmap planning, improving the existing compute, network, and storage hardware in our datacenters while future-proofing for innovative workloads at scale. Our engineers then design, build, test, and integrate software and firmware into hardware fleets that meet a stringent set of quality, security, and compliance requirements before deploying them into Microsoft's datacenters across the globe.

Sourcing and provisioning cloud hardware, sustainably and securely

With Microsoft’s scale, the ways in which we provision, deploy, and decommission hardware parts have the potential to drive massive planetary impact. While we work with suppliers to reimagine a more resilient and efficient supply chain using technologies such as blockchain and digital twins, we also aim to have sustainability built into every step of the way. An example of our sustainability leadership is the execution of Microsoft Circular Centers, where servers and hardware that are being decommissioned are repurposed—efforts that are expected to increase the reuse of servers and components by up to 90 percent by 2025. I will be sharing more on our Circular Centers progress this year. We also have in place the Azure Security and Resiliency Architecture (ASRA) as an approach to drive security and resiliency consistently and comprehensively across the Microsoft Cloud infrastructure supply chain.

Innovating to deliver next-generation datacenter infrastructure

We are investigating and developing technology that would allow datacenters to be more agile, efficient, and sustainable to operate while meeting the computing demands of the future. We showcased development in datacenter energy efficiency, such as our two-phase liquid immersion cooling, allowing more densely packed servers to fit in smaller spaces, and addressing processor overclocking for higher computing efficiency with a lower carbon footprint. We also continue to invest in and develop workload-optimized infrastructure—from servers, racks, systems, to datacenter designs—for more custom general-purpose offerings as well as specialized compute such as AI, high-performance computing, quantum, and beyond.

Building the most advanced and innovative hardware for the intelligent cloud and the intelligent edge

The journey of building Microsoft Cloud’s hardware infrastructure is an exciting and humbling one as we see continual advancement in technology to meet the needs of the moment. I have been in the hardware industry for more than thirty years—yet, I’m more excited each day as I work alongside leaders and experts on our team, with our partners across the industry, and with the open source community. Like many of the cloud services that sit on top of it, Microsoft’s hardware engine runs on consistency in quality, reliability, and scalability. Stay tuned as we continue to share more deep dives and updates of our cloud hardware development, progress, and results—and work to drive forward technology advancement, enable new capabilities, and push the limits of what we can achieve in the intelligent cloud and the intelligent edge.

Source: azure.microsoft.com

Sunday, 12 September 2021

Improve availability with zone-redundant storage for Azure Disk Storage


As organizations continue to accelerate their cloud adoption, the need for reliable infrastructure is critical to ensure business continuity and avoid costly disruptions. Azure Disk Storage provides maximum resiliency for all workloads with an industry-leading zero percent annual failure rate and single-instance service-level agreements (SLA) for all disk types, including a best-in-class single-instance SLA of 99.9 percent availability for Azure Virtual Machines using Azure Premium SSD or Azure Ultra Disk Storage.

Today, we continue our investments to further improve the reliability of our infrastructure with the general availability of zone-redundant storage (ZRS) for Azure Disk Storage. ZRS enables you to increase availability for your critical workloads by providing the industry’s only synchronous replication of block storage across three zones in a region, enabling your disks to tolerate zonal failures which may occur due to natural disasters or hardware issues. ZRS is currently supported for Azure Premium SSDs and Azure Standard SSDs.

We have seen strong interest and great feedback from many enterprise customers from various industries during our preview. These customers are planning to use ZRS for disks to provide higher availability for a wide range of scenarios such as clustering for SAP and SQL Server workloads, container applications, and legacy applications.

Increase availability for your clustered or distributed applications

Last year, we announced the general availability of shared disks for Azure Disk Storage, the only shared block storage in the cloud that supports both Windows- and Linux-based clustered and distributed applications. This unique offering allows a single disk to be simultaneously attached to and used from multiple virtual machines (VMs), enabling you to run your most demanding enterprise applications in the cloud, such as clustered databases, parallel file systems, persistent containers, and machine learning applications, without compromising on well-known deployment patterns for fast failover and high availability. Customers can now further improve the availability of their clustered applications, like SQL Server failover cluster instances (FCI) and SAP ASCS/SCS leveraging Windows Server Failover Clustering (WSFC), with the combination of shared disks and ZRS.

Using Availability Zones for VMs, you can allocate primary and secondary VMs in different zones for higher availability and attach a shared ZRS disk to the VMs in different zones. If a primary VM goes down due to a zonal outage, WSFC will quickly failover to the secondary VM providing increased availability to your application. Customers can also use ZRS with shared disks for their Linux-based clustering applications that use IO fencing with SCSI persistent reservations. Shared disks are now available on all Premium SSD and Standard SSD sizes, enabling you to optimize for different price and performance options.
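Provisioning a shared ZRS disk and attaching it to VMs in two zones might look like the following Azure CLI sketch. The resource names are placeholders, and you would run this against your own resource group and a region that supports ZRS disks; `--sku Premium_ZRS` and `--max-shares` are the documented parameters for zone-redundant and shared disks:

```shell
# Sketch: create a shared, zone-redundant Premium SSD and attach it to
# two VMs placed in different availability zones. Names are placeholders.
az disk create \
  --resource-group myRG \
  --name mySharedZrsDisk \
  --size-gb 1024 \
  --sku Premium_ZRS \
  --max-shares 2

az vm disk attach --resource-group myRG --vm-name myVmZone1 --name mySharedZrsDisk
az vm disk attach --resource-group myRG --vm-name myVmZone2 --name mySharedZrsDisk
```

With both attachments in place, WSFC (or a Linux cluster using SCSI persistent reservations) arbitrates which node owns the disk at any given time.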

Figure 1: Multi-zone Windows Server Failover Cluster with shared ZRS disk

Take advantage of ZRS disks with multi-zone Azure Kubernetes Service clusters

Customers can take advantage of ZRS disks for their container applications hosted on multi-zone Azure Kubernetes Service (AKS) for higher reliability. If a zone goes down, AKS will automatically fail over the stateful pods to a healthy zone by detaching the ZRS disks and attaching them to nodes in the healthy zone. We recently released ZRS disk support in AKS through the CSI driver.

Achieve higher availability for legacy applications

You can achieve high availability for your workloads using application-level replication across two zones (such as SQL Always On). However, if you are using industry-specific proprietary software or legacy applications like older versions of SQL Server, which don't support application-level synchronous replication, ZRS disks will provide improved availability through storage-level replication. For example, if a zone goes down due to natural disasters or hardware failures, the ZRS disk will continue to be operational. If your VM in the affected zone becomes unavailable, you could use a virtual machine in another zone and attach the same ZRS disk.

Build highly available, cost-effective solutions

To build highly available software-as-a-service (SaaS) solutions, independent software vendors (ISVs) can take advantage of ZRS disks to increase availability while also reducing costs. Previously, ISVs would need to host VMs in two zones and replicate data between the VMs. This resulted in extra costs as they had to deploy twice the amount of locally redundant storage (LRS) disks to maintain two copies of data in two zones and an additional central processing unit (CPU) for replicating the data to two zones. ISVs can now use shared ZRS disks to deliver a more cost-effective solution with 1.5 times lower costs on the disks and no additional replication costs. In addition, ZRS disks also offer lower write latency than VM to VM replication of the data as the replication is performed by the platform. NetApp describes the value that ZRS provides them and their customers:

“Many customers wish to have their data replicated cross-zone to improve business continuity against zonal failures and reduce downtime. ZRS for Azure Disk Storage combined with shared disks is truly a game-changer for us as it enables us to improve the availability of our solution, allows applications to achieve their full performance, and reduces replication costs by offloading the replication to the backend infrastructure. NetApp is excited to extend its CVO High Availability solution using ZRS disks as this helps us provide a comprehensive high availability solution at lower costs for our mutual customers.”—Rajesh Rajaraman, Senior Technical Director at NetApp

Performance for ZRS disks

The IOPS and bandwidth provided by ZRS disks are the same as those of the corresponding LRS disks. For example, a P30 (128 GiB) LRS Premium SSD disk provides 5,000 IOPS and 200 MB/second of throughput, the same as a P30 ZRS Premium SSD disk. Disk latency for ZRS is higher than that of LRS due to the cross-zonal copy of data.

Source: microsoft.com

Thursday, 29 July 2021

Boost your client performance with Azure Files SMB Multichannel

Lower your deployment cost, while improving client performance with Server Message Block (SMB) Multichannel on premium tier.

Today, we are announcing the preview of Azure Files SMB Multichannel on the premium tier. SMB 3.0 introduced SMB Multichannel in Windows Server 2012 and the Windows 8 client. The feature allows SMB 3.x clients to establish multiple network connections to SMB 3.x servers for greater performance, over multiple network adapters or over a single network adapter with Receive Side Scaling (RSS) enabled. With this preview release, Azure Files SMB clients can now take advantage of SMB Multichannel with premium file shares in the cloud.

Benefits

SMB Multichannel allows multiple connections over the optimal network path, increasing performance through parallel processing. The gains come from bandwidth aggregation across multiple NICs, or from NICs that support Receive Side Scaling (RSS), which distributes IOs across multiple CPUs and dynamically load-balances them.


Benefits of Azure Files SMB Multichannel include:


◉ Higher throughput: Suitable for applications that move large files with large IOs, such as media and entertainment (content creation and transcoding), genomics, and financial services risk analysis.

◉ Increased IOPS: This is particularly useful for small IO scenarios such as database applications.

◉ Network fault tolerance: Multiple connections mean the loss of a single network connection does not disrupt the workload.

◉ Automatic configuration: Dynamic discovery and creation of multiple network paths.

◉ Cost optimization: Achieve higher scale from a single virtual machine (VM) client by utilizing full VM limits. To reach Azure Files premium bandwidth and IOPS scale, applications now require fewer VM clients to achieve the required scale.

Below is a sample test result that demonstrates performance improvements of up to three times with the SMB Multichannel feature enabled.


The above sample test results are based on internal testing performed by generating random IO load from a single virtual machine client (Standard_D32s_v3 with a single RSS-enabled NIC) connected to a premium file share. The Diskspd tool was used to generate load against ten files at a queue depth of 64 to simulate multithreaded workload patterns against multiple files in a share. The IOPS results are based on 4 KiB read and write IO sizes, while the throughput test uses 512 KiB read and write IO sizes to optimize for performance. A larger percentage gain was observed in write throughput than in read throughput.
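A Diskspd invocation approximating the setup described above could look like the following. This is a sketch, not the exact command Microsoft ran: the file paths are placeholders, and the thread count is an assumption, while the block size (`-b4K`), queue depth (`-o64`), random IO (`-r`), and cache-disabling (`-Sh`) flags map to the documented Diskspd parameters:

```shell
# Approximation of the described IOPS test: 4 KiB random reads, queue
# depth 64, caching disabled, against ten files on a mounted premium
# file share (Z:). For the throughput test, -b4K would become -b512K.
diskspd.exe -b4K -r -o64 -t8 -w0 -d60 -Sh ^
  Z:\test1.dat Z:\test2.dat Z:\test3.dat Z:\test4.dat Z:\test5.dat ^
  Z:\test6.dat Z:\test7.dat Z:\test8.dat Z:\test9.dat Z:\test10.dat
```

Setting `-w50` instead of `-w0` would produce a mixed read/write load for comparing read and write gains.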

Pricing and availability


SMB Multichannel for Azure Files premium storage accounts comes at no additional cost. In addition, as part of our continued commitment to providing the most cost-effective file storage for your workloads, we recently announced a more than 33 percent price reduction on the Azure Files premium tier.

Currently, SMB Multichannel preview on premium shares is available in limited regions for Windows SMB 3.x clients. We are quickly expanding the coverage to all Azure regions with premium tier. Stay up to date on region availability for SMB Multichannel by visiting the Azure Files documentation.

Getting started


Learn more about the feature's capabilities and SMB Multichannel performance in the Azure Files documentation. To get started, you will need to register your subscription for the SMB Multichannel feature preview. Once registration is complete, you can enable or disable SMB Multichannel on premium storage accounts (FileStorage) in one of the supported regions with the click of a button.
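As an alternative to the portal button, the setting can also be toggled from the Azure CLI. The command below reflects the file-service-properties surface in current Azure CLI releases; during the preview the exact registration and enablement steps may differ, so treat this as a hedged sketch and verify against the Azure Files documentation:

```shell
# Sketch: enable SMB Multichannel on a premium (FileStorage) account.
# Account and resource group names are placeholders.
az storage account file-service-properties update \
  --resource-group myRG \
  --account-name mypremiumfiles \
  --enable-smb-multichannel true
```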


Source: microsoft.com

Sunday, 9 May 2021

Azure Availability Zones in the South Central US datacenter region add resiliency


As businesses move more workstreams to the cloud, business continuity and data protection have never been more critical—and perhaps their importance has never been more visible than during the challenges and unpredictability of 2020. To continue our commitment to supporting stability and resiliency in the cloud, Microsoft is announcing the general availability of Azure Availability Zones from our South Central US datacenter region.

Azure Availability Zones are unique physical locations within an Azure region that each consist of one or more datacenters equipped with independent power, cooling, and networking. Availability Zones provide protection against datacenter failures and unplanned downtime. These are further supported by one of the top industry-leading service level agreements (SLA) of 99.99 percent virtual machine uptime.

For many companies, especially those in regulated industries who are increasingly moving their critical applications to the cloud, Availability Zones in South Central US provide the option for customers to choose the resiliency and business continuity options that support their business. Availability Zones provide our customers with added options for high availability with added fault tolerance against datacenter failures while supporting data protection and backup. Customers can choose to store data in the same datacenter, across zonal datacenters in the same region, or across geographically separated regions. Finally, data is protected against accidental customer deletion using role-based access control and immutable storage applied through forced retention policies.

Availability Zones in South Central US build upon a broader, rich set of resiliency features available with Azure that support customer resiliency. Key among these are:

◉ Azure Storage, SQL Database, and Cosmos DB all provide built-in data replication, both within a region and across regions.

◉ Azure managed disks are automatically placed in different storage scale units to limit the effects of hardware failures.

◉ Virtual machines (VMs) in an availability set are spread across several fault domains. A fault domain is a group of VMs that share a common power source and network switch. Spreading VMs across fault domains limits the impact of physical hardware failures, network outages, or power interruptions.

◉ Azure Site Recovery supports customers in disaster recovery scenarios across regions and zones.

The creation of Availability Zones in South Central US benefits our customers in many ways, including increased service availability guarantees, which reduce the chance of downtime or data loss should any failure occur. These zones also help ensure data storage protection for peace of mind. Data protection is our priority, even over recovery time: we can endure a longer outage and data can still be protected because of zone availability. Data is also replicated in triplicate. As we kick off a new year, resiliency and stability for our customers remain crucial. We're optimistic and excited to see the impact these Availability Zones will have on customers, their digital transformations, and ultimately their success.

Source: microsoft.com

Saturday, 20 March 2021

Azure Defender for Storage powered by Microsoft threat intelligence

With the reality of working from home, more people and devices are now accessing corporate data across home networks. This raises the risks of cyber-attacks and elevates the importance of proper data protection. One of the resources most targeted by attackers is data storage, which can hold critical business data and sensitive information.


To help Azure customers better protect their storage environment, Azure Security Center provides Azure Defender for Storage, which alerts customers upon unusual and potentially harmful attempts to access or exploit their storage accounts.

What’s new in Azure Defender for Storage

As with all Microsoft security products, customers of Azure Defender for Storage benefit from Microsoft threat intelligence to detect and hunt for attacks. Microsoft amasses billions of signals for a holistic view of the security ecosystem. These shared signals and threat intelligence enrich Microsoft products and allow them to offer context, relevance, and priority management to help security teams act more efficiently.

Based on these capabilities, Azure Defender for Storage now also alerts customers upon detection of malicious activities such as:

◉ Upload of potential malware (using hash reputation analysis).

◉ Phishing campaigns hosted on a storage account.

◉ Access from suspicious IP addresses, such as TOR exit nodes.

In addition, leveraging the advanced capabilities of Microsoft threat intelligence helps us enrich existing Azure Defender for Storage alerts and build future detections.

To benefit from Azure Defender for Storage, you can easily configure it on your subscription or storage accounts and start your 30-day trial today.
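Enabling the plan at the subscription level can also be done from the Azure CLI. The command below uses the `az security pricing` surface; the plan name `StorageAccounts` is the documented identifier for this Defender plan, but confirm it against the Security Center documentation for your CLI version:

```shell
# Sketch: turn on Azure Defender for Storage for the current subscription.
az security pricing create --name StorageAccounts --tier Standard

# Verify the resulting tier.
az security pricing show --name StorageAccounts
```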

Cyberattacks on cloud data stores are on the rise

Nowadays, more and more organizations place their critical business data assets in the cloud using PaaS data services. Azure Storage is among the most widely used of these services. The amount of data obtained and analyzed by organizations continues to grow at an increasing rate, and data is becoming increasingly vital in guiding critical business decisions.

With this rise in usage, the risks of cyberattacks and data breaches are also growing, especially for business-critical data and sensitive information. Cyber incidents cause organizations to lose money, data, productivity, and consumer trust. The average total cost of a data breach is $3.86 million. On average, it takes 280 days to identify and contain a breach, and 17 percent of cyberattacks involve malware.

It’s clear that organizations worldwide need protection, detection, and rapid-fire response mechanisms for these threats. Yet, on average, more than 99 days pass between infiltration and detection, which is like leaving the front door wide open for more than three months. Therefore, proper threat intelligence and detection are needed.

Azure Defender for Storage improved threat detections

1. Detecting upload of malware and malicious content

Storage accounts are widely used for data distribution; as a result, they may become infected with malware and spread it to additional users and resources. This can leave them vulnerable to attacks and exploits, putting sensitive organizational data at risk.

Malware reaching storage accounts was a top concern raised by our customers, and to help address it, Azure Defender for Storage now utilizes advanced hash reputation analysis to detect malware uploaded to storage accounts in Azure. This can help detect ransomware, viruses, spyware, and other malware uploaded to your accounts.

A security alert is automatically triggered upon detection of potential malware uploaded to an Azure Storage account.


In addition, an email notification is sent to the owner of the storage account:


Note that Azure Defender for Storage doesn’t currently offer malware scanning capabilities. Those who need to scan files or blobs for malware on upload might consider a third-party solution.

“Azure Blob Storage is a very powerful and cost-effective storage solution, allowing for fast and cheap storage and retrieval of large amounts of files. We use it on all our systems and often have millions of documents in Blob Storage for a given system. With PaaS solutions, it can, however, be a challenge to check files for malware before they are stored in Blob Storage. It is incredibly easy to enable the new “Malware Reputation Screening” for storage accounts at scale, it offers us a built-in basic level of protection against malware, which is often sufficient, thus saving us the overhead to set up and manage complex malware scanning solutions.”—Frans Lytzen, CTO at NewOrbit

In addition to malware, Azure Defender for Storage also alerts on the unusual upload of executable (.exe) and service package (.cspkg) files, which can be used to breach your environment.
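A toy version of that extension-based check might look like the following. The function names and the notion of an account's "usual" extensions are illustrative assumptions, not the service's actual logic.

```python
# File types the article calls out as risky when uploaded unexpectedly.
SUSPICIOUS_EXTENSIONS = {".exe", ".cspkg"}

def extension_of(blob_name: str) -> str:
    """Lowercased extension of a blob name, or '' if it has none."""
    _, dot, ext = blob_name.rpartition(".")
    return "." + ext.lower() if dot else ""

def is_unusual_risky_upload(blob_name: str, usual_extensions: set) -> bool:
    """Flag an upload when its file type is risky AND the account has
    not historically stored that type (a rough 'unusual' heuristic)."""
    ext = extension_of(blob_name)
    return ext in SUSPICIOUS_EXTENSIONS and ext not in usual_extensions
```

Conditioning on what the account usually stores keeps the alert from firing on accounts that legitimately distribute executables.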

2. Detecting phishing campaigns hosted on Azure Storage

Phishing is a type of social engineering attack often used to steal user data, including login credentials, credit card numbers, and other sensitive info. Email phishing attacks are among the most common types of phishing attacks, where cybercriminals spread a high volume of fake emails designed to trick visitors into entering their corporate credentials or financial information into a web form that looks genuine or to download attachments containing malware, such as ransomware.

Email phishing attacks are becoming more sophisticated, making it even harder for users to distinguish between legitimate and malicious messages. One way attackers make their phishing webpages look genuine, both to users and to security gateways, is to host those pages on certified cloud storage, such as Azure Storage.

Using dedicated storage accounts to host phishing content makes such accounts easier to detect and block, so attackers constantly try to sneak their phishing content and webpages into other organizations’ storage accounts that allow content uploads.

Microsoft threat intelligence amasses and analyzes several signals to better identify phishing campaigns, and Azure Defender for Storage can now alert you when it detects that one of your Azure Storage accounts hosts content used in a phishing attack affecting Microsoft 365 users.
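As a very rough illustration of what a storage-hosted phishing page looks like to a scanner: credential-harvesting pages typically contain an HTML form with a password field. This heuristic is entirely hypothetical; the real detection correlates Microsoft 365 phishing telemetry with storage-hosted content rather than inspecting HTML like this.

```python
import re

# Matches an <input> element declaring type="password" (or 'password').
PASSWORD_INPUT = re.compile(r"<input[^>]*type=[\"']password[\"']", re.IGNORECASE)

def looks_like_credential_form(html: str) -> bool:
    """Flag publicly readable HTML that contains a form with a
    password field -- a common credential-harvesting pattern."""
    return "<form" in html.lower() and bool(PASSWORD_INPUT.search(html))
```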

3. Detecting access from suspicious IP addresses

The reputation of client IP addresses that access Azure Storage is continuously monitored against a threat intelligence feed that aggregates data from first- and third-party sources, including honeypots, malicious IP address lists, botnets, malware detonation feeds, and analyst-based observations and collections.

This provides another layer of protection for Azure Storage: customers are alerted when IP addresses with questionable reputations access their storage accounts. Moreover, existing alerts, such as access from unusual locations, are enriched with information about the reputation of the anomalous client IP address, so customers now receive better-explained alerts with higher fidelity and more accurate severity.

Figure 1 illustrates how access to storage is analyzed by examining the reputation of the client IP address according to this feed.


Figure 1: Enriching Azure Storage Service access logs with the reputation of client IP.
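The enrichment flow in Figure 1 can be sketched as follows. The feed contents, log field names, and severity mapping are hypothetical, since Azure's real feed and log schema are not spelled out in the article.

```python
# Hypothetical threat intelligence feed mapping client IP -> reputation.
TI_FEED = {
    "198.51.100.23": {"category": "botnet", "confidence": "high"},
}

def enrich_access_log(entry: dict) -> dict:
    """Attach the client IP's reputation to a storage access log entry
    and raise an alert when the reputation is bad."""
    reputation = TI_FEED.get(entry["client_ip"])
    enriched = dict(entry, ip_reputation=reputation)
    if reputation is not None:
        enriched["alert"] = {
            "title": "Access from a suspicious IP address",
            "severity": "high" if reputation["confidence"] == "high" else "medium",
        }
    return enriched
```

Keeping the original log entry intact and adding reputation fields alongside it is what lets existing alerts (such as access from unusual locations) be enriched rather than replaced.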

This new alert has been vital in revealing and preventing cyberattacks that might otherwise have caused severe damage, as the two real customer case studies below show.

First case study: Detecting malicious access to critical customer data


Figure 2 depicts a hybrid architecture in which on-premises machines are monitored using the Security Center Log Analytics agent. These machines access a storage service through an on-premises gateway that has an external IP address. The storage account is protected by a firewall that permits access only from the dedicated gateway.


Figure 2: Attack on a hybrid environment uncovered by tracking IP reputation.

Two on-premises machines were infected by malware. Although the malware itself remained undetected, the compromise was exposed when the two machines were observed initiating access to a honeypot via the gateway. Azure Defender for Storage used the threat intelligence feed of IP addresses that had accessed a honeypot network, and the customer was alerted before the compromised machines could access critical customer data. Note that the firewall that was set up was not enough to guarantee that compromised machines could not reach critical data, so this detection was vital in uncovering the breach.

Second case study: Identifying potential malware infection from virtual machines


Figure 3 depicts a virtual machine (VM) infected with a bot that spreads innocent-looking malware to SMB-enabled file systems (such as Azure Files). The malware can be anything from an executable file or DLL to an Excel or Word file with macros enabled. The infected VM’s communication with its command-and-control server was intercepted and reported to the threat intelligence feed, so Azure Defender for Storage flagged access from the VM’s IP address even though the VM is not hosted in Azure. As soon as the infected VM copied a file to a protected Azure Storage account, the incident was reported as an alert to the customer, who immediately mitigated the risk, preventing further infection of customer machines.


Figure 3: A Keybase-infected VM stores a malicious file in Azure Storage.

Source: azure.microsoft.com