Saturday 28 September 2019

Azure Sentinel general availability: A modern SIEM reimagined in the cloud

This marks an important milestone in our journey to redefine Security Information and Event Management (SIEM) for the cloud era. With Azure Sentinel, enterprises worldwide can now keep pace with the exponential growth in security data, improve security outcomes without adding analyst resources, and reduce hardware and operational costs.

With the help of customers and partners, including feedback from over 12,000 trials during the preview, we have designed Azure Sentinel to bring together the power of Azure and AI to enable Security Operations Centers to achieve more. There are lots of new capabilities coming online this week. I’ll walk you through several of them here.

Collect and analyze a nearly limitless volume of security data


With Azure Sentinel, we are on a mission to improve security for the whole enterprise. Many Microsoft and non-Microsoft data sources are built right in and can be enabled in a single click. New connectors for Microsoft services like Cloud App Security and Information Protection join a growing list of third-party connectors to make it easier than ever to ingest and analyze data from across your digital estate.

Workbooks offer rich visualization options for gaining insights into your data. Use or modify an existing workbook or create your own.

Apply analytics, including Machine Learning, to detect threats


You can now choose from more than 100 built-in alert rules or use the new alert wizard to create your own. Alerts can be triggered by a single event, based on a threshold, by correlating different datasets (e.g., events that match threat indicators), or by using built-in machine learning algorithms.

We’re previewing two new Machine Learning approaches that offer customers the benefits of AI without the complexity. First, we apply proven off-the-shelf Machine Learning models for identifying suspicious logins across Microsoft identity services to discover malicious SSH access. By using transfer learning from existing Machine Learning models, Azure Sentinel can detect anomalies from a single dataset with accuracy. In addition, we use a Machine Learning technique called fusion to connect data from multiple sources, like Azure AD anomalous logins and suspicious Office 365 activities, to detect 35 different threats that span different points on the kill chain.

Expedite threat hunting, incident investigation, and response


Proactive threat hunting is a critical yet time-consuming task for Security Operations Centers. Azure Sentinel makes hunting easier with a rich hunting interface that features a growing collection of hunting queries, exploratory queries, and Python libraries for use in Jupyter Notebooks. Use these to identify events of interest and bookmark them for later reference.
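The hunting workflow above can be sketched in a few lines of Python of the kind you might run in a notebook. This is a minimal illustration over in-memory sample data; the field names (user, result, failures, location) are assumptions for the example, not a real Azure Sentinel schema.

```python
# Minimal hunting sketch over exported sign-in events: find accounts
# with many failed logins from more than one location and "bookmark"
# them for later reference. Field names are illustrative assumptions.
signins = [
    {"user": "alice", "result": 0,     "failures": 1,  "location": "US"},
    {"user": "bob",   "result": 50126, "failures": 25, "location": "US"},
    {"user": "bob",   "result": 50126, "failures": 30, "location": "RU"},
]

stats = {}
for event in signins:
    if event["result"] == 0:        # successful sign-in, not interesting here
        continue
    s = stats.setdefault(event["user"], {"failures": 0, "locations": set()})
    s["failures"] += event["failures"]
    s["locations"].add(event["location"])

# Hunt: accounts with >20 total failures spread across >1 location.
bookmarks = [u for u, s in stats.items()
             if s["failures"] > 20 and len(s["locations"]) > 1]
print(bookmarks)
```

In a real hunt the event list would come from a Log Analytics query rather than a literal, but the shape of the logic is the same: filter, aggregate, and bookmark the outliers.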

Incidents (formerly cases) contain one or more alerts that require further investigation. Incidents now support tagging, comments, and assignments. A new rules wizard allows you to decide which Microsoft alerts trigger the creation of incidents.

Using the new investigation graph preview, you can visualize and traverse the connections between entities like users, assets, applications, or URLs and related activities like logins, data transfers, or application usage to rapidly understand the scope and impact of an incident.

New actions and playbooks simplify the process of incident automation and remediation using Azure Logic Apps. Send an email to validate a user action, enrich an incident with geolocation data, block a suspicious user, and isolate a Windows machine.
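To make the automation idea concrete, here is a hypothetical sketch of the triage logic a playbook might encode. In practice this would run inside an Azure Logic Apps playbook; the payload fields used here ("severity", "entities", "kind") are illustrative assumptions, not the actual Azure Sentinel alert schema.

```python
# Hypothetical playbook-style triage: map an incoming alert payload to
# a list of remediation actions. Field names are assumptions for the
# sketch, not the real Azure Sentinel schema.
def choose_actions(alert: dict) -> list:
    actions = []
    if alert.get("severity") == "High":
        actions.append("isolate-machine")
    for entity in alert.get("entities", []):
        if entity.get("kind") == "account":
            actions.append(f"block-user:{entity['name']}")
        elif entity.get("kind") == "ip":
            actions.append(f"geolocate:{entity['address']}")
    return actions

alert = {
    "severity": "High",
    "entities": [
        {"kind": "account", "name": "bob"},
        {"kind": "ip", "address": "203.0.113.7"},
    ],
}
print(choose_actions(alert))
```

Each action string here stands in for a Logic Apps connector step, such as sending a validation email or calling an isolation API.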

Build on the expertise of Microsoft and community members


The Azure Sentinel GitHub repository has grown to over 400 detection, exploratory, and hunting queries, plus Azure Notebooks samples and related Python libraries, playbook samples, and parsers. The bulk of these were developed by our MSTIC security researchers based on their vast global security experience and threat intelligence.

Support managed Security Services Providers and complex customer instances


Azure Sentinel now works with Azure Lighthouse, empowering customers and managed security services providers (MSSPs) to view Azure Sentinel for multiple tenants without the need to navigate between tenants. We have worked closely with our partners to jointly develop a solution that addresses their requirements for a modern SIEM. 

DXC Technology, one of the largest global MSSPs, is a great example of this design partnership:

“Through our strategic partnership with Microsoft, and as a member of the Microsoft Security Partner Advisory Council, DXC will integrate and deploy Azure Sentinel into the cyber defense solutions and intelligent security operations we deliver to our clients,” said Mark Hughes, senior vice president and general manager, Security, DXC. “Our integrated solution leverages the cloud native capabilities and assets of Azure Sentinel to orchestrate and automate large volumes of security incidents, enabling our security experts to focus on the forensic investigation of high priority incidents and threats.”

Friday 27 September 2019

Stay on top of best practices with Azure Advisor alerts

To get the most out of your Azure investment and run as efficiently as possible, we recommend that you regularly review and optimize your resources for high availability, security, performance, and cost. That’s why we created Azure Advisor, a free Azure service that helps you quickly and easily optimize your Azure resources with personalized recommendations based on your usage and configurations.

But with so many priorities vying for your attention, it can be easy to miss remediating your Advisor recommendations. So, what’s a good way to stay on top of these critical optimizations that can save you money, boost performance, strengthen your security posture, and increase uptime?

Get notified about new recommendations with Advisor alerts


Advisor now offers user-configurable alerts so you can get automatically notified as soon as your best practice recommendations become available. Advisor alerts will allow you to act more quickly and efficiently to optimize your Azure resources and stay on top of your new recommendations.

You can configure these alerts to be triggered based on several factors:

◈ Recommendation category – high availability, performance, or cost.

◈ Business impact – high, medium, or low.

◈ Recommendation type – for example, right-size or shutdown underutilized virtual machines (VMs), enable VM backup, or use availability sets to improve fault tolerance.

You can also choose from a wide range of notification options, including email, SMS, push notification, webhook, IT service management integration with popular tools like ServiceNow, Automation runbooks, and more. Your notification preferences are configured using action groups, so you can repurpose any action groups you’ve already set up, such as those for your custom Azure Monitor alerts or Azure Service Health alerts.
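Under the hood, an Advisor alert is an activity log alert scoped to the "Recommendation" category, wired to an action group. The sketch below shows the general shape of that payload, assuming the documented condition fields (category, recommendationCategory, recommendationImpact); treat the exact field names and the action group path as assumptions to verify against the current ARM reference before use.

```python
import json

# Sketch of an activity-log-alert payload for Advisor recommendations.
# Condition fields and resource IDs are assumptions for illustration.
advisor_alert = {
    "location": "Global",
    "properties": {
        "enabled": True,
        "scopes": ["/subscriptions/<subscription-id>"],
        "condition": {
            "allOf": [
                # Fire only for Advisor recommendations...
                {"field": "category", "equals": "Recommendation"},
                # ...in the cost category, with high business impact.
                {"field": "properties.recommendationCategory", "equals": "Cost"},
                {"field": "properties.recommendationImpact", "equals": "High"},
            ]
        },
        "actions": {
            "actionGroups": [
                # Reuse an existing action group (email, SMS, webhook, ITSM...).
                {"actionGroupId": "/subscriptions/<subscription-id>/resourceGroups/rg/providers/microsoft.insights/actionGroups/ops-email"}
            ]
        },
    },
}
print(json.dumps(advisor_alert["properties"]["condition"], indent=2))
```

Because the notification side is just an action group reference, the same payload shape works whether you notify by email, SMS, or a ServiceNow webhook.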

Best practices for your Advisor alerts


As you get started with Advisor alerts, we have three tips for you.

First, start simple by choosing a few high impact recommendations that are important to your organization, based on your business goals and priorities. For example, you might have a leadership mandate to reduce costs by a certain percentage, in which case you might decide that “Right-size or shutdown underutilized VMs” is a critical recommendation for you. Then create an alert for that set of recommendations. You can always change your alert or add more later.

Second, consider who the right person is to notify about new recommendations and the best way to notify them. To streamline the process, it’s best to notify the individual or team who has the permission and authority to remediate the recommendation. In keeping with the “start simple” principle, you may wish to begin with email notifications, which are the most basic to configure and the least intrusive to receive. Again, you can always modify your preferences later.

Finally, once you’ve tackled the first two tips and are comfortable with Advisor alerts, start to explore automation scenarios. For example, you can automatically route a new best practice recommendation through your ticketing system and assign it to the right team for remediation. In some cases, you can even use a combination of Advisor alerts and Automation runbooks to automatically remediate the recommendation.

Thursday 26 September 2019

Azure Media Services' new AI-powered innovation

Animated character recognition, multilingual speech transcription and more now available


At Microsoft, our mission is to empower every person and organization on the planet to achieve more. The media industry exemplifies this mission. We live in an age where more content is being created and consumed in more ways and on more devices than ever. At IBC 2019, we’re delighted to share the latest innovations we’ve been working on and how they can help transform your media workflows.

Video Indexer adds support for animation and multilingual content


We made our award-winning Azure Media Services Video Indexer generally available at IBC last year, and this year it’s getting even better. Video Indexer automatically extracts insights and metadata such as spoken words, faces, emotions, topics and brands from media files, without you needing to be a machine learning expert. Our latest announcements include previews for two highly requested and differentiated capabilities for animated character recognition and multilingual speech transcription, as well as several additions to existing models available today in Video Indexer.

Animated character recognition


Animated content or cartoons are one of the most popular content types, but standard AI vision models built for human faces do not work well with them, especially if the content has characters without human features. In this new preview solution, Video Indexer joins forces with Microsoft’s Azure Custom Vision service to provide a new set of models that automatically detect and group animated characters and allow customers to then tag and recognize them easily via integrated custom vision models. These models are integrated into a single pipeline, which allows anyone to use the service without any previous machine learning skills. The results are available through the no-code Video Indexer portal or the REST API for easy integration into your own applications.

We built these animated character models in collaboration with select customers who contributed real animated content for training and testing. The value of the new functionality is well articulated by Andy Gutteridge, Senior Director, Studio & Post-Production Technology at Viacom International Media Networks, which was one of the data contributors: “The addition of reliable AI-based animated detection will enable us to discover and catalogue character metadata from our content library quickly and efficiently. Most importantly, it will give our creative teams the power to find the content they want instantly, minimize time spent on media management and allow them to focus on the creative.”

Multilingual identification and transcription


Some media assets like news, current affairs, and interviews contain audio with speakers using different languages. Most existing speech-to-text capabilities require the audio recognition language to be specified in advance, which is an obstacle to transcribing multilingual videos. Our new automatic spoken language identification for multilingual content leverages machine learning technology to identify the different languages used in a media asset. Once detected, each language segment undergoes an automatic transcription process in the language identified, and all segments are integrated back together into one transcription file consisting of multiple languages.

The resulting transcription is available both as part of the Video Indexer JSON output and as closed-caption files. The output transcript is also integrated with Azure Search, allowing you to immediately search across videos for the different language segments. Furthermore, the multi-language transcription is available as part of the Video Indexer portal experience, so you can view the transcript and identified language by time, jump to the specific places in the video for each language, and see the multi-language transcription as captions as the video plays. You can also translate the output back and forth into 54 different languages via the portal and API.
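To illustrate what consuming that output might look like, here is a small sketch that groups transcript lines by detected language. The JSON shape shown (a "transcript" list with "language", "text", and "instances" fields) mirrors the general style of Video Indexer insights but should be treated as an illustrative assumption, not the exact schema.

```python
# Sketch: group lines of a multilingual transcript by detected language.
# The insights shape below is an assumption for illustration.
insights = {
    "transcript": [
        {"text": "Good evening.", "language": "en-US",
         "instances": [{"start": "0:00:01", "end": "0:00:03"}]},
        {"text": "Bonsoir à tous.", "language": "fr-FR",
         "instances": [{"start": "0:00:04", "end": "0:00:06"}]},
    ]
}

by_language = {}
for line in insights["transcript"]:
    by_language.setdefault(line["language"], []).append(line["text"])

print(sorted(by_language))
```

The same grouping is what makes per-language search and per-language caption display possible downstream.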

Additional updated and improved models


We are also adding new models and improving existing ones within Video Indexer, including:

Extraction of people and locations entities


We’ve extended our current brand detection capabilities to also incorporate well-known names and locations, such as the Eiffel Tower in Paris or Big Ben in London. When these appear in the generated transcript or on-screen via optical character recognition (OCR), a specific insight is created. With this new capability, you can review and search by all people, locations and brands that appeared in the video, along with their timeframes, description, and a link to our Bing search engine for more information.

Editorial shot detection model


This new feature adds a set of “tags” in the metadata attached to an individual shot in the insights JSON to represent its editorial type (such as wide shot, medium shot, close up, extreme close up, two shot, multiple people, outdoor and indoor, etc.). These shot-type characteristics come in handy when editing videos into clips and trailers as well as when searching for a specific style of shots for artistic purposes.

Explore and read more about editorial shot type detection in Video Indexer.

Expanded granularity of IPTC mapping

Our topic inferencing model determines the topic of videos based on transcription, optical character recognition (OCR), and detected celebrities even if the topic is not explicitly stated. We map these inferred topics to four different taxonomies: Wikipedia, Bing, IPTC, and IAB. With this enhancement, we now include level-2 IPTC taxonomy.

Taking advantage of these enhancements is as easy as re-indexing your current Video Indexer library.

New live streaming functionality


We are also introducing two new live-streaming capabilities in preview to Azure Media Services.

Live transcription supercharges your live events with AI


Using Azure Media Services to stream a live event, you can now get an output stream that includes an automatically generated text track in addition to the video and audio content. This text track is created using AI-based live transcription of the audio of the contribution feed. Custom methods are applied before and after speech-to-text conversion in order to improve the end-user experience. The text track is packaged into IMSC1, TTML, or WebVTT, depending on whether you are delivering in DASH, HLS CMAF, or HLS TS.

Live linear encoding for 24/7 over-the-top (OTT) channels


Using our v3 APIs, you can create, manage, and stream live channels for OTT services and take advantage of all the other features of Azure Media Services like live to video on demand (VOD), packaging, and digital rights management (DRM).

New packaging features


Support for audio description tracks

Broadcast content frequently has an audio track that contains verbal explanations of on-screen action in addition to the normal program audio. This makes programming more accessible for vision-impaired viewers, especially if the content is highly visual. The new audio description feature enables a customer to annotate one of the audio tracks to be the audio description (AD) track, which in turn can be used by players to make the AD track discoverable by viewers.

ID3 metadata insertion

In order to signal the insertion of advertisements or custom metadata events on a client player, broadcasters often make use of timed metadata embedded within the video. In addition to SCTE-35 signaling modes, we now also support ID3v2 or other custom schemas defined by an application developer for use by the client application.

Microsoft Azure partners demonstrate end-to-end solutions


Bitmovin is debuting its Bitmovin Video Encoding and Bitmovin Video Player on Microsoft Azure. Customers can now use these encoding and player solutions on Azure and leverage advanced functionality such as 3-pass encoding, AV1/VVC codec support, multi-language closed captions, and pre-integrated video analytics for QoS, ad, and video tracking.

Evergent is showing its User Lifecycle Management Platform on Azure. As a leading provider of revenue and customer lifecycle management solutions, Evergent leverages Azure AI to enable premium entertainment service providers to improve customer acquisition and retention by generating targeted packages and offers at critical points in the customer lifecycle.

Haivision will showcase its intelligent media routing cloud service, SRT Hub, which helps customers transform end-to-end workflows, starting with ingest using Azure Data Box Edge and media workflow transformation using Hublets from Avid, Telestream, Wowza, Cinegy, and Make.tv.

SES has developed a suite of broadcast-grade media services on Azure for its satellite connectivity and managed media services customers. SES will show solutions for fully managed playout services, including master playout, localized playout and ad detection and replacement, and 24x7 high-quality multichannel live encoding on Azure.

SyncWords is making its caption automation technology and user-friendly cloud-based tools available on Azure. These offerings will make it easier for media organizations to add automated closed captioning and foreign language subtitling capabilities to their real-time and offline video processing workflows on Azure.

Global design and technology services company Tata Elxsi has integrated TEPlay, its OTT platform SaaS, with Azure Media Services to deliver OTT content from the cloud. Tata Elxsi has also brought FalconEye, its quality of experience (QoE) monitoring solution that focuses on actionable metrics and analytics, to Microsoft Azure.

Verizon Media is making its streaming platform available in beta on Azure. Verizon Media Platform is an enterprise-grade managed OTT solution including DRM, ad insertion, one-to-one personalized sessions, dynamic content replacement, and video delivery. The integration brings simplified workflows, global support and scale, and access to a range of unique capabilities available on Azure.

Many of our partners will also be presenting in the theater at our booth, so make sure you stop by to catch them!

Tuesday 24 September 2019

How to develop your service health alerting strategy

Service issues are anything that could affect your availability, from outages and planned maintenance to service transitions and retirements. While rare—and getting rarer all the time, thanks to innovations in impactless maintenance and disciplines like site reliability engineering—service issues do occur, which is why service health alerting is such a critical part of successfully managing cloud operations. It’s all about helping your team understand the status and health of your environment so you can act quickly in the event of an issue. That can mean taking corrective measures like failing over to another region to keep your app running or simply communicating with your stakeholders so they know what’s going on.

In this blog, we’ll cover how you can develop an effective service health alerting strategy and then make it real with Azure Service Health alerts.

How Azure Service Health alerts work


Azure Service Health is a free Azure service that provides alerts and guidance when Azure service issues like outages and planned maintenance affect you. Azure Service Health is available in the portal as a dashboard where you can check active, upcoming, and past issues.

Of course, you may not want to check the Azure Service Health dashboard regularly. That’s why Azure Service Health also offers alerts, which automatically notify you if there’s an issue affecting you, via your preferred channel: email, SMS, mobile push notification, a webhook into your internal ticketing system like ServiceNow or PagerDuty, and more.

If you’re new to Azure Service Health alerts, you’ll notice that there are many choices to make during the configuration process. Who should I alert about which services and regions? Who should I alert for which types of health events: outages, planned maintenance, or health advisories? And what type of notification should I use: email, SMS, push notification, webhook, or something else?

To answer these questions the right way, you’ll need to have a conversation with your team and develop your service health alerting strategy.

How to develop your service health alerting strategy with your team


There are three key considerations for your team to address when you set up your Azure Service Health alerts.

First, think about criticality. How important is a given subscription, service, or region? If it’s production, you’ll want to set up an alert for it, but dev/testing might be unnecessary. Azure Service Health is personalized, so we won’t trigger your alert if the service issue affects a service or region you aren’t using.

Next, decide who to inform in the event of an issue. Who is the right person or team to tell about a service issue so they can act? For example, send Azure SQL or Azure Cosmos DB issues to your database team.

Finally, agree on how to inform that individual or team. What is the right communication channel for the message? Email is noisy, so it might take longer for your teams to respond. That’s fine for planned maintenance that’s weeks away, but not for an outage affecting you right now, in which case you’ll want to alert your on-call team using a channel that’s immediately seen, like a push notification or SMS. Or if you’re a larger or more mature organization, plug the alerts into your existing problem management system using a webhook/ITSM connection so you can follow your normal workflow.
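The three considerations above boil down to a routing table: for each event type, who gets told and through which channel. The sketch below encodes that strategy; the event-type names correspond to the Service Health categories discussed here, while the teams and channels are illustrative assumptions for your own organization to fill in.

```python
# Sketch of a service health routing strategy: event type -> owner + channel.
# Teams and channels are illustrative assumptions.
ROUTING = {
    "ServiceIssue":       {"team": "on-call",  "channel": "sms"},    # act now
    "PlannedMaintenance": {"team": "platform", "channel": "email"},  # weeks away
    "HealthAdvisory":     {"team": "platform", "channel": "email"},
}

def route(event_type: str) -> dict:
    # Default unrecognized events to the most visible channel.
    return ROUTING.get(event_type, {"team": "on-call", "channel": "sms"})

print(route("ServiceIssue")["channel"])
```

Writing the strategy down like this, even informally, makes the alert configuration conversation with your team much faster, since each row maps directly to one Service Health alert rule and action group.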

For more information on Azure Service Health, how to set up alerts, and other critical guidance for handling service issues, including, in some cases, avoiding their impact altogether, check out the Azure Service Health overview video.

Set up your Azure Service Health alerts today


Once you’ve had your Azure Service Health alerting conversation with your team and developed your strategy, configure your Azure Service Health alerts in the Azure Portal.

Sunday 22 September 2019

Three ways to leverage composite indexes in Azure Cosmos DB

Composite indexes were introduced in Azure Cosmos DB at Microsoft Build 2019. With our latest service update, additional query types can now leverage composite indexes. In this post, we’ll explore composite indexes and highlight common use cases.

Index types in Azure Cosmos DB


Azure Cosmos DB currently has the following index types that are used for the following types of queries:

Range indexes:

◈ Equality queries
◈ Range queries
◈ ORDER BY queries on a single property
◈ JOIN queries

Spatial indexes:

◈ Geospatial functions

Composite indexes:

◈ ORDER BY queries on multiple properties
◈ Queries with a filter as well as an ORDER BY clause
◈ Queries with a filter on two or more properties

Composite index use cases


By default, Azure Cosmos DB will create a range index on every property. For many workloads, these indexes are enough, and no further optimizations are necessary. Composite indexes can be added in addition to the default range indexes. Composite indexes have both a path and order (ASC or DESC) defined for each property within the composite index.

ORDER BY queries on multiple properties


If a query has an ORDER BY clause with two or more properties, a composite index is required. For example, the following query requires a composite index defined on age and name (age ASC, name ASC):

SELECT * FROM c ORDER BY c.age ASC, c.name ASC

This query will sort all results in ascending order by the value of the age property. If two documents have the same age value, the query will sort the documents by name.
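The indexing policy behind this example can be sketched as follows. This uses the documented Azure Cosmos DB indexing-policy shape, adding a composite index on age and name (both ascending) on top of the default range indexes:

```python
import json

# Indexing policy with a composite index supporting
# ORDER BY c.age ASC, c.name ASC.
indexing_policy = {
    "indexingMode": "consistent",
    "includedPaths": [{"path": "/*"}],   # default range indexes remain
    "excludedPaths": [],
    "compositeIndexes": [
        [
            {"path": "/age",  "order": "ascending"},
            {"path": "/name", "order": "ascending"},
        ]
    ],
}
print(json.dumps(indexing_policy["compositeIndexes"][0]))
```

Note that both the paths and the sort orders in the composite index must line up with the ORDER BY clause for the index to serve the query.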

Queries with a filter as well as an ORDER BY clause


If a query has a filter as well as an ORDER BY clause on different properties, a composite index will improve performance. For example, the following query will require fewer request units (RUs) if a composite index on name and age is defined and the query is updated to include the name in the ORDER BY clause:

Original query utilizing range index:

SELECT * FROM c WHERE c.name = "Tim" ORDER BY c.age ASC

Revised query utilizing a composite index on name and age:

SELECT * FROM c WHERE c.name = "Tim" ORDER BY c.name ASC, c.age ASC

While a composite index will significantly improve query performance, you can still run the original query successfully without a composite index. When you run the revised query with a composite index, it will sort documents by the age property. Since all documents matching the filter have the same name value, the query will return them in ascending order by age.

Queries with a filter on multiple properties


If a query has a filter with two or more properties, adding a composite index will improve performance.

Consider the following query:

SELECT * FROM c WHERE c.name = "Tim" AND c.age > 18

In the absence of a composite index on (name ASC, age ASC), we will utilize a range index for this query. We can improve the efficiency of this query by creating a composite index on name and age.

Queries with multiple equality filters and a maximum of one range filter (such as >,<, <=, >=, !=) will utilize the composite index. In some cases, if a query can’t fully utilize a composite index, it will use a combination of the defined composite indexes and range indexes.

Composite index performance benefits


We can run some sample queries to highlight the performance benefits of composite indexes. We will use a nutrition dataset that is used in Azure Cosmos DB labs.

In this example, we will optimize a query that has a filter as well as an ORDER BY clause. We will start with the default indexing policy, which indexes all properties with a range index, and execute the query in the Azure Portal to observe its query metrics.

This query, with the default indexing policy, required 21.8 RUs.

Adding a composite index on foodGroup and _ts and updating the query text to include foodGroup in the ORDER BY clause significantly reduced the query’s RU charge.

After adding a composite index, the query’s RU charge decreased from 21.8 RUs to only 4.07 RUs. This query optimization will be particularly impactful as the total data size increases. The benefits of a composite index are significant when the properties in the ORDER BY clause have a high cardinality.

Creating composite indexes


You can learn more about creating composite indexes in this documentation. It’s simple to update the indexing policy directly through the Azure Portal. When creating a composite index for data that’s already in Azure Cosmos DB, the index update will utilize the RUs left over from normal operations. After the new indexing policy is defined, Azure Cosmos DB will automatically index properties with a composite index as they’re written.

Explore whether composite indexes will improve RU utilization for your existing workloads on Azure Cosmos DB.

Saturday 21 September 2019

Announcing Azure Private Link

Customers love the scale of Azure, which gives them the ability to expand across the globe while remaining highly available. With the rapidly growing adoption of Azure, customers’ need to access data and services privately and securely from their own networks has grown exponentially.

Azure Private Link is a secure and scalable way for Azure customers to consume Azure Services like Azure Storage or SQL, Microsoft Partner Services or their own services privately from their Azure Virtual Network (VNet). The technology is based on a provider and consumer model where the provider and the consumer are both hosted in Azure. A connection is established using a consent-based call flow and once established, all data that flows between the service provider and service consumer is isolated from the internet and stays on the Microsoft network. There is no need for gateways, network address translation (NAT) devices, or public IP addresses to communicate with the service.

Azure Private Link brings Azure services inside the customer’s private VNet. The service resources can be accessed using the private IP address just like any other resource in the VNet. This significantly simplifies the network configuration by keeping access rules private.

Today we would like to highlight a few unique key use cases that are made possible by the Azure Private Link announcement:

Private connectivity to Azure PaaS services


Multi-tenant shared services such as Azure Storage and Azure SQL Database are outside your VNet and have been reachable only via the public interface. Today, you can secure this connection using VNet service endpoints, which keep the traffic within the Microsoft backbone network and allow the PaaS resource to be locked down to just your VNet. However, the PaaS endpoint is still served over a public IP address and therefore not reachable from on-premises through Azure ExpressRoute private peering or VPN gateway. With today’s announcement of Azure Private Link, you can simply create a private endpoint in your VNet and map it to your PaaS resource (your Azure Storage account blob or SQL Database server). These resources are then accessible over a private IP address in your VNet, enabling connectivity from on-premises through Azure ExpressRoute private peering and/or VPN gateway, and keeping the network configuration simple by not opening it up to public IP addresses.
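The mapping described above is expressed as a private endpoint resource. The sketch below shows the general shape of that resource for a SQL Database server, following the Microsoft.Network/privateEndpoints style; the property names and the placeholder resource IDs are assumptions to verify against the current ARM reference.

```python
# Sketch of a private endpoint resource mapping a SQL Database server
# into a VNet subnet. IDs are placeholders; property names are
# assumptions based on the privateEndpoints resource shape.
private_endpoint = {
    "type": "Microsoft.Network/privateEndpoints",
    "name": "sql-private-endpoint",
    "location": "westus2",
    "properties": {
        # The subnet whose private IP space the PaaS resource joins.
        "subnet": {"id": "/subscriptions/<sub-id>/resourceGroups/rg/providers/Microsoft.Network/virtualNetworks/vnet/subnets/default"},
        "privateLinkServiceConnections": [
            {
                "name": "sql-connection",
                "properties": {
                    # The specific PaaS resource being mapped.
                    "privateLinkServiceId": "/subscriptions/<sub-id>/resourceGroups/rg/providers/Microsoft.Sql/servers/myserver",
                    "groupIds": ["sqlServer"],
                },
            }
        ],
    },
}
print(private_endpoint["properties"]["privateLinkServiceConnections"][0]["properties"]["groupIds"])
```

Note that the endpoint targets one specific resource (one server), which is exactly the per-resource mapping that the data exfiltration protection discussed later relies on.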

Private connectivity to your own service


This new offering is not limited to Azure PaaS services; you can leverage it for your own service as well. Today, as a service provider in Azure, you have to make your service accessible over a public interface (IP address) in order for it to be accessible to other consumers running in Azure. You could use VNet peering and connect to the consumer’s VNet to make it private, but that is not scalable and will soon run into IP address conflicts. With today’s announcement, you can run your service completely private in your own VNet behind an Azure Standard Load Balancer, enable it for Azure Private Link, and allow it to be accessed by consumers running in a different VNet, subscription, or Azure Active Directory (AD) tenant, all using simple clicks and an approval call flow. As a service consumer, all you have to do is create a private endpoint in your own VNet and consume the Azure Private Link service completely privately, without opening your access control lists (ACLs) to any public IP address space.


Private connectivity to SaaS service

Many Microsoft partners already offer software-as-a-service (SaaS) solutions to Azure customers today. These solutions are offered over public endpoints, so to consume them, Azure customers must open their private networks to the public internet. Customers have long asked to consume these SaaS solutions within their private networks, as if they were deployed right inside them. With Azure Private Link, we’re extending the private connectivity experience to Microsoft partners. This is a very powerful mechanism for Microsoft partners to reach Azure customers, and we’re confident that many future Azure Marketplace offerings will be made available through Azure Private Link.

Key highlights of Azure Private Link

◈ Private on-premises access: Since PaaS resources are mapped to private IP addresses in the customer’s VNet, they can be accessed via Azure ExpressRoute private peering. This effectively means that the data will traverse a fully private path from on-premises to Azure. The configuration in the corporate firewalls and route tables can be simplified to allow access only to the private IP addresses.

◈ Data exfiltration protection: Azure Private Link is unique in that it maps a specific PaaS resource to a private IP address, as opposed to mapping an entire service as other cloud providers do. This essentially means that any malicious attempt to exfiltrate data to a different account over the same private endpoint will fail, providing built-in data exfiltration protection.

◈ Simple to set up: Azure Private Link is simple to set up, with minimal networking configuration needed. Connectivity works on an approval call flow, and once a PaaS resource is mapped to a private endpoint, connectivity works out of the box without any additional configuration of route tables or Azure Network Security Groups (NSGs).

◈ Overlapping address space: Traditionally, customers use VNet peering as the mechanism to connect multiple VNets. VNet peering requires the VNets to have non-overlapping address spaces, but in enterprise use cases it is common to find networks with overlapping IP address space. Azure Private Link provides an alternative way to privately connect applications in different VNets that have overlapping IP address spaces.


Roadmap


Today, we’re announcing the Azure Private Link preview in a limited set of regions. We will be expanding to more regions in the near future. In the coming months, we will also be adding more Azure PaaS services to Azure Private Link, including Azure Cosmos DB, Azure Database for MySQL, Azure Database for PostgreSQL, Azure Database for MariaDB, Azure App Service, and Azure Key Vault, as well as partner services.

Thursday 19 September 2019

SAP on Azure Architecture - Designing for security

Great SAP on Azure Architectures are built on the pillars of security, performance and scalability, availability and recoverability, and efficiency and operations.

Microsoft investments in Azure Security


Microsoft invests $1 billion annually on security research and development and has 3,500 security professionals employed across the company. Advanced AI is leveraged to analyze 6.5 trillion global signals from the Microsoft cloud platforms and detect and respond to threats. Enterprise-grade security and privacy are built into the Azure platform including enduring, rigorous validation by real world tests, such as the Red Team exercises. These tests enable Microsoft to test breach detection and response as well as accurately measure readiness and impacts of real-world attacks, and are just one of the many operational processes that provide best-in-class security for Azure.

Azure is the platform of trust, with 90 compliance certifications spanning nations, regions, and specific industries such as health, finance, government, and manufacturing. Moreover, Azure Security and Compliance Blueprints can be used to easily create, deploy, and update your compliant environments.

Security – a shared responsibility


It’s important to understand the shared responsibility model between you as a customer and Microsoft. The division of responsibility is dependent on the cloud model used - SaaS, PaaS, or IaaS. As a customer, you are always responsible for your data, endpoints, account/access management, irrespective of the chosen cloud deployment.

SAP on Azure is delivered using the IaaS cloud model, which means security protections are built into the service by Microsoft at the physical datacenter, physical network, and physical hosts. However, for all areas beyond the Azure hypervisor, i.e., the operating system and applications, customers need to ensure their enterprise security controls are implemented.


Key security considerations for deploying SAP on Azure


Role-based access control & resource locking

Role-based access control (RBAC) is an authorization system which provides fine-grained access for the management of Azure resources. RBAC can be used to limit access and control permissions on Azure resources for the various teams within your IT operations.

For example, the SAP basis team members can be granted permission to deploy virtual machines (VMs) into Azure virtual networks (VNets) while being restricted from creating or configuring VNets. Conversely, members of the networking team can create and configure VNets, but are prohibited from deploying or configuring VMs in the VNets where SAP applications are running.

We recommend validating and testing the RBAC design early during the lifecycle of your SAP on Azure project.
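The division of duties described above can be sketched with built-in Azure roles. The group names and resource group names below are hypothetical placeholders.

```shell
# Allow the SAP basis team to manage VMs in the SAP resource group,
# without granting rights to create or change virtual networks.
az role assignment create \
  --assignee sap-basis-team@contoso.com \
  --role "Virtual Machine Contributor" \
  --scope $(az group show --name SAP-PRD-RG --query id --output tsv)

# Give the networking team control over VNets only.
az role assignment create \
  --assignee network-team@contoso.com \
  --role "Network Contributor" \
  --scope $(az group show --name SAP-NET-RG --query id --output tsv)
```

Testing these assignments early, as recommended above, surfaces gaps before the teams are blocked mid-deployment.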

Another important consideration is Azure resource locking, which can be used to prevent accidental deletion or modification of Azure resources such as VMs and disks. It is recommended to create the required Azure resources at the start of your SAP project. When all additions, moves, and changes are finished and the SAP on Azure deployment is operational, all resources can be locked. Thereafter, only a super administrator can unlock a resource and permit it (such as a VM) to be modified.
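A minimal sketch of such a lock, assuming placeholder resource names:

```shell
# Prevent accidental deletion of a production SAP database VM.
az lock create \
  --name lock-sap-db-vm \
  --lock-type CanNotDelete \
  --resource-group SAP-PRD-RG \
  --resource sap-db-vm \
  --resource-type Microsoft.Compute/virtualMachines
```

A `CanNotDelete` lock still allows modification; use `ReadOnly` to block changes as well. Deleting the lock (which requires appropriate permissions) restores normal management of the resource.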

Secure authentication

Single-sign-on (SSO) provides the foundation for integrating SAP and Microsoft products, and for years Kerberos tokens from Microsoft Active Directory have been enabling this capability for both SAP GUI and web-browser based applications when combined with third party security products.

When a user logs on to their workstation and successfully authenticates against Microsoft Active Directory, they are issued a Kerberos token. That token can then be used by a third-party security product to handle authentication to the SAP application without the user having to re-authenticate. Additionally, data in transit from the user’s front end to the SAP application can be encrypted by integrating the security product with secure network communications (SNC) for DIAG (SAP GUI) and RFC, and with SPNEGO for HTTPS.

Azure Active Directory (Azure AD) with SAML 2.0 can also be used to provide SSO to a range of SAP applications and platforms such as SAP NetWeaver, SAP HANA and the SAP Cloud Platform.

Protecting your application and data from network vulnerabilities

Network security groups (NSG) contain a list of security rules that allow or deny network traffic to resources within your Azure VNet. NSGs can be associated to subnets or individual network interfaces attached to VMs. Security rules can be configured based on source/destination, port, and protocol.
NSGs influence network traffic for the SAP system. In the diagram below, three subnets are implemented, each with an NSG assigned - FE (front end), App, and DB.

1. A public internet user can reach the SAP Web Dispatcher over port 443.
2. The SAP Web Dispatcher can reach the SAP application server over port 443.
3. The App subnet accepts traffic on port 443 from 10.0.0.0/24.
4. The SAP application server sends traffic on port 30015 to the SAP DB server.
5. The DB subnet accepts traffic on port 30015 from 10.0.1.0/24.
6. Public internet access is blocked on both the App subnet and the DB subnet.
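Rules 3 and 5 above can be sketched with the Azure CLI as follows. The NSG and resource group names are placeholders; the subnet prefixes and ports come from the diagram.

```shell
# App subnet NSG: allow HTTPS (443) only from the front-end subnet (10.0.0.0/24).
az network nsg rule create \
  --resource-group SAP-PRD-RG \
  --nsg-name nsg-app \
  --name Allow-FE-To-App-443 \
  --priority 100 \
  --direction Inbound --access Allow --protocol Tcp \
  --source-address-prefixes 10.0.0.0/24 \
  --destination-port-ranges 443

# DB subnet NSG: allow the DBMS port (30015) only from the app subnet (10.0.1.0/24).
az network nsg rule create \
  --resource-group SAP-PRD-RG \
  --nsg-name nsg-db \
  --name Allow-App-To-DB-30015 \
  --priority 100 \
  --direction Inbound --access Allow --protocol Tcp \
  --source-address-prefixes 10.0.1.0/24 \
  --destination-port-ranges 30015
```

Because NSGs deny inbound traffic from the internet by default once explicit allow rules are in place, the "blocked" behavior in rule 6 falls out of the default rules rather than requiring an extra entry.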


SAP deployments using the Azure Virtual Datacenter architecture are implemented using a hub-and-spoke model. The hub VNet is the central point for connectivity, where an Azure Firewall or another type of network virtual appliance (NVA) is implemented to inspect and control the routing of traffic to the spoke VNets where your SAP applications reside.

Within your SAP on Azure project, it is recommended to validate that the inspection devices and NSG security rules are working as desired. This ensures that your SAP resources are shielded appropriately against network vulnerabilities.

Maintaining data integrity through encryption methods

Azure Storage service encryption is enabled by default on your Azure Storage account where it cannot be disabled. Therefore, customer data at rest on Azure Storage is secured by default where data is encrypted/decrypted transparently using 256-bit AES. The encrypt/decrypt process has no impact on Azure Storage performance and is cost free.  You have the option of Microsoft managing the encryption keys or you can manage your own keys with Azure Key Vault. Azure Key Vault can be used to manage your SSL/TLS certificates which are used to secure interfaces and internal communications within the SAP system.
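If you choose to manage your own keys, the switch to a customer-managed key in Azure Key Vault can be sketched as below. The vault and key names are placeholders, and this assumes the storage account has been given a managed identity with access to the key (via `--assign-identity` and a Key Vault access policy granting get, wrapKey, and unwrapKey permissions).

```shell
# Point the storage account's encryption at a customer-managed key in Key Vault.
az storage account update \
  --resource-group myRG \
  --name mystorageacct \
  --encryption-key-source Microsoft.Keyvault \
  --encryption-key-vault https://mykeyvault.vault.azure.net \
  --encryption-key-name storage-cmk
```

Reverting `--encryption-key-source` to `Microsoft.Storage` returns the account to Microsoft-managed keys.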

Azure also offers virtual machine disk encryption using BitLocker for Windows and DM-Crypt for Linux to provide volume encryption for virtual machine operating system and data disks. Disk encryption is not enabled by default.

Our recommended approach to encrypting your SAP data at rest is as follows:

◈ Azure Disk Encryption for SAP Application servers – operating system disk and data disks.

◈ Azure Disk Encryption for SAP Database servers – operating system disks and those data disks not used by the DBMS.

◈ SAP Database servers - leverage Transparent Data Encryption offered by the DBMS provider to secure your data and log files and to ensure the backups are also encrypted.
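The first two bullets can be sketched with Azure Disk Encryption as follows, assuming placeholder VM and Key Vault names:

```shell
# Encrypt the OS and data disks of an SAP application server VM
# with Azure Disk Encryption (BitLocker on Windows, DM-Crypt on Linux).
az vm encryption enable \
  --resource-group SAP-PRD-RG \
  --name sap-app-vm \
  --disk-encryption-keyvault myKeyVault \
  --volume-type ALL

# Verify the encryption status of the VM's disks.
az vm encryption show --resource-group SAP-PRD-RG --name sap-app-vm
```

For database servers, you would typically restrict `--volume-type` to the OS disk (and any non-DBMS data disks) and rely on the DBMS's Transparent Data Encryption for the database files themselves, per the third bullet.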

Hardening the operating system


Security is a shared responsibility between Microsoft and you as a customer, and your customer-specific security controls need to be applied to the operating system, database, and SAP application layer. For example, you need to ensure the operating system is hardened to eradicate vulnerabilities which could lead to attacks on the SAP database.

Windows, SUSE Linux, Red Hat Linux, and others are supported for running SAP applications on Azure, and various images of these operating systems are available in the Azure Marketplace. You can further harden these images to comply with the security policies of your enterprise and with guidance from the Center for Internet Security (CIS) Microsoft Azure Foundations Benchmark.

Enterprises generally have operational processes in place for updating and patching of their IT software including the operating system. Once an operating system vulnerability has been exposed, it is published in security advisories and usually remediated quickly. The operating system vendor regularly provides security updates and patches. You can use the Update Management solution in Azure Automation to manage operating system updates for your Windows and Linux VMs in Azure. A best practice approach is a selective installation of security updates for the operating system on a regular cadence and installation of other updates such as new features during maintenance windows.

Tuesday 17 September 2019

Announcing user delegation SAS tokens preview for Azure Storage Blobs

Cloud storage often serves as a content source for browser and mobile applications. This is typically achieved using application-issued, pre-authorized URLs which provide time-limited access directly to specific content without requiring a service to proxy this access.

Azure Storage supports this pattern through the use of shared access signature (SAS) tokens. These tokens grant specific, time-limited access to storage objects by signing an authorization statement using the storage account access key, which is controlled by account administrators. While this approach provides the required limited access to clients, it often over-provisions access to the token-issuing services: the account key grants full control over the entire account, when the service may only require read access to specific content.

Today we are launching a preview for a new kind of SAS token, user delegation SAS tokens. By extending the recent release of Azure AD and Azure role-based access control (RBAC) for Azure Storage, lower-privileged users and services can now delegate subsets of their access to clients, using this new type of pre-authorized URL. Clients retrieve a user delegation key tied to their Azure Active Directory (AD) account, and then use it to create SAS tokens granting a subset of their own access rights.

This means, for example, that an application component with only read access to end-user content could be configured to issue short lived read-only URLs to clients without the risks involved with storing and using the powerful account access key. Azure Storage access logs will also reflect client use of these SAS tokens as associated with the Azure AD principal of this application component.


During this preview, you can generate user delegation SAS tokens with your own code or use Azure PowerShell or Azure CLI. Remember, you will first need to grant data-access RBAC permissions to the user account that will generate the SAS token.

How to create a user delegation SAS token


The preview capability is available now for everyone, for non-production use. No registration is required.

For developers, here is an example using .NET code to generate a user-delegation SAS token. It also shows our new .NET client libraries for Storage and integrated Azure.Identity libraries.

BlobServiceClient client = new BlobServiceClient(accountUri, new DefaultAzureCredential());

//define permission set to read a blob, valid from now for a specified number of minutes
BlobSasBuilder builder = new BlobSasBuilder()
{
    ContainerName = containerName,
    BlobName = blobName,
    Permissions = "r",
    Resource = "b",
    StartTime = DateTimeOffset.UtcNow,
    ExpiryTime = DateTimeOffset.UtcNow.AddMinutes(tokenLifetime)
};

//refresh user-delegation key as necessary, valid for up to a maximum of 7 days
if (currentUdk == null || currentUdk.SignedExpiry <= builder.ExpiryTime)
{
    currentUdk = client.GetUserDelegationKey(DateTimeOffset.UtcNow, DateTimeOffset.UtcNow.AddDays(7)).Value;
}

//generate a signed SAS token using the user-delegation key
string sasToken = builder.ToSasQueryParameters(currentUdk, accountName).ToString();

Users can also use our command line tool previews to generate user-delegation SAS tokens. Here is an example using Azure CLI to generate a read-only SAS based on the user’s credentials:

$ az login
$ az storage blob generate-sas \
> --account-name myaccount \
> --container-name container \
> --name file.txt \
> --permissions r \
> --expiry 2019-08-30 \
> --auth-mode login \
> --as-user \
> --full-uri
Argument '--as-user' is in preview. It may be changed/removed in a future release.

https://myaccount.blob.core.windows.net/container/file.txt?se=2019-08-30&sp=r&sv=2018-11-09&sr=b&skoid=8c93ed4c-3e11-43f4-9307-3664c9c16554&sktid=9341f370-b982-47de-b7c1-8dbe61328559&skt=2019-08-28T22%3A57%3A38Z&ske=2019-08-30T00%3A00%3A00Z&sks=b&skv=2018-11-09&sig=7trGEakY86Uj5rXsH2ApiyCZfxFgNnh6NFy4wcnmfmQ%3D

User delegation SAS tokens for Azure Blobs allow for issuing pre-authorized URLs from lower privileged identities, and are available in preview in all Azure clouds and locations.

Saturday 14 September 2019

Expanded Azure Maps coverage, preview of Azure Maps feedback site, and more

Azure Maps services continue to expand to support the needs of Microsoft enterprise customers in Azure, and we’ve been busy adding capabilities. Today we’re announcing that Azure Maps is now available in Argentina, India, Morocco, and Pakistan. We have also launched a new Azure Maps data feedback site, now in preview. In addition, we’re introducing several enhancements available via our Representational State Transfer (REST) services and the Azure Maps web and Android SDKs.

Here is a run-down of the new features:

Azure Maps is available in new countries and regions


Azure Maps is now available in Argentina, India, Morocco, and Pakistan. These regions require specific considerations for displaying maps, and Azure Maps now empowers our customers to use the appropriate map views in each of them.

Introducing preview of Azure Maps data feedback site


To serve the freshest map data possible and to give customers an easy way to provide map data feedback, we’re introducing the Azure Maps data feedback site. The new site empowers our customers to provide direct data feedback, especially on business points of interest and residential addresses. The feedback goes directly to our data providers and their map editors, who can quickly evaluate and incorporate it into our mapping products.


REST service enhancements


Point of interest data updates

When requesting point of interest data, you might want to restrict the results to specific brands. For example, you may want to show only gas stations of a specific brand to your end users. To support this, we’ve added the capability to include one or multiple brands in your request to limit the search results.

In addition, Azure Maps now returns hours of operation for points of interest like business listings. We return the opening hours for the next week, starting with the current day in the local time of the point of interest. This information can be used to better optimize your planned routes, and for example, show end users store locations that are open during a specific timeframe.
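A minimal sketch of such a request against the Search POI service is shown below. The `brandSet` and `openingHours=nextSevenDays` query parameters are assumptions to verify against the Search POI API reference, and `{subscription-key}` is a placeholder for your Azure Maps key.

```shell
# Search for gas stations of one brand near a coordinate,
# returning opening hours for the next seven days.
curl "https://atlas.microsoft.com/search/poi/json?api-version=1.0\
&query=gas%20station\
&brandSet=Shell\
&openingHours=nextSevenDays\
&lat=47.61&lon=-122.33\
&subscription-key={subscription-key}"
```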

Sunset and sunrise times

According to a recent report from the Global Alliance for Buildings and Construction, buildings construction and operations account for 36 percent of global final energy use and nearly 40 percent of energy-related carbon dioxide emissions when upstream power generation is considered. To create impact with IoT and help to combat climate change and optimize buildings for energy efficiency, Get Timezone by Coordinates API now returns sunset and sunrise times for a given coordinate location. Developers can automate device messages in their IoT solutions, for example, by building rules to schedule heating and cooling by using sunrise and sunset times combined with telemetry messages from a variety of devices and sensors. 
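A request of this kind can be sketched as follows. The `options=all` parameter and the presence of sunrise/sunset fields in the response's reference-time object are assumptions to verify against the Timezone API reference; `{subscription-key}` is a placeholder.

```shell
# Request time zone details, including sunrise and sunset times,
# for a given coordinate (latitude,longitude).
curl "https://atlas.microsoft.com/timezone/byCoordinates/json?api-version=1.0\
&options=all\
&query=47.61,-122.33\
&subscription-key={subscription-key}"
```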

Cartography and styling updates


Point of interest data rendering

To provide richer and more informative map data content, we’ve pushed up certain point of interest categories so that they appear at lower zoom levels, i.e., earlier as you zoom in. As a result, airport icons are rendered at zoom levels 10 to 22.


Point of interest icons for important tourist attractions like museums, and railway and metro stations are displayed on zoom levels 12 to 22. In addition, universities, colleges, and schools are shown on zoom levels 13 to 22.


State boundaries and abbreviated state names

To improve usability and give more detailed views, state boundaries are pushed up in the data so that they already appear at zoom level 3. Abbreviated state names are also now shown.


Blank map styles in web SDK

Often it is useful to visualize data on top of a blank canvas or to replace the base map with custom tile layers. With this in mind, the Azure Maps web SDK now supports two new map styles: blank and blank_accessible. The blank map style will not render any base map data, nor will it update the screen reader on where the map is centered. The blank_accessible style will continue to provide screen reader updates with location details of where the map is located, even though the base map is not displayed. Please note, you can change the background color of the web SDK by using the CSS background-color style of the map DIV element.

Web SDK enhancements

The Azure Maps team has made many additions and improvements to the web SDK. Below is a closer look at some of the key improvements.

Cluster aggregates

Clustering point data based on zoom level reduces visual clutter on the map and makes the data easier to make sense of. Clusters are often represented using a symbol with the number of points within the cluster; sometimes, however, you may want to further customize the style of clusters based on a metric, like the total revenue of all points within a cluster. With cluster aggregates, custom properties can be created and populated using an aggregate expression.

Aggregating data in clusters

Image templates

The Azure Maps web SDK uses WebGL to render most data on the map. Symbol layers can render points on the map with an image, line layers can have images rendered along them, and polygon layers can be rendered with a fill-pattern image. To ensure good performance, these images need to be loaded into the map image sprite resource before rendering. The web SDK already provides a couple of marker images in a handful of colors; however, there is an infinite number of color combinations that developers may want to use. With this in mind, we have ported the SVG template functionality for HTML markers over to the image sprite and added 42 image templates: 27 symbol icons and 15 polygon fill patterns. You can easily define a primary and secondary color as well as a scale for each template when loading it into the map image sprite. These templates can also be used with HTML markers.

Images can be used with HTML markers and various layers within the Azure Maps web SDK

Additional notable improvements to the web SDK:


◈ Accessibility improvements – The team has spent a lot of time improving accessibility in the web SDK and ensuring that every user is able to use the map. A major part of this consisted of leveraging the vector tiles of the base map so that we can provide highly accurate descriptions of what the map is rendering.

◈ Limit spinning of the globe – By default the map mimics a globe by allowing the user to infinitely scroll the map west or east. When the user is zoomed out, sometimes the map will render additional copies of the globe to fill in the blank space. This is great for most scenarios, but some developers prefer having a single copy of the globe that doesn’t scroll infinitely. Now this can be configured using the new renderWorldCopies map option.

◈ Easily show all map styles in the style picker – Up until now, if you wanted to show all map styles in the style picker control, you had to list them all in an array in the mapStyles option. Now you can simply set this option to "all".

◈ Image overlay georeferencing tools – When georeferencing an image to overlay on the map, sometimes all you have is some reference points (i.e. pixels to positions) which might not be the corners of the image. We added some functions which can be used to correctly georeference the image. We also added tools for reprojecting between pixels and positions relative to the image. For example, if you have an image of a floor plan displayed on the map, you can take any map position and determine its pixel coordinate on the original image and vice versa.

◈ New spatial math functions – Several new spatial math functions have been added. One of the new spatial math functions we added will calculate the closest point to a location that falls on the edge of another geometry object. This has a lot of use cases, such as basic snapping of points to lines or simply knowing how far off the path something is.

◈ Pitch touch support – You can now pitch the map using touch, with two-finger drag up/down.

◈ Popup customizations – Up until now you could only have a popup with a white background and pointer arrow. Now you can set the color of the popup and optionally hide the pointer arrow. Popups can also be made draggable now too!

◈ Shape and Data source events – New events for tracking changes to shapes and data sources.

Tile layers in the Android SDK


The Azure Maps team released an Android SDK into preview earlier this year. It can render point, line, and polygon data, and the team has now added support for rendering tile layers. Tile layers are a great way to visualize large datasets on the map. Not only can a tile layer be generated from an image, but vector data can also be rendered as a tile layer. By rendering vector data as a tile layer, the map control only needs to load the tiles, which can be much smaller in file size than the vector data they represent. This technique is used by many who need to render millions of rows of data on the map.

Rendering tile layers within the Azure Maps Android SDK