Tuesday, 30 July 2019

Choosing between Azure VNet Peering and VNet Gateways

As customers adopt Azure and the cloud, they need fast, private, and secure connectivity across regions and Azure Virtual Networks (VNets). Customer needs vary based on the type of workload. For example, if you want to replicate data across geographies, you need a high-bandwidth, low-latency connection. Azure offers VNet connectivity options that cater to these varying needs: you can connect VNets via VNet peering or VPN gateways.

It is not surprising that VNet is the fundamental building block for any customer network. VNet lets you create your own private space in Azure, or, as I call it, your own network bubble. VNets are crucial to your cloud network as they offer isolation, segmentation, and other key benefits.

VNet peering


VNet peering enables you to seamlessly connect Azure virtual networks. Once peered, the VNets appear as one for connectivity purposes. Traffic between virtual machines in the peered virtual networks is routed through the Microsoft backbone infrastructure, much like traffic between virtual machines in the same VNet, through private IP addresses only. No public internet is involved. You can peer VNets across Azure regions, too, all with a single click in the Azure portal. There are two variants:

◈ VNet peering - connecting VNets within the same Azure region

◈ Global VNet peering - connecting VNets across Azure regions
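
If you manage networks programmatically, a peering can also be created with the Azure SDK rather than the portal. Below is a minimal sketch using the azure-identity and azure-mgmt-network Python packages; the subscription ID, resource group, and VNet names are placeholders, and exact model shapes can vary slightly between SDK versions.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

# Placeholders - substitute your own subscription and resource names.
subscription_id = "<subscription-id>"
resource_group = "my-rg"

credential = DefaultAzureCredential()
network_client = NetworkManagementClient(credential, subscription_id)

# Look up the remote VNet so its resource ID can be referenced in the peering.
remote_vnet = network_client.virtual_networks.get(resource_group, "vnet-b")

# Create (or update) a peering from vnet-a to vnet-b. For global VNet peering the
# two VNets simply live in different regions; the call is the same.
poller = network_client.virtual_network_peerings.begin_create_or_update(
    resource_group,
    "vnet-a",
    "vnet-a-to-vnet-b",
    {
        "remote_virtual_network": {"id": remote_vnet.id},
        "allow_virtual_network_access": True,
        "allow_forwarded_traffic": False,
    },
)
print(poller.result().peering_state)
```

Note that a matching peering must also be created from vnet-b back to vnet-a before traffic flows; peering is configured on both sides.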


VPN gateways


A VPN gateway is a specific type of VNet gateway that is used to send traffic between an Azure virtual network and an on-premises location over the public internet. You can also use a VPN gateway to send traffic between VNets. Each VNet can have only one VPN gateway.
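
If you prefer to script gateway deployment, the rough sketch below creates a route-based VPN gateway with the same Python azure-mgmt-network SDK. The resource names, region, and SKU are placeholders, parameter shapes can differ between SDK and API versions, and gateway provisioning itself typically takes a long time to complete.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

credential = DefaultAzureCredential()
network_client = NetworkManagementClient(credential, "<subscription-id>")
resource_group, location = "my-rg", "westus2"

# A VPN gateway needs a public IP and lives in the VNet's dedicated GatewaySubnet.
public_ip = network_client.public_ip_addresses.begin_create_or_update(
    resource_group,
    "vpngw-pip",
    {"location": location, "public_ip_allocation_method": "Dynamic"},
).result()
gateway_subnet = network_client.subnets.get(resource_group, "vnet-a", "GatewaySubnet")

poller = network_client.virtual_network_gateways.begin_create_or_update(
    resource_group,
    "vnet-a-vpngw",
    {
        "location": location,
        "gateway_type": "Vpn",
        "vpn_type": "RouteBased",
        "sku": {"name": "VpnGw1", "tier": "VpnGw1"},
        "ip_configurations": [
            {
                "name": "gwipconfig",
                "subnet": {"id": gateway_subnet.id},
                "public_ip_address": {"id": public_ip.id},
            }
        ],
    },
)
# poller.result() blocks until provisioning finishes (often 30+ minutes).
```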


Which is best for you?


While we offer two ways to connect VNets, you might want to pick one over the other based on your specific scenario and needs.

VNet Peering provides a low-latency, high-bandwidth connection useful in scenarios such as cross-region data replication and database failover. Since traffic is completely private and remains on the Microsoft backbone, customers with strict data policies prefer VNet Peering because the public internet is not involved. Since there is no gateway in the path, there are no extra hops, ensuring low-latency connections.

VPN Gateways provide a limited-bandwidth connection and are useful in scenarios where encryption is needed but bandwidth restrictions are tolerable. In these scenarios, customers are also not as latency sensitive.

VNet Peering and VPN Gateways can also co-exist via gateway transit


Gateway transit enables you to use a peered VNet’s gateway for connecting to on-premises instead of creating a new gateway for connectivity. As you increase your workloads in Azure, you need to scale your networks across regions and VNets to keep up with the growth. Gateway transit allows you to share an ExpressRoute or VPN gateway with all peered VNets and lets you manage the connectivity in one place. Sharing enables cost-savings and reduction in management overhead.

With gateway transit enabled on VNet peering, you can create a transit VNet that contains your VPN gateway, Network Virtual Appliance, and other shared services. As your organization grows with new applications or business units and as you spin up new VNets, you can connect them to your transit VNet with VNet peering. This prevents adding complexity to your network and reduces the overhead of managing multiple gateways and other appliances.
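
Gateway transit itself boils down to a pair of flags on the peerings. Continuing the Python SDK sketch from earlier (the hub and spoke VNet names are hypothetical), the hub side offers its gateway and the spoke side opts in to using it:

```python
# Assumes network_client and resource_group are set up as in the earlier sketch.
hub_vnet = network_client.virtual_networks.get(resource_group, "hub-vnet")
spoke_vnet = network_client.virtual_networks.get(resource_group, "spoke-vnet")

# Hub side: allow peered (spoke) VNets to use the hub's VPN/ExpressRoute gateway.
network_client.virtual_network_peerings.begin_create_or_update(
    resource_group,
    "hub-vnet",
    "hub-to-spoke",
    {
        "remote_virtual_network": {"id": spoke_vnet.id},
        "allow_virtual_network_access": True,
        "allow_gateway_transit": True,  # share the hub gateway
    },
).result()

# Spoke side: use the remote (hub) gateway instead of deploying its own.
network_client.virtual_network_peerings.begin_create_or_update(
    resource_group,
    "spoke-vnet",
    "spoke-to-hub",
    {
        "remote_virtual_network": {"id": hub_vnet.id},
        "allow_virtual_network_access": True,
        "use_remote_gateways": True,  # requires allow_gateway_transit on the hub side
    },
).result()
```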


Differences between VNet Peering and VPN Gateways


Cross-region support?
◈ VNet Peering: Yes, via Global VNet Peering.
◈ VPN Gateways: Yes.

Cross-Azure Active Directory tenant support?
◈ VNet Peering: Yes.
◈ VPN Gateways: Yes.

Cross-subscription support?
◈ VNet Peering: Yes.
◈ VPN Gateways: Yes.

Cross-deployment model support?
◈ VNet Peering: Yes.
◈ VPN Gateways: Yes.

Limits
◈ VNet Peering: You can peer up to 500 VNets with one VNet, as seen in the documentation on networking limits.
◈ VPN Gateways: Each VNet can have only one VPN gateway, and the number of tunnels supported depends on the gateway SKU.

Pricing
◈ VNet Peering: Ingress and egress are charged.
◈ VPN Gateways: The gateway and egress are charged.

Encrypted?
◈ VNet Peering: Software-level encryption is recommended.
◈ VPN Gateways: Yes, a custom IPsec/IKE policy can be created and applied to new or existing connections.

Bandwidth limitations?
◈ VNet Peering: No bandwidth limitations.
◈ VPN Gateways: Varies by gateway SKU, from 100 Mbps to 1.25 Gbps.

Private?
◈ VNet Peering: Yes. There are no public IP endpoints; traffic is routed through the Microsoft backbone and is completely private, with no public internet involved.
◈ VPN Gateways: A public IP is involved.

Transitive relationship
◈ VNet Peering: If VNet A is peered to VNet B, and VNet B is peered to VNet C, VNet A and VNet C cannot currently communicate. Spoke-to-spoke communication can be achieved via NVAs or gateways in the hub VNet.
◈ VPN Gateways: If VNet A, VNet B, and VNet C are connected via VPN gateways and BGP is enabled on the VNet connections, transitivity works.

Typical customer scenarios
◈ VNet Peering: Data replication, database failover, and other scenarios needing frequent backups of large data.
◈ VPN Gateways: Encryption-specific scenarios that are not latency sensitive and do not need high throughput.

Initial setup time
◈ VNet Peering: It took me 24.38 seconds, but you should give it a shot!
◈ VPN Gateways: About 30 minutes to set up.

Sunday, 28 July 2019

Accessing virtual machines behind Azure Firewall with Azure Bastion

Azure Virtual Network enables a flexible foundation for building advanced networking architectures. Managing heterogeneous environments with various types of filtering components, such as Azure Firewall or your favorite network virtual appliance (NVA), requires a little bit of planning.

Azure Bastion, which is currently in preview, is a fully managed platform as a service (PaaS) that provides secure and seamless remote desktop protocol (RDP) and secure shell (SSH) access to your virtual machines (VMs) directly through the Azure portal. Azure Bastion is provisioned directly in your virtual network, supporting all VMs attached without any exposure through public IP addresses.

When you deploy Azure Firewall, or any NVA, you invariably force tunnel all traffic from your subnets. Applying a 0.0.0.0/0 user-defined route can lead to asymmetric routing for ingress and egress traffic to your workloads in your virtual network.

To resolve this, you often find yourself creating and managing a growing set of network rules, including DNAT, forwarding, and so on, for all your applications, which is not trivial. Although this can impact all your applications, RDP and SSH are the most common examples. In this scenario, ingress traffic from the internet may come directly to your virtual machine within your virtual network, but egress traffic ends up going to the NVA. Since most NVAs are stateful, the NVA drops this return traffic because it never saw the original inbound flow.

Azure Bastion allows for simplified setup of RDP/SSH to your workloads within virtual networks containing stateful NVAs or Azure Firewall with force tunneling enabled. In this blog, we will look at how to make that work seamlessly.

Having deployed both Azure Bastion and Azure Firewall in your virtual network, let us look at how you can configure Azure Bastion to work in this scenario.


Configuring Azure Bastion


When deploying Azure Firewall or a virtual appliance, you may end up associating the route table created during that deployment with all subnets in your virtual network, possibly including the AzureBastionSubnet subnet.

This applies a user-defined route to the AzureBastionSubnet subnet that directs all Azure Bastion traffic to Azure Firewall, thereby blocking traffic required for Azure Bastion. Avoiding this is easy: simply do not associate the route table with the AzureBastionSubnet subnet.
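
To make that concrete, here is a rough sketch (again with the Python azure-mgmt-network SDK and the hypothetical names myRouteTable, myVNet, and Workload-SN) in which a 0.0.0.0/0 route to the firewall is associated with the workload subnet only, while AzureBastionSubnet is deliberately left without the route table:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

credential = DefaultAzureCredential()
network_client = NetworkManagementClient(credential, "<subscription-id>")
resource_group = "my-rg"

# Default route that force-tunnels all traffic to the Azure Firewall private IP.
route_table = network_client.route_tables.begin_create_or_update(
    resource_group,
    "myRouteTable",
    {
        "location": "westus2",
        "routes": [
            {
                "name": "default-to-firewall",
                "address_prefix": "0.0.0.0/0",
                "next_hop_type": "VirtualAppliance",
                "next_hop_ip_address": "10.0.1.4",  # example firewall private IP
            }
        ],
    },
).result()

# Associate the route table with the workload subnet only.
workload_subnet = network_client.subnets.get(resource_group, "myVNet", "Workload-SN")
workload_subnet.route_table = route_table
network_client.subnets.begin_create_or_update(
    resource_group, "myVNet", "Workload-SN", workload_subnet
).result()

# No association is made for AzureBastionSubnet, so Bastion traffic is not
# redirected to the firewall.
```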


In this configuration, myRouteTable is not associated with AzureBastionSubnet, but with other subnets such as Workload-SN.

The AzureBastionSubnet subnet is a secure, platform-managed subnet; no Azure resource other than Azure Bastion can be deployed into it. All connections to Azure Bastion are enforced through Azure Active Directory token-based authentication with 2FA, and all traffic is encrypted over HTTPS.

Azure Bastion is internally hardened and allows traffic only through port 443, saving you the task of applying additional network security groups (NSGs) or user-defined routes to the subnet.

With this configuration, RDP/SSH requests land on Azure Bastion. Because the route table is not associated with AzureBastionSubnet, the default route (0.0.0.0/0) does not apply to it. Based on the incoming RDP/SSH requests, Azure Bastion connects to your virtual machines in other subnets, like Workload-SN, which do have the default route associated. The return traffic from your virtual machines goes directly to Azure Bastion instead of to the NVA, because it is destined for a specific private IP address within your virtual network. That more specific route takes precedence over the force-tunnel route to the NVA, so your RDP/SSH traffic works seamlessly with Azure Bastion even when an NVA or Azure Firewall is deployed in your virtual network.

Saturday, 27 July 2019

Silo busting 2.0—Multi-protocol access for Azure Data Lake Storage

Cloud data lakes solve a foundational problem for big data analytics—providing secure, scalable storage for data that traditionally lives in separate data silos. Data lakes were designed from the start to break down data barriers and jump start big data analytics efforts. However, a final “silo busting” frontier remained, enabling multiple data access methods for all data—structured, semi-structured, and unstructured—that lives in the data lake.

Providing multiple data access points to shared data sets allows tools and data applications to interact with the data in their most natural way. Additionally, this allows your data lake to benefit from the tools and frameworks built for a wide variety of ecosystems. For example, you may ingest your data via an object storage API, process the data using the Hadoop Distributed File System (HDFS) API, and then ingest the transformed data using an object storage API into a data warehouse.

Single storage solution for every scenario


We are very excited to announce the preview of multi-protocol access for Azure Data Lake Storage! Azure Data Lake Storage is a unique cloud storage solution for analytics that offers multi-protocol access to the same data. Multi-protocol access to the same data, via Azure Blob storage API and Azure Data Lake Storage API, allows you to leverage existing object storage capabilities on Data Lake Storage accounts, which are hierarchical namespace-enabled storage accounts built on top of Blob storage. This gives you the flexibility to put all your different types of data in your cloud data lake knowing that you can make the best use of your data as your use case evolves.
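
To illustrate what multi-protocol access means in practice, the sketch below (using the azure-storage-file-datalake and azure-storage-blob Python packages, with a placeholder account name and container) writes a file through the Data Lake Storage API and reads the very same object back through the Blob storage API:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient
from azure.storage.blob import BlobServiceClient

account = "<storage-account-name>"  # hierarchical namespace-enabled account
credential = DefaultAzureCredential()

# Write via the Data Lake Storage (DFS) endpoint - filesystem/directory/file semantics.
dfs = DataLakeServiceClient(f"https://{account}.dfs.core.windows.net", credential=credential)
file_client = dfs.get_file_system_client("raw").get_file_client("sales/2019/07/orders.csv")
file_client.upload_data(b"id,amount\n1,42\n", overwrite=True)

# Read the same data back via the Blob endpoint - container/blob semantics.
blobs = BlobServiceClient(f"https://{account}.blob.core.windows.net", credential=credential)
data = blobs.get_blob_client("raw", "sales/2019/07/orders.csv").download_blob().readall()
print(data.decode())
```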

Figure: Single storage solution

Expanded feature set, ecosystem, and applications


Existing blob features such as access tiers and lifecycle management policies are now unlocked for your Data Lake Storage accounts. This is paradigm-shifting because your blob data can now be used for analytics. Additionally, services such as Azure Stream Analytics, IoT Hub, Azure Event Hubs capture, Azure Data Box, Azure Search, and many others integrate seamlessly with Data Lake Storage. Important scenarios like on-premises migration to the cloud can now easily move PB-sized datasets to Data Lake Storage using Data Box.

Multi-protocol access for Data Lake Storage also enables the partner ecosystem to use their existing Blob storage connector with Data Lake Storage.  Here is what our ecosystem partners are saying:

“Multi-protocol access for Azure Data Lake Storage is a game changer for our customers. Informatica is committed to Azure Data Lake Storage native support, and Multi-protocol access will help customers accelerate their analytics and data lake modernization initiatives with a minimum of disruption.”

You will not need to update existing applications to gain access to your data stored in Data Lake Storage. Furthermore, you can leverage the power of both your analytics and object storage applications to use your data most effectively.

Figure: Multi-protocol access enables features and ecosystem

Multiple API endpoints—Same data, shared features


This capability is unprecedented for cloud analytics services because it supports not only multiple protocols but also multiple storage paradigms. We now bring this powerful capability to your storage in the cloud. Existing tools and applications that use the Blob storage API gain these benefits without any modification. Directory- and file-level access control lists (ACLs) are consistently enforced regardless of whether an Azure Data Lake Storage API or Blob storage API is used to access the data.

Figure: Multi-protocol access on Azure Data Lake Storage

Features and expanded ecosystem now available on Data Lake Storage


Multi-protocol access for Data Lake Storage brings together the best features of Data Lake Storage and Blob storage into one holistic package. It enables many Blob storage features and ecosystem support for your data lake storage.

◈ Access tiers: Cool and Archive tiers are now available for Data Lake Storage.
◈ Lifecycle management policies: You can now set policies to tier or delete data in Data Lake Storage.
◈ Diagnostics logs: Logs for the Blob storage API and Azure Data Lake Storage API are now available in v1.0 and v2.0 formats.
◈ SDKs: Existing blob SDKs can now be used with Data Lake Storage.
◈ PowerShell: PowerShell for data plane operations is now available for Data Lake Storage.
◈ CLI: Azure CLI for data plane operations is now available for Data Lake Storage.
◈ Notifications via Azure Event Grid: You can now get Blob notifications through Event Grid.

◈ Azure Stream Analytics: Azure Stream Analytics now writes to, as well as reads from, Data Lake Storage.
◈ Azure Event Hubs capture: The capture feature within Azure Event Hubs now lets you pick Data Lake Storage as one of its destinations.
◈ IoT Hub: IoT Hub message routing now allows routing to Azure Data Lake Storage Gen 2.
◈ Azure Search: You can now index and apply machine learning models to your Data Lake Storage content using Azure Search.
◈ Azure Data Box: You can now ingest huge amounts of data from on-premises to Data Lake Storage using Data Box.

Thursday, 25 July 2019

Azure solutions for financial services regulatory boundaries

Microsoft Azure is rapidly becoming the public cloud of choice for large financial services enterprises. Some of the biggest reasons Global Financial Services Institutions (GFIs) are choosing Azure to augment or replace on-premises application environments are:

◈ The high level of security that the Azure cloud provides.

◈ The exceptional control enterprises can have over compliance and security within their subscriptions.

◈ The many features that Azure has for data governance and protection.

◈ The long list of Global Regulatory Standards that the Azure cloud is compliant with.


Requirements for globally regulated Azure solutions


Azure is built to allow enterprises to control the flow of data between regions, and to control who has access to and can manage that data. Before we begin talking about solutions we need to define the requirements.

Examples of global regulation


Many governments and coalitions have developed laws and regulations for how data is stored, where it can be stored, and how it must be managed. Some examples of the more stringent and well-known of these are:

◈ European Union (EU)

General Data Protection Regulation (GDPR) is a legal framework that sets guidelines for the collection and processing of personal information from individuals who live in the EU.

◈ Germany

Federal Data Protection Act is a law that deals with the conditions for processing employee data, and restrictions on the rights enjoyed by data subjects.

Data Localization and Management Law is a law that states that data collected about German citizens must be properly protected and encrypted, stored only on physical devices within Germany’s political boundaries, as well as managed only by German citizens.

◈ China

Cyber Security Law (CSL) is a set of laws concerned with data localization, infrastructure, and management.

◈ Canada

The Canadian Personal Information Protection and Electronic Documents Act (PIPEDA) protects consumer data across Canada against misuse and disclosure.

Architecture and design requirements


Beyond the above-mentioned regulatory requirements, there are technical requirements specific to these scenarios. Cloud application and infrastructure architects are presented with the opportunity to develop solutions that provide business function without violating international laws and regulations. The following are some of the requirements that need to be considered.

Globalization


A globalized business model provides access to multiple financial markets on a continuous basis each day. These markets differ in operations, language, culture, and of course regulation. Despite these differences, the services placed in the cloud need to be architected to be consistent across these markets to ensure manageability and customer experience.

Services and data management


Germany and China are prime examples of countries that only allow their citizens to manage data and the infrastructure on which that data resides.

Data localization


Many countries require at least some of the data sovereign to their country to remain physically within their borders. Regulated data cannot be transferred out of the country and data that does not meet regulatory requirements cannot be transferred into the country.

Reliability


Due to many of the above requirements, it becomes slightly more complicated to design for high availability, data replication, and disaster recovery. For example, data must be replicated only to a location consistent with the country's or region's standards and laws. Likewise, if a disaster recovery (DR) scenario is triggered, the applications running in the DR site must not cross legal or regulatory boundaries to access information.

Authentication


Proper authentication to support role- and identity-based access controls must be in place to ensure that only intended and legally authorized individuals can access resources.

The Azure solution



Security components


Azure Active Directory (AAD)

Azure Active Directory (AAD) is the cloud-based version of Active Directory, so it takes advantage of the flexibility, scalability, and performance of the cloud while retaining the AD functionality that customers have grown used to. One of those functions is the ability to create sub-domains that can be managed and contain only those identities relevant to that country or region. AAD also provides functionality to differentiate between business-to-business relationships (B2B) and business-to-customer relationships (B2C). This differentiation can help clarify between customer access to their own data and management access.  

Azure Sentinel

Azure Sentinel is a scalable, cloud-native, security information event management (SIEM), and security orchestration automated response (SOAR) solution. Azure Sentinel delivers intelligent security analytics and threat intelligence across the enterprise, providing a single solution for alert detection, threat visibility, proactive hunting, and threat response.

Azure Key Vault 

Azure Key Vault helps safeguard cryptographic keys and secrets that cloud applications and services use. Key Vault streamlines the key management process and enables you to maintain control of keys that access and encrypt your data. Developers can create keys for development and testing in minutes, and then migrate them to production keys. Security administrators can grant (and revoke) permission to keys, as needed.
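
As a small illustration of what this looks like from application code, the sketch below uses the azure-identity and azure-keyvault-secrets Python packages to fetch a connection string from a vault; the vault URL and secret name are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# The application authenticates with its own Azure AD identity (for example a
# managed identity), so no credentials are embedded in code or configuration.
credential = DefaultAzureCredential()
client = SecretClient(vault_url="https://my-vault.vault.azure.net", credential=credential)

# Read a secret; access can be granted and revoked by security administrators.
secret = client.get_secret("sql-connection-string")
print(f"Retrieved secret '{secret.name}' (value intentionally not printed)")
```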

Role based access control

Access management for cloud resources is a critical function for any organization that is using the cloud. Role based access control (RBAC) helps you manage who has access to Azure resources, what they can do with those resources, and what areas they have access to. RBAC is an authorization system built on Azure Resource Manager that provides fine-grained access management of Azure resources.

Azure Security Center

Azure Security Center is a unified infrastructure security management system that strengthens the security posture of your datacenters. It also provides advanced threat protection across your hybrid workloads in the cloud, whether they're in Azure or not, as well as on premises.

Governance components


Azure Blueprints

Azure Blueprints helps you deploy and update cloud environments in a repeatable manner using composable artifacts such as Azure Resource Manager templates to provision resources, role-based access controls, and policies. Blueprints can be used to deploy certain policies or controls for a given location or geographic region. Sample blueprints can be found in our GitHub repository.

Azure Policy

Azure Policy is a service in Azure that you use to create, assign, and manage policies. These policies enforce different rules and effects over your resources, so those resources stay compliant with your corporate standards and service level agreements. For example, a policy can be set to allow only certain roles to access a group of resources. Another example is setting a policy that only certain sized resources are allowed in a given resource group. If a new resource is added to the group, the policy automatically applies to that entity. Sample Azure Policy configurations can be found in our GitHub repository.
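
For the data localization scenarios described earlier, a common policy restricts where resources can be created. The fragment below sketches such a policy rule as a Python dictionary following the documented policy definition schema; the parameter name and allowed-locations list are illustrative placeholders rather than a verbatim built-in definition.

```python
# Policy rule: deny creation of any resource outside an approved list of regions.
# "listOfAllowedLocations" is a policy parameter supplied at assignment time,
# for example ["germanywestcentral", "germanynorth"] for a Germany-only workload.
allowed_locations_policy_rule = {
    "if": {
        "not": {
            "field": "location",
            "in": "[parameters('listOfAllowedLocations')]",
        }
    },
    "then": {"effect": "deny"},
}
```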

Azure Virtual Datacenter Program (VDC)

The Azure Virtual Datacenter Program (VDC) is a collection of methods and archetypes designed to help enterprises standardize deployments and controls across application and workload environments. VDC utilizes multiple other Azure products, including Azure Policy and Azure Blueprints. VDC samples can be found in our GitHub repository.

Infrastructure components


Azure Site Recovery (ASR)

Azure Site Recovery (ASR) provides data replication and disaster recovery services between Azure regions, or between on-premises environments and Azure. ASR can be easily configured to replicate and fail over between Azure regions within or outside a country or geographic region.

High availability

Virtual machine (Infrastructure-as-a-Service, or IaaS) high availability can be achieved in multiple ways within the Azure cloud. Azure provides two native methods of failover:

◈ An Azure Availability Set (AS) is a group of virtual machines that are deployed across fault domains and update domains within the same Azure Datacenter. Availability sets make sure that your application is not affected by single points of failure, like the network switch or the power unit of a rack of servers. Azure Availability Sets provide a service level agreement (SLA) of 99.95%.

◈ An Availability Zone (AZ) is like an availability set in that the virtual machines are deployed across fault and update domains. The difference is that AZs provide a higher level of availability (SLA of 99.99%) by spreading the VMs across multiple Azure datacenters within the same region.

For Platform-as-a-Service (PaaS) offerings, high availability is built into the service and does not need to be configured by the customer as it does for the IaaS services above.

Data at rest encryption

Data at rest encryption is a common security requirement. In Azure, organizations can encrypt data at rest without the risk or cost of a custom key management solution. Organizations have the option of letting Azure completely manage encryption at rest. Additionally, organizations have various options to closely manage encryption or encryption keys.

Tuesday, 23 July 2019

Digital transformation with legacy systems simplified

Intelligent insurance means improving operations, enabling revenue growth, and creating engaging experiences—which is the result of digital transformation. The cloud has arrived with an array of technical capabilities that can equip an existing business to move into the future. However, insurance carriers face a harder road to transform business processes and IT infrastructures. Traditional policy and claim management solutions lack both cloud-era agility and the modularity required to react quickly to market forces. And legacy systems cannot be decommissioned unless new systems are fully operational and tested, meaning some overlap between old and new.

The Azure platform offers a wealth of services for partners to enhance, extend, and build industry solutions. Here we describe how one Microsoft partner uses Azure to solve a unique problem.

The need for efficient automation


The prevailing approach to upgrading enterprise software is to engage in large-scale IT projects that may take years and significant cost to execute. Delaying may only increase the costs, especially with the burden of continuing (and increasing) compliance. But more importantly, delay results in a significant opportunity cost. Due to competition, insurers are under pressure to pursue lower costs overall, and especially in claim handling. New insurance technology also drives the need for new distribution models and for automating internal workflows and supply chains.

A platform built for transformation


The name of Codafication’s solution is Unity (not to be confused with the Unity game engine platform). Codafication calls Unity an ecosystem Platform-as-a-Service (ePaaS). It enables insurance carriers to accelerate their digital transformation through secure, bi-directional data integration with core and legacy systems. At the same time, the platform enables Codafication’s subscribers to use new cloud-native apps and services. The increase in connectivity means customers, staff and supply chains can integrate more easily and with greater efficiencies.

Unity seeks to address the changing expectations of insured customers without disruption to core policy and claim management functions within the enterprise. Codafication stresses a modular approach to implementing Unity. Their website provides a catalog of components such as project management, supply chain and resource planning, and financial control (and more).

Potential inputs for the system include a wide variety of processes, from legacy core systems (expected) to robotic processes (a surprise). The output is equally versatile—dashboards and portals along with data lake and IoT workflow apps.


Insurers can take an iterative and modular approach to solving high value challenges rapidly. Unity provides all the tools required to accelerate digital transformation. Other noteworthy features include:

◈ Custom extensions: use any programming language supported by Docker, in combination with Unity SDKs, to build custom frontend and backend solutions.

◈ Off-the-shelf apps: plug in applications and services (from Codafication) designed for the insurance industry.

◈ Scalability: cloud-native technology, underpinned by Kubernetes, can be hosted in the cloud or in a multi-cloud scenario, with a mix of Docker, serverless and on-premises options.

◈ GraphQL API: leverage the power of a graph database to unlock data silos and find relationships between data stores from legacy systems. Integrate with cloud vendors, AI services and best-in-breed services through a single, secure, scalable and dynamic API.

◈ Integrative technologies: create powerful custom IoT workflows with logic hooks, web hooks and real-time data subscriptions.

Benefits


◈ Through Unity, organizations can interconnect everything and relate data on the fly. Developers can leverage legacy core systems, middleware, and robotics using a microservice architecture driven by a powerful service mesh and extensible framework.

◈ Teams can leverage this infrastructure to deliver (in parallel) solutions into the platform and into the hands of their users. Insurance carriers will find new use cases (like data science and AI) and develop apps rapidly, delivering projects faster at lower cost and risk.

◈ Projects can be secured and reused across the infrastructure. This accelerates digital transformation projects without disrupting existing architecture and is the primary step to implementing modern cloud native technologies, such as AI and IoT.

◈ The ‘modernize now, decommission later’ approach to core legacy systems lets an insurer compete and remain relevant against competitors while providing a longer runway for decommissioning aging legacy systems.

Saturday, 20 July 2019

New ways to train custom language models – effortlessly!

Video Indexer (VI), the AI service for Azure Media Services, enables the customization of language models by allowing customers to upload examples of sentences or words belonging to the vocabulary of their specific use case. Since speech recognition can sometimes be tricky, VI enables you to train and adapt the models for your specific domain. Harnessing this capability allows organizations to improve the accuracy of the Video Indexer generated transcriptions in their accounts.

Over the past few months, we have worked on a series of enhancements to make this customization process even more effective and easy to accomplish. Enhancements include automatically capturing any transcript edits done manually or via API as well as allowing customers to add closed caption files to further train their custom language models.

The idea behind these additions is to create a feedback loop where organizations begin with a base out-of-the-box language model and improve its accuracy gradually through manual edits and other resources over a period of time, resulting in a model that is fine-tuned to their needs with minimal effort.

Accounts’ custom language models and all the enhancements this blog shares are private and are not shared between accounts.

In the following sections I will drill down on the different ways that this can be done.

Improving your custom language model using transcript updates


Once a video is indexed in VI, customers can use the Video Indexer portal to introduce manual edits and fixes to the automatic transcription of the video. This can be done by clicking the Edit button at the top right corner of the Timeline pane of a video to move to edit mode, and then simply updating the text.


The changes are reflected in the transcript, captured in a text file named From transcript edits, and automatically inserted into the language model used to index the video. If you were not already using a custom language model, the updates will be added to a new Account Adaptations language model created in the account.

You can manage the language models in your account and see the From transcript edits files by going to the Language tab in the content model customization page of the VI website.

Once one of the From transcript edits files is opened, you can review the old and new sentences created by the manual updates, and the differences between them.


All that is left to do is click Train to update the language model with the latest changes. From that point on, these changes will be reflected in all future videos indexed using that model. Of course, you do not have to use the portal to train the model; the same can be done via the Video Indexer train language model API. Using the API opens new possibilities, such as automating a recurring training process to leverage ongoing updates.
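
For instance, a recurring training job can be as small as the sketch below, which calls the train language model operation with the Python requests package. The endpoint path follows the public Video Indexer API conventions, but the location, account ID, model ID, and access token are placeholders you would take from your own account and the API reference.

```python
import requests

# Placeholders taken from your Video Indexer account and the API developer portal.
location = "trial"
account_id = "<account-id>"
model_id = "<language-model-id>"
access_token = "<account-access-token>"  # from the Get Account Access Token API

# Kick off training so the latest "From transcript edits" additions are picked up.
response = requests.put(
    f"https://api.videoindexer.ai/{location}/Accounts/{account_id}"
    f"/Customization/Language/{model_id}/Train",
    params={"accessToken": access_token},
)
response.raise_for_status()
print("Training started:", response.status_code)
```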


There is also an update video transcript API that allows customers to update the entire transcript of a video in their account by uploading a VTT file that includes the updates. As a part of the new enhancements, when a customer uses this API, Video Indexer also automatically adds the uploaded transcript to the relevant custom language model in order to leverage the content as training material. For example, calling update video transcript for a video titled "Godfather" will result in a new transcript file named "Godfather" in the custom language model that was used to index that video.

Improving your custom language model using closed caption files

Another quick and effective way to train your custom language model is to leverage existing closed caption files as training material. This can be done manually, by uploading a new closed caption file to an existing model in the portal, or by using the create language model and update language model APIs to upload VTT, SRT, or TTML files (similar to what was done until now with TXT files).


Once uploaded, VI cleans up all the metadata in the file and strips it down to the text itself. You can see the before and after results in the following examples.

◈ VTT
Before:
NOTE Confidence: 0.891635
00:00:02.620 --> 00:00:05.080
but you don't like meetings before 10 AM.
After:
but you don't like meetings before 10 AM.

◈ SRT
Before:
2
00:00:02,620 --> 00:00:05,080
but you don't like meetings before 10 AM.
After:
but you don't like meetings before 10 AM.

◈ TTML
Before:
<!-- Confidence: 0.891635 -->
<p begin="00:00:02.620" end="00:00:05.080">but you don't like meetings before 10 AM.</p>
After:
but you don't like meetings before 10 AM.

From that point on, all that is left to do is review the additions to the model and click Train or use the train language model API to update the model.

Friday, 19 July 2019

Assess the readiness of SQL Server data estates migrating to Azure SQL Database

Migrating hundreds of SQL Server instances and thousands of databases to Azure SQL Database, our Platform as a Service (PaaS) offering, is a considerable task, and to streamline the process as much as possible, you need to feel confident about your relative readiness for migration. Being able to identify low-hanging fruit, including the servers and databases that are fully ready or that require minimal effort to prepare for migration, eases and accelerates your efforts. We are pleased to share that Azure database target readiness recommendations have been enabled.


The Azure Migrate hub provides a unified view of all your migrations across the servers, applications, and databases. This integration provides customers with a seamless migration experience beginning during the discovery phase. The functionality allows customers to use assessment tools for visibility into the applications currently run on-premises so that they can determine cloud suitability and project the cost of running their applications in the cloud. It also allows customers to compare options between competing public and hybrid cloud options.

Assessing and viewing results


Assessing the overall readiness of your data estate for a migration to Azure SQL Database requires only a few steps:

1. Provision an instance of Azure Migrate, create a migration project, and then add Data Migration Assistant to the migration solution to perform the assessment.

2. After you create the migration project, download Data Migration Assistant and run an assessment against one or more SQL Server instances.

3. Upload the Data Migration Assistant assessment results to the Azure Migrate hub.

In a few minutes, the Azure SQL Database target readiness results will be available in your Azure Migrate project.

You can use a single assessment for as many SQL Server instances as you want, or you can run multiple parallel assessments and upload them to the Azure Migrate hub. The Azure Migrate hub consolidates all the assessments and provides a summarized view of SQL Server and database readiness.

The Azure Migrate dashboard provides a view of your data estate and its overall readiness for migration. This includes the number of databases that are ready to migrate to Azure SQL Database and to SQL Server hosted on an Azure virtual machine. Readiness is computed based on feature parity and schema compatibility with various Azure SQL Database offerings. The dashboard also provides insight into overall migration blockers and the all-up effort involved with migrating to Azure.

IT pros and database administrators can drill down further into a specific set of SQL Server instances and databases to better understand their readiness for migration.


The “Assessed database” view provides an overview of individual databases, showing info like migration blockers and readiness for Azure SQL Database and SQL Servers hosted on an Azure virtual machine.


Thursday, 18 July 2019

Enhancing the customer experience with the Azure Networking MSP partner program

We are always looking for ways to improve the customer experience and allow our partners to complement our offerings. In support of these efforts, we are sharing the Azure Networking Managed Service Provider (MSP) program along with partners that deliver value-added managed cloud network services to help enterprise customers connect, operationalize, and scale their mission-critical applications running in Azure.

The Azure Networking MSP partner program enables partners such as networking-focused MSPs, network carriers, and systems integrators (SIs) to use their rich networking experience to offer cloud and hybrid networking services around Azure's growing portfolio of networking products and services.

Azure Networking services are fundamental building blocks critical to cloud migration, optimal connectivity, and the security of applications. New networking services such as Virtual WAN, ExpressRoute, Azure Firewall, and Azure Front Door further enrich this portfolio, allowing customers to deploy richer applications in the cloud. Networking MSP partners can help customers deploy and manage these services.

Azure Networking MSPs


Azure MSPs play a critical role in enterprise cloud transformation by bringing their deep knowledge and real-world experience to help enterprise customers migrate to Azure. Azure MSPs and the Azure Expert MSP program make it easy for customers to discover and engage specialized MSPs.

Azure Networking MSPs are a specialized set of MSPs for addressing enterprise cloud networking needs and challenges across all aspects of cloud and hybrid networking. Their managed network services and offerings include various aspects of the application lifecycle including network architecture, planning, deployment, operations, maintenance, and optimization.

Azure Lighthouse - unblocking Azure Networking MSPs


Many enterprise customers, such as banks and financial institutions, want partners who can help them manage their Azure Networking subscriptions. However, the need to manage each customer's subscriptions individually introduces a lot of manual work for these service providers.

Last week, we announced Azure Lighthouse, which is a unique set of capabilities on Azure, empowering service provider partners with a single control plane to view and manage Azure at scale across all their customers with higher automation and efficiency. We also talked about how Azure Lighthouse enables management at scale for service providers.

With Azure Lighthouse, Azure Networking MSPs can seamlessly onboard customers via managed services offers on the Azure marketplace or natively via ARM templates – empowering them to deliver a rich set of managed network experiences for their end-customers.


Azure Networking MSP partners


Azure Networking partners play a big role in the Azure networking ecosystem, delivering Virtual WAN CPEs and hybrid networking services such as ExpressRoute to enterprises that are building cloud infrastructures. We welcome our Azure Networking MSP launch partners into the Azure Networking MSP partner ecosystem.



These partners have invested in people, best practices, operations and tools to build and harness deep Azure Networking knowledge and service capabilities. They’ve trained their staff on Azure and have partnered closely with us in Azure Networking through technical workshops and design reviews.

These partners are also early adopters of Azure Lighthouse, building and delivering a new generation of managed network experiences for their end customers. We encourage all worldwide networking MSPs, network carriers, and SIs that would like to join this program to reach out via ManagedVirtualWAN@microsoft.com to join the Azure Networking MSP program and bring your unique value and services to Azure customers.

Tuesday, 16 July 2019

Digital distribution centers—The future is here

The pace of change has never been as fast as it is now. Globally, the population is becoming more urban and income levels are rising. By 2050, nearly 70 percent of the global population will live in cities or urban areas—that’s six billion people. Consumer behavior has also materially changed over the last decade, and omnichannel retail, personalization, and demand for same day deliveries are growing. To cater to the changing landscape, urban distribution centers that stage products closer to users within large cities are on the rise to enable faster delivery and greater customization.

Within the four walls of the distribution center, picking and packing tasks account for more than 50 percent of the total labor cost of warehousing operations. Access to labor has become increasingly challenging, particularly in urban centers, and staffing levels shoot up to five to ten times normal levels during the holiday season. Space constraints and difficulty in staffing are pushing companies to look at adopting distribution center technologies that cut labor costs, optimize the flow of products, and improve productivity and utilization of these centers.

Since announcing Microsoft's $5B commitment to developing an industry-leading Internet of Things (IoT) platform last year, we've continued to work with our ecosystem partners to build solutions to address such problems. In the "Our IoT Vision and Roadmap" session at Microsoft Build, we announced a partnership with Lenovo and NVIDIA to bring advanced artificial intelligence (AI) to Azure IoT Edge. The demonstrated solution showed Lenovo hardware, a single SE350 Edge Server, running the Azure IoT Edge runtime with NVIDIA DeepStream to process multiple channels of 1080P/30FPS H265 video streams in real time, transforming cameras into smart sensors that understand their physical environments and use vision algorithms to find missing products on a shelf or detect damaged goods. Such applications of Azure IoT Edge technology enable customers to quickly and cost-effectively deploy retail solutions that optimize their logistics operations.

Today, we are excited to announce the next milestone on this journey, the preview of Lenovo's Digital Distribution Center (DDC) solution. Lenovo's DDC is an IoT solution developed in collaboration with NVIDIA and Microsoft. Through real-time, scalable package detection, tracking, and validation, DDC delivers better optimization and increased utilization of distribution centers for retail, manufacturing, and logistics operations. The solution uses multi-video-stream analytics with artificial intelligence and machine learning inferencing to self-learn, optimize, and scale. Additional releases will include geofencing alerts, palletization, depalletization, and last-mile sorting.



DDC is built with Azure IoT Central, Microsoft’s fully managed IoT app platform that makes it easy to connect, monitor, and manage your IoT devices and products. Azure IoT Central simplifies the initial setup of your IoT solution and reduces the management burden, operational costs, and overhead of a typical IoT project. This allows solution builders to apply their energy and unique domain expertise to solving customer needs and creating business value, rather than needing to tackle the operating, managing, securing, and scaling of a global IoT solution. Partners like Lenovo and NVIDIA add unique value through schemas that are relevant to industry solutions like DDC, including common industry hierarchies that organize people, places, and environments.

Saturday, 13 July 2019

Analyze AI enriched content with Azure Search’s knowledge store

Through integration with Cognitive Services APIs, Azure Search has long had the ability to extract text and structure from images and unstructured content. Until recently, this capability was used exclusively in full text search scenarios, exemplified in demos like the JFK files which analyzes diverse content in JPEGs and makes it available for online search. The journey from visual unstructured content, to searchable structured content is enabled by a feature called cognitive search. This capability in Azure Search is now extended with the addition of a knowledge store that saves enrichments for further exploration and analysis beyond search itself.

The knowledge store feature of Azure Search, available in preview, refers to a persistence layer in cognitive search that describes a physical expression of documents created through AI enrichments. Enriched documents are projected into tables or hierarchical JSON, which you can explore using any client app that is able to access Azure Storage. In Azure Search itself, you define the physical expression or shape of the projections in the knowledge store settings within your skillset.

Customers are using a knowledge store (preview) in diverse ways, such as to validate the structure and accuracy of enrichments, generate training data for AI models, and perform ad-hoc analysis of their data.

For example, the Metropolitan Museum of Art opened access to all images of public domain works in its collection. Enriching the artworks with cognitive search and the knowledge store allowed us to explore the latent relationships within the artworks on different dimensions like time and geography. Questions like how have images of family groups changed over time, or when were domestic animals included in paintings, are now answerable when you are able to identify, extract, and save the information in a knowledge store (preview).

With the knowledge store, anyone with an Azure subscription can apply AI to find patterns, insights, or create dashboards over previously inaccessible content.

What is the knowledge store (preview)?


Cognitive search is the enrichment of documents with AI skills before they are added to your search index. The knowledge store allows you to project the already enriched documents as objects (blobs) in JSON format or tabular data in table storage.

As part of your projection, you can shape the enriched document to meet your needs. This ensures that the projected data aligns with your intended use.

When using tabular projections, a knowledge store (preview) can project your documents to multiple tables while preserving the relationships between the data projected across tables. The knowledge store has several other features like allowing you to save multiple unrelated projections of your data.
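
The knowledge store itself is declared inside the skillset definition. The fragment below sketches what that section can look like, written as a Python dictionary that you could merge into the skillset JSON sent to the Azure Search REST API; the connection string, table names, and source paths are placeholders, and the exact shape should be confirmed against the preview documentation.

```python
# "knowledgeStore" section of a cognitive search skillset (preview).
# Enriched documents are projected both as related rows in Azure Table storage
# and as JSON blobs, so the same enrichments can feed Power BI and ad-hoc analysis.
knowledge_store = {
    "storageConnectionString": "<azure-storage-connection-string>",
    "projections": [
        {
            # Related tables: rows remain linked through generated keys.
            "tables": [
                {"tableName": "Artworks", "source": "/document/artworkShape"},
                {"tableName": "ArtworkKeyPhrases", "source": "/document/artworkShape/keyPhrases/*"},
            ],
            "objects": [],
        },
        {
            # A second, unrelated projection of the full enriched document as JSON blobs.
            "tables": [],
            "objects": [{"storageContainer": "enriched-artworks", "source": "/document"}],
        },
    ],
}
```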


Data visualization and analytics


Search enables you to find relevant documents, but when you're looking to explore your data for corpus-wide aggregations or want to visualize changes over time, you need your data represented in a form other than a search index.

Leveraging Power BI's integration with Azure tables gets your dashboard started with only a few clicks. To identify insights from the enriched documents over dimensions like time or space, simply project your enriched documents into tables and validate that Power BI recognizes the relationships; you should then have your data in a format that is ready to consume within the visuals.

When you create a visual, any filters work, even when your data spans related tables. As an example, the art dashboard was created on the open access data from the MET in the knowledge store and the Art Explorer site uses the search index generated from the same set of enrichments.


The Art Explorer site allows you to find artworks and related works, while the Power BI report gives you a visual representation of the corpus and allows you to slice your data along different dimensions. You can now answer questions like "How does body armor evolve over time?"

In this example, a knowledge store (preview) enabled us to analyze the data ad-hoc. In another example, we may for instance enrich invoices or business forms, project the structured data to a knowledge store (preview), and then create a business-critical report.

Improving AI models


A knowledge store (preview) can also help improve the cognitive search experience itself as a data source for training AI models deployed as a custom skill within the enrichment pipeline. Customers deploying an AI model as a custom skill can project a slice of the enriched data shaped to be the source of their machine learning (ML) pipelines. A knowledge store (preview) now serves as a validator of the custom skill as well as a source of new data that can be manually labeled to retrain the model. While the enrichment pipeline operates on each document individually, corpus level skills like clustering require a set of documents to act on. A knowledge store (preview) can operate on the entire corpus to further enrich documents with skills like clustering and save the results back in a knowledge store (preview) or update the documents in the index.


Friday, 12 July 2019

Introducing the new Azure Migrate: A hub for your migration needs

Moving on-premises apps and data to the cloud is a key step in our customers’ migration journey, and we’re committed to helping simplify that process. Earlier this year, we invited customers to participate in the preview of multiple new migration capabilities. Today, I am excited to announce the latest evolution of Azure Migrate, which provides a streamlined, comprehensive portfolio of Microsoft and partner tools to meet migration needs, all in one place.

With the general availability of Azure Migrate, including the new integrated partner experience, Server Assessment, Server Migration, Database Assessment, and Database Migration capabilities, we strive to make the cloud journey even easier for customers. Azure Migrate acts as a central hub for all migration needs and tools from infrastructure to applications to data. We are truly democratizing the migration process with guidance and choice.

New Azure Migrate integrated experience


The new experience provides you access to Microsoft and ISV tools and helps identify the right tool for your migration scenario. To help with large-scale datacenter migrations and cloud transformation projects, we’ve also added end-to-end progress tracking.


New features include:

◈ Guided experience for the most common migration scenarios such as server and database migration, data movement to Azure with Data Box, and migration of applications to Azure App Service

◈ Feature-based grouping and choice of Microsoft and partner tools for the typical phases of the migration process—discovery, assessment, and migration

◈ An integrated experience that ensures continuity and gives you a consistent view of your datacenter assets

Carbonite, Cloudamize, Corent, Device42, Turbonomic, and UnifyCloud are already integrated with Azure Migrate. 

Powerful Server Assessment and Server Migration capabilities


With our new Azure Migrate: Server Assessment service offering, in addition to discovery and assessment of VMware servers, you will now be able to:

◈ Perform large-scale VMware datacenter discovery and assessment for migration. Customers can now discover and assess 35,000 virtual machines (VMs). This is a tremendous scale improvement from the previous limit of 1,500 VMs.

◈ Perform large-scale Hyper-V datacenter discovery and assessment for migration. Customers can now profile Hyper-V hosts with up to 10,000 VMs. You can also bring all your inventory from VMware and Hyper-V in the same Azure Migrate project.

◈ Get performance-based rightsizing, application dependency analysis, migration cost planning, and readiness analysis for both VMware and Hyper-V. You don’t need any agents to perform discovery and assessment with Server Assessment.


Azure Migrate: Server Assessment is free to all Azure customers and will soon add support for physical server discovery and assessment.

Building on our current ability to perform migration of VMware, Hyper-V, Amazon Web Services (AWS), and Google Cloud Platform (GCP) virtual machines and physical servers to Azure, the new Azure Migrate: Server Migration enables:

◈ Agentless migration of VMware VMs to Azure in preview. When you opt to use the new agentless migration method for VMware VMs, you can use the same appliance for discovery, assessment, and migration. Onboard once and execute the entire process seamlessly. You also get OS-agnostic support to help you migrate any client or server OS, including Windows or Linux, that is supported on the Azure platform. This complements the generally available agent-based migration capability.

◈ Agentless migration of Hyper-V VMs to Azure and agent-based migration of physical servers and VMs running on Amazon Web Services or Google Cloud Platform to Azure.

◈ Simplified experience, similar to creating a virtual machine in Azure. The assessment recommendations automatically get applied to the VMs as you start migrating them, especially the rightsizing recommendations that help you optimize servers and save money. This feature works with assessments performed by Azure Migrate: Server Assessment or any integrated partners, such as Cloudamize and Turbonomic.

◈ No-impact migration testing that helps you plan your migration with confidence. You also get zero data loss when you move your applications to Azure.


Azure Migrate: Server Migration is free to all Azure customers. You only pay for the compute and storage that you consume in your Azure subscription.

Geographic availability


The Azure Migrate experience, including Server Assessment, Server Migration, and our integrated set of Microsoft and partner tools, are available starting today in United States, Europe, Asia, and the United Kingdom. You can start by creating an Azure Migrate project in a geography of your choice. We will ensure that metadata associated with your Microsoft and partner scenarios is retained in an Azure datacenter in the geography that you select. Later this month, customers will be able to create their Azure Migrate projects in Australia, Canada, and Japan. You can use a project in any geography to perform migrations to any Azure region of your choice.

Thursday, 11 July 2019

Two ways to share Azure Advisor recommendations

If your IT organization is like most, you probably work with many different people across many different teams. When it comes to common IT tasks like optimizing your cloud workloads, you might need to interact with several resource owners or even complete a formal review process.

That’s why with Azure Advisor, we’ve made it easy to share recommendations with other people across your teams so you can follow best practices that help you get the most out of Azure. Advisor is a free Azure service that helps you optimize your Azure resources for high availability, security, performance, and cost by providing personalized recommendations based on your usage and configurations.

Here are two ways you can share your Advisor best practice recommendations with your teams.

1. Export a PDF or CSV of your Advisor recommendations


Probably the simplest way to share your Advisor recommendations is by exporting an Advisor recommendation report as a PDF or CSV through the Advisor UI in the Azure portal.


This report shows a summary of your Advisor recommendations by category, subscription, and potential business impact. Then you can easily share it with other teams so the resource owners can take action and optimize their resources for high availability, security, performance, and cost.


If you want to provide a specific view of a subset of your recommendations, you can use the UI filters or drill down into specific categories and recommendations. The recommendation report will only contain what you see on the screen when you generate it, which can help you focus on the most critical optimizations.

2. Use the Advisor API to integrate with your ticketing system or dashboards


The other way to share your Advisor recommendations with other people in your organization is via the Advisor REST API. Using this API, you can connect Advisor with your organization’s ticketing system and assign remediation work, set up an internal working dashboard your teams can review and action, or leverage Advisor’s recommendation data any way you choose.
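
As a starting point for such an integration, the sketch below lists recommendations for a subscription through the Advisor REST API, using azure-identity for a token and the requests package; the subscription ID is a placeholder and the api-version should be checked against the current Advisor REST reference.

```python
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "<subscription-id>"  # placeholder

# Acquire an Azure Resource Manager token.
credential = DefaultAzureCredential()
token = credential.get_token("https://management.azure.com/.default").token

# List Advisor recommendations for the subscription.
url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    "/providers/Microsoft.Advisor/recommendations"
)
response = requests.get(
    url,
    headers={"Authorization": f"Bearer {token}"},
    params={"api-version": "2017-04-19"},
)
response.raise_for_status()

# Each recommendation carries a category (Cost, Security, HighAvailability,
# Performance), an impact, and a short problem description - enough to open a ticket.
for rec in response.json().get("value", []):
    props = rec.get("properties", {})
    print(props.get("category"), "-", props.get("shortDescription", {}).get("problem"))
```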


Integrating the Advisor API with your ticketing application is just one way to share Advisor recommendations with your teams. Some setup is required, but once this scenario is complete, you can start remediating your recommendations more programmatically, which will save you time as you optimize your resources.

This more advanced approach tends to work best for larger organizations, organizations managing a large number of Azure subscriptions and resources that are generating a large number of recommendations, and organizations that have a fairly sophisticated IT practice in place, since it scales well with the size of your deployments.