Thursday, 29 September 2022

RoQC and Microsoft simplify cloud migration with Microsoft Energy Data Services


The vast amount of data in energy companies slows down their digital transformation. Together with RoQC solutions, Microsoft Energy Data Services will accelerate your journey in democratizing access to data by providing an easy-to-deploy managed service fully supported by Microsoft.

Managing large data sets is complicated, and few industries have larger and more complex data sets than the energy industry. Data complexity and large investments in on-premises storage solutions and multitudes of computer systems prevent the transition to cloud-based sub-surface data management. A single company can have tens of petabytes of structured and unstructured data, which, if not quality-assured, can drive costs up sharply.

Solutions from RoQC, a Norwegian software company, clean up structured data for energy companies. This makes data management more efficient from a time and cost perspective, and also makes decision-making more reliable.

With Microsoft Energy Data Services, energy companies can leverage new cloud-based data management capabilities provided by RoQC and Microsoft.

Microsoft Energy Data Services is a data platform, fully supported by Microsoft, that enables efficient data management, standardization, liberation, and consumption in energy exploration. The solution is a hyperscale data ecosystem that combines the capabilities of the OSDU™ Data Platform and Microsoft's secure, trusted cloud services with our partners' extensive domain expertise.

"Through machine learning, our software gives energy companies complete control of their data and assets. When the amounts of data are reduced, we eliminate uncertainty and duplication, and optimize the quality of the data sets. Traditionally a petrophysicist might spend a day or two cleaning up the logs for one well before they can be used for detailed analysis—with RoQC LogQA the same petrophysicist can clean hundreds of thousands of logs in the same timeframe. By cooperating with one of the largest platform providers in the world, we gain access to technology, competency, and markets it would be hard for us to get otherwise."—Bjørn Thorsen, CEO of RoQC.

New possibilities through cooperation


RoQC, a certified independent software vendor with Microsoft, has been able to expand its technology globally through the partnership.

Partner development manager for Microsoft Norway, Ole Christian Smerud, assures that the cooperation is mutually beneficial. "As a platform provider, we depend on strong partners to give our customers the best solutions. While we provide a platform, cloud competency, and access to an ecosystem for RoQC, they bring domain knowledge and relevance to their industry," he says.

Save millions with better data


RoQC believes that the energy industry struggles to take the step into the cloud simply because of data complexity and because most companies lack control over their data. By qualifying and quantifying data sets and by identifying and deleting duplicates, RoQC Tools can reduce the data set size, with commensurately dramatic savings in storage costs.

By reducing the amount of data by 10 to 30 percent, we’re talking millions of dollars in savings. The bigger the organization, the bigger the effect.

RoQC Tools are designed primarily so that data managers can perform usually time-consuming tasks as efficiently as possible. Very often they can complete, in a minute or two, a task that would otherwise take months. Sometimes the tasks would not be possible at all without the tools.
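As a simplified illustration of the duplicate identification described above (this is a generic content-hashing sketch, not RoQC's actual algorithm, which also handles near-duplicates and metadata), exact duplicates can be grouped by hashing each record's payload:

```python
import hashlib
from collections import defaultdict

def find_duplicates(records):
    """Group records by a content hash; any group with more than one
    entry is a set of exact duplicates that could be deleted to save
    storage. Illustrative only."""
    groups = defaultdict(list)
    for record_id, payload in records.items():
        digest = hashlib.sha256(payload.encode("utf-8")).hexdigest()
        groups[digest].append(record_id)
    return [ids for ids in groups.values() if len(ids) > 1]

# Hypothetical well-log records; two are byte-identical copies.
wells = {
    "well-001":      "depth,gr\n100,45.2\n101,46.1",
    "well-001-copy": "depth,gr\n100,45.2\n101,46.1",
    "well-002":      "depth,gr\n200,51.0\n201,50.3",
}
print(find_duplicates(wells))  # → [['well-001', 'well-001-copy']]
```

At petabyte scale, removing even a 10 to 30 percent duplicate fraction found this way translates directly into the storage savings the article describes.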

There is an obvious and well-documented correlation between increasing the quality of your data and reducing the risk of decisions based on that data. Geoscientists and project leaders in this field make decisions worth millions, maybe billions. You don’t want to make a decision of that magnitude based on insufficient or weak data.

RoQC believes the energy companies' data is the key to shifting away from fossil resources. In their data sets, subsea energy companies hold knowledge of "everything" about the ocean floor and the subsurface beneath it.

"Minerals from the ocean floor and sub-surface might be the next big thing for subsea oil-dependent nations like Norway. It is an already overused statement, but data is literally the new oil for this industry," says Bjørn Thorsen.

Preparing efficient data migration


RoQC provides both tools and consultants to enable a client to prepare their data prior to migrating the data to Azure. This preparation can include everything from simply identifying and removing duplicates to developing and implementing standards and then cleaning the data to comply with the standards. These preparations can be done directly in the clients’ normal (e.g., Halliburton/Schlumberger) interpretation platforms.

Furthermore, RoQC’s LogQA provides extremely powerful native, machine learning–based QA and cleanup tools for log data once the data has been migrated to Microsoft Energy Data Services, an enterprise-grade OSDU Data Platform on the Microsoft Cloud.

LogQA monitors the quality of the well log data that a client has stored on the OSDU Data Platform. LogQA was partially developed in collaboration with Microsoft as part of Microsoft Energy Data Services, and it is kept current with the latest OSDU Data Platform APIs, versions, and schemas.

As LogQA is native to the Microsoft Cloud infrastructure, no customer deployment is required before a customer can use LogQA to monitor, identify, and rapidly rectify data quality issues. LogQA is designed to work with typical energy-industry client datasets, which can contain millions of well logs.
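To make the kind of log QA described above concrete, here is a deliberately toy spike detector using a z-score threshold; it stands in for, and is much simpler than, the machine learning methods LogQA applies at scale:

```python
def flag_spikes(samples, threshold=3.0):
    """Return indices of samples deviating from the mean by more than
    `threshold` standard deviations. A toy stand-in for ML-based well
    log QA, shown only to illustrate the problem shape."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    std = var ** 0.5
    if std == 0:
        return []
    return [i for i, x in enumerate(samples) if abs(x - mean) / std > threshold]

# Hypothetical gamma-ray curve with one corrupt reading at index 50.
gamma_ray = [45.0] * 50 + [999.0] + [45.0] * 49
print(flag_spikes(gamma_ray))  # → [50]
```

A petrophysicist cleaning logs by hand performs this judgment curve by curve; automating it is what allows hundreds of thousands of logs to be processed in the time one well used to take.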

How to work with RoQC Solutions on Microsoft Energy Data Services


For access to RoQC solutions, reach out to Bjørn Thorsen, CEO, RoQC Data Management AS, Norway at Bjorn@roqc.no.

Source: microsoft.com

Tuesday, 27 September 2022

EPAM and Microsoft partner on data governance solutions with Microsoft Energy Data Services


The energy industry creates and consumes large amounts of highly complex data for key business decisions, like where to drill the next series of wells, how to optimize production, and where to lease the next big field. Despite good intentions, the industry is still plagued by large quantities of data that are inconsistent in location, quality, and format—much of which cannot reliably be found or used when needed. Even when the data is reliable, it can be locked into application-specific data stores that limit its use. The solution to this dilemma is multi-faceted and increasingly includes cloud technology, the OSDU™ Data Platform, modern applications, and data governance focused on people and their business processes.


Microsoft Energy Data Services is a data platform, fully supported by Microsoft, that enables efficient data management, standardization, liberation, and consumption in energy exploration. The solution is a hyperscale data ecosystem that combines the capabilities of the OSDU Data Platform and Microsoft's secure, trusted cloud services with our partners' extensive domain expertise.

Cloud and the OSDU Data Platform


Cloud-based computing is the future—scalable, reliable, secure storage and compute capabilities, all managed for you with many powerful add-on capabilities at your fingertips. For the energy industry, the Open Group® OSDU Data Platform is rapidly emerging as the standard—an open source, cloud-based data platform that unlocks data from applications and provides standard data schemas and access protocols, enabling both data governance and rapid innovation.
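The "standard data schemas and access protocols" mentioned above take a concrete form in the OSDU search service, which accepts JSON queries against well-known record kinds. The sketch below builds such a request body; the kind string and query syntax follow the public OSDU search conventions, but exact endpoint paths and partition settings vary by deployment, so treat the specifics as assumptions:

```python
import json

def build_well_search(query_text, limit=10):
    """Build an illustrative OSDU search request body. The kind
    pattern and query syntax are assumptions based on public OSDU
    search conventions; adjust for your deployment."""
    return {
        "kind": "osdu:wks:master-data--Well:*",  # wildcard over schema versions
        "query": query_text,
        "limit": limit,
    }

payload = build_well_search('data.FacilityName:"NPD-1234"')
print(json.dumps(payload, indent=2))
```

A client would POST this body (with authentication and a data-partition header) to the platform's search endpoint; the point is that the same query works against any compliant OSDU deployment, which is exactly the lock-in the standard removes.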

One of the things that EPAM discovered when delivering app developer boot camps and deploying the platform for itself and for clients was the platform's high level of complexity. In those early days, platform deployment was a multi-step process, with each service deployed and validated separately, taking up to a week. Before we could move on to solving business problems, part of our work was to guide our clients through various technical deployment obstacles. In addition, it took another several days to ingest pre-formatted sample data in order to test the platform with real data. Not anymore.

Microsoft Energy Data Services


Microsoft has made the OSDU Data Platform enterprise-ready, pre-bundled with the capabilities needed to optimize energy company data value using the Microsoft Cloud. EPAM has seen its benefits. As an enterprise-grade platform, Microsoft Energy Data Services offers nearly single-click deployment. Deployment time has been reduced significantly: what previously took multiple days now takes about 45 minutes. Similarly, the time to ingest the sample data has dropped from one week to around one hour. In addition, the management layer surrounding the platform provides the reliability, stability, security, tools, performance, and SLAs needed by large enterprises such as major energy companies.

Data governance and modern applications


As noted before, excellent infrastructure alone does not magically solve all data and business problems. With Microsoft Energy Data Services providing a solid foundation with which to store data, process data, and build and host cloud-native apps aligned with the OSDU Technical Standard, what remains to empower a data-driven organization is modern applications and data governance.


It is a daunting task to manually track the manifold ways that data enters the company, the many places it is stored, and the many ways it is consumed, enriched, and duplicated. Improving this requires a team who can map out the detailed way in which all of this happens today. It also takes modern digital tools to automate the aggregation, parsing, quality assessment, and lineage-tracking of the data. It takes people with a broad and deep view to accomplish this for large organizations—people who understand the business, the data types, the technology, and how to provide the right data, in the right formats, in the right place, at the right time, to the right people. That includes application connectors and analytical applications themselves designed for the modern cloud environment so that liberated data can move back and forth to users seamlessly.

How to work with EPAM on Microsoft Energy Data Services


EPAM brings industry knowledge, technical expertise, tools, frameworks, relationships with software vendors, and world-class delivery built on the Microsoft Energy Data Services platform. EPAM has developed a document extraction and processing system (DEPS) accelerator, which provides capabilities to facilitate the development of customizable workflows for extracting and processing unstructured data in the form of scanned or digitized document formats. DEPS is powered by Azure AI/machine learning and deep learning algorithms. It includes pluggable sub-systems for customization and uses machine learning pre- and post-processors, validation and extensions for UI review, automated machine learning model training, manual labeling, and analytics capabilities to improve classic optical character recognition (OCR) and text extraction accuracy. DEPS can be adapted to process numerous data types covering image and text, PDF, XLS, ASCII, and other file formats.
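The "pluggable sub-systems" idea behind an accelerator like DEPS can be sketched as a chain of interchangeable stages, each transforming a document record. The stage names and interfaces below are hypothetical illustrations, not EPAM's actual API:

```python
class Stage:
    """One pluggable step in a document-processing workflow."""
    def run(self, doc):
        raise NotImplementedError

class Classifier(Stage):
    """Toy classifier: tag the document kind from its text."""
    def run(self, doc):
        doc["kind"] = "well_report" if "well" in doc["text"].lower() else "other"
        return doc

class FieldExtractor(Stage):
    """Toy extractor: pull tokens that look like well identifiers."""
    def run(self, doc):
        doc["wells"] = [t for t in doc["text"].split() if t.startswith("W-")]
        return doc

def process(doc, stages):
    for stage in stages:
        doc = stage.run(doc)
    return doc

result = process({"text": "Scanned well report for W-1021"},
                 [Classifier(), FieldExtractor()])
print(result["kind"], result["wells"])  # → well_report ['W-1021']
```

In a real pipeline the classifier and extractor stages would wrap OCR output and trained models; the pluggable-stage structure is what lets workflows be customized per document type.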

Microsoft Energy Data Services is an enterprise-grade, fully-managed, OSDU Data Platform for the energy industry that is efficient, standardized, easy to deploy, and scalable for data management—for ingesting, aggregating, storing, searching, and retrieving data. The platform will provide scale, security, privacy, and compliance expected by our enterprise customers. EPAM offers services providing the right data, in the right formats, in the right place, at the right time, to the right people, which includes application connectors and analytical applications, with data contained in Microsoft Energy Data Services.

Source: microsoft.com

Saturday, 24 September 2022

Cegal and Microsoft break down data silos and offer open collaboration with Microsoft Energy Data Services


The vast amount of applications and data in energy companies across isolated environments is exposing inefficiencies in collaboration. Together with Cegal Cetegra, Microsoft Energy Data Services will accelerate your journey toward seamless access to the data and applications you need for your day-to-day work by providing an easy-to-deploy managed service fully supported by Microsoft.

Cegal has been successfully collaborating with Microsoft and partners to help evaluate the new Microsoft Energy Data Services preview program, an enterprise-grade OSDU Data Platform powered by the cloud.

With Microsoft Energy Data Services, energy companies can leverage new cloud-based data management and collaboration capabilities provided by Cegal and Microsoft. 

Microsoft Energy Data Services is a data platform, fully supported by Microsoft, that enables efficient data management, standardization, liberation, and consumption in energy exploration. The solution is a hyperscale data ecosystem that combines the capabilities of the OSDU Data Platform and Microsoft's secure, trusted cloud services with our partners' extensive domain expertise.

Cegal and Microsoft create collaborative cloud-based applications on Microsoft Energy Data Services


As an ISV and a specialist systems integrator for the energy industry, Cegal has always seen great value in removing data silos to free organizations from dated constraints that can lead to lower productivity. Opening up universal access to data, one of the most critical assets in any organization, is an obvious path to innovation. Integrating proprietary IP in existing workflows, contextualizing data through new AI-based routines, and integrating best-of-breed applications to create new and innovative solutions are critical steps toward more efficient and productive operations.

To achieve this goal, Cegal and Microsoft closely collaborated over several months, during which multiple relevant use cases have been thoroughly assessed and tested on the Microsoft Energy Data Services platform. From operating the platform to developing new solutions on top of it, Cegal had the opportunity to put in context a wide range of scenarios, making sure the experience was as extensive as possible yet realistic for the energy industry.

Cegal recently released Cetegra, a cloud-based platform offering its users a modern, collaborative environment uniquely designed to cater to energy industry–specific needs in digitalization and data management. Deployed through a fully scalable, pay-as-you-go model, Cetegra leverages the strengths of the Microsoft Cloud and will provide full support for the Microsoft Energy Data Services platform. Cetegra with Microsoft Energy Data Services delivers a one-stop shop for all types of data and applications tightly linked to OSDU, offering energy players a comprehensive integration of their application portfolios and allowing them to develop and test new apps within the Cetegra Innovation Space without impacting existing business operations.

With Microsoft Energy Data Services entering preview, Cegal looks forward to delivering operational support for the platform. As a global specialist in digitalization, capitalizing on years of experience within the energy sector, Cegal represents the partner of choice to support and guide energy players as they navigate their digital transformation journey. 

How to work with Cegal Solutions on Microsoft Energy Data Services


Microsoft Energy Data Services is an enterprise-grade, fully-managed, OSDU Data Platform for the energy industry that is efficient, standardized, easy to deploy, and scalable for data management—for ingesting, aggregating, storing, searching, and retrieving data. The platform will provide the scale, security, privacy, and compliance that are expected by our enterprise customers. The platform offers out-of-the-box compatibility with Cegal Cetegra, a cloud-based platform offering a modern, collaborative environment, with data contained in Microsoft Energy Data Services.

Source: microsoft.com

Thursday, 22 September 2022

Azure Payment HSM achieves PCI PIN certification offering customers secure digital payments solutions in the cloud


Today we’re announcing that Azure Payment HSM has achieved Payment Card Industry Personal Identification Number (PCI PIN) certification, making Azure the first hyperscale cloud service provider to obtain this certification.

Financial technology has rapidly disrupted the payments industry and securing payment transactions is of the utmost importance. Azure helps customers secure their critical payment infrastructure in the cloud and streamlines global payments security compliance. Azure remains committed to helping customers achieve compliance with the Payment Card Industry’s leading compliance certifications.

Enhanced security and compliance through Azure Payment HSM


Azure Payment HSM is a bare metal infrastructure as a service (IaaS) that provides cryptographic key operations for real-time payment transactions in Azure. The service empowers financial institutions and service providers to accelerate their digital payment strategy through the cloud. Azure Payment HSM is certified across stringent security and compliance requirements established by the PCI Security Standards Council (PCI SSC) including PCI DSS, PCI 3DS, and PCI PIN and offers HSMs certified to FIPS 140-2 Level 3 and PCI HSM v3.

Azure Payment HSM enables a wide range of use cases. These include payment processing for card and mobile payment authorization and 3D-Secure authentication; payment credential issuing for cards, wearables, and connected devices; securing keys and authentication data for POS, mPOS, Remote key loading, PIN generation, and PIN routing; sensitive data protection for point-to-point encryption, security tokenization, and EMV payment tokenization.

Azure Payment HSM is designed to meet the low-latency and high-performance requirements of mission-critical payment applications. The service comprises single-tenant HSMs offering customers complete remote administrative control and exclusive access. HSMs are provisioned and connected directly to users’ virtual networks, and they are under users’ sole administrative control. HSMs can be easily provisioned as a pair of devices and configured for high availability.

Azure Payment HSM provides great benefits for both payment HSM users with a legacy on-premises HSM footprint and those new payment ecosystem entrants who may choose a cloud-native approach from the outset. The customer could be a payment service provider acting on behalf of multiple financial institutions or a financial institution that wishes to directly access the Azure Payment HSM.

Leverage Azure Payment HSM PCI PIN certification


PINs are used to verify cardholder identity during online and offline payment card transactions.

The PCI PIN Security Standard contains requirements for the secure management, processing, and transmission of PIN data and applies to merchants and service providers that store, process, transmit, or can impact the security of PIN data.
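To make "PIN data" concrete: in transit, a PIN is normally carried as an ISO 9564 PIN block, and the format-0 construction XORs a PIN field with a field derived from the card number. The sketch below computes only the clear block for illustration; in a real payment flow this value exists only inside the HSM, which immediately encrypts it under a PIN-encryption key:

```python
def iso9564_format0_pin_block(pin, pan):
    """Build a clear ISO 9564-1 format-0 PIN block.
    For illustration only: production systems never expose a clear
    PIN block outside the HSM boundary."""
    # PIN field: control nibble 0, PIN length, PIN digits, padded with F.
    pin_field = f"0{len(pin)}{pin}".ljust(16, "F")
    # PAN field: 0000 + rightmost 12 PAN digits, excluding the check digit.
    pan_field = "0000" + pan[:-1][-12:]
    # XOR the two fields nibble by nibble.
    return "".join(
        f"{int(a, 16) ^ int(b, 16):X}" for a, b in zip(pin_field, pan_field)
    )

print(iso9564_format0_pin_block("1234", "1111222233334444"))
# → 041226DDDCCCCBBB
```

Binding the block to the PAN this way is what the PCI PIN requirements on secure PIN "management, processing, and transmission" protect: the same PIN on a different card yields a different block.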

Azure Payment HSM customers can reduce their compliance burden by leveraging Azure’s PCI PIN Attestation of Compliance (AOC) which addresses Azure’s portion of responsibility for each PCI PIN requirement and contains the list of certified Azure regions. The Azure Payment HSM Shared Responsibility Matrix is also available to help customers significantly reduce time, effort, and cost during their own PCI PIN assessments by simplifying the compliance process.

Source: microsoft.com

Tuesday, 20 September 2022

Azure API for FHIR and Microsoft’s Power Platform help universities tackle COVID-19

When summer 2021 ended, many organizations faced the formidable challenge of how to return to their places of work and school safely. Tuskegee University (Tuskegee) was one of them—not only was the safe return of students and faculty to school a priority but since Tuskegee is in a community with no hospital, controlling exposure was essential.

At the time, Tuskegee encountered many challenges, including but not limited to a statewide shortage of testing kits, inability to handle the broadscale logistics of testing, contact tracing, and figuring out a simplified way to report on status and schedule tests.

With the help of industry partners like Microsoft and Enabling Technologies, Tuskegee was able to build and execute a successful strategy to allow students to return to school safely amidst COVID-19.

Public health specialist Crystal James stepped up to address the challenge that Tuskegee faced. "We realized that to get back to face to face, we needed to have a strategy to protect the learning environment," she recalled. She expanded her responsibilities as Department Chair in the College of Veterinary Medicine to include a new position: the Special Assistant to the President for COVID-19 Response.

Lack of testing kits


The first hurdle faced was the lack of test kits. According to Crystal, the COVID-19 Recovery Management Center (CRMC) studied the test kits and decided to make their own kits. The Tuskegee CRMC is composed of laboratory scientists, policy specialists, nurses, and public health professionals. They made test kits for the faculty, staff, students, and the surrounding Black Belt Counties who lacked access to tests.

Handling the logistics of testing


With their campus laboratories ready to make the kits, the industry came to Tuskegee’s aid. Thermo Fisher Scientific generously offered the instrumentation for Tuskegee to do PCR testing on campus. In February 2021, Tuskegee University opened its own Clinical Laboratory Improvement Amendments (CLIA) certified reference laboratory and began PCR testing for COVID-19. They tied their testing systems to Thermo Fisher’s for processing.


An easy-to-use app


With test kits and the ability to provide results on campus, Tuskegee University Health Disparities Diagnostic Center addressed the next step—developing a mechanism to deliver the test results back to individuals promptly and contact tracing on campus.

Tuskegee found Microsoft’s offerings and solutions to be the right fit—using Microsoft Power Platform, Dataverse, and Azure API for FHIR, the institution had tools to schedule tests, send test results back, and attest to health status while on campus.

Building an app that was simple to use was paramount. The app, built using Microsoft’s Power Platform, allowed students and faculty to:

◉ Report and screen for symptoms—keeping track of symptoms and questionnaire responses by completing daily self-attestations from a smartphone or desktop.

◉ Create daily passes—after completing the daily self-attestation, the application generates a unique daily QR code. That code is scanned to gain access to campus facilities.

◉ Manage testing—designated personnel can manage student appointments, questionnaire responses, and test result notifications.
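One common way to implement a unique, non-forgeable daily pass like the one described above (the Tuskegee app's actual scheme is not documented here, so this is an assumed design) is to derive a short token from a server-side secret, the user id, and the date, then render that token as the QR code:

```python
import hmac
import hashlib
import datetime

SECRET = b"server-side-secret"  # hypothetical key, kept on the server

def daily_pass(user_id, date):
    """Derive a short daily token; same user + same date → same token,
    but it cannot be forged without the secret."""
    msg = f"{user_id}:{date.isoformat()}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()[:16]

def verify(user_id, date, token):
    return hmac.compare_digest(daily_pass(user_id, date), token)

today = datetime.date(2021, 9, 1)
token = daily_pass("student-42", today)
print(verify("student-42", today, token))   # valid on the day issued
print(verify("student-42", today + datetime.timedelta(days=1), token))  # expired
```

Because the date is part of the signed message, yesterday's code stops scanning at midnight with no server-side revocation list needed.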

You can find the information about the implementation at Golden Tiger Health Check | Tuskegee University.


Integration assistance from Enabling Technologies


Microsoft Gold Partner Enabling Technologies (Enabling) led the coordinated development efforts on behalf of Tuskegee. Enabling also rolled out the Return to School app at Lake Washington School District, Kent School District, and Howard University.

"Enabling Technologies provided the technical expertise to create a new system that would talk to the two existing systems," stated Crystal, "including CareEvolve and Thermo Fisher’s Platform for Science. Enabling helped with APIs, programming, and some other integration, which was helpful for non-IT people."

Microsoft technology provided a simple-to-use app and the integration protocols needed to handle the end-to-end testing and results.

Enabling architected the solution and used the Fast Healthcare Interoperability Resources (FHIR®) protocol within Azure API for FHIR to safeguard Tuskegee data. Azure API for FHIR was used to facilitate the movement of data between the application built on Power Platform and the testing lab software systems.
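Data moving through a FHIR service takes the form of standardized JSON resources. The sketch below builds a FHIR R4 Observation for a SARS-CoV-2 PCR result, the kind of payload the app and lab systems would exchange via Azure API for FHIR; the patient id and result text are example values, and the LOINC code shown is the one commonly used for respiratory SARS-CoV-2 NAA tests:

```python
import json

def covid_pcr_observation(patient_id, result_text):
    """Build an illustrative FHIR R4 Observation for a COVID-19 PCR
    result. Field choices are examples, not the Tuskegee schema."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": "94500-6",  # SARS-CoV-2 RNA by NAA with probe detection
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "valueCodeableConcept": {"text": result_text},
    }

obs = covid_pcr_observation("tu-0001", "Negative")
print(json.dumps(obs, indent=2))
```

A client would POST this resource to the service's `/Observation` endpoint with an OAuth token; because the shape is standard FHIR, the same payload works against the lab's systems and the Power Platform app alike.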

The rollout at Tuskegee and other Historically Black Colleges and Universities was aligned to Microsoft’s racial equity initiative. This solution was also deployed in support of Microsoft’s announcement to commit more than $110 million to support nonprofits, workers, and schools in Washington state.

Soothing the concerns of the community


With COVID-19 swirling, Tuskegee’s students, faculty, and staff had enough on their minds. Learning to use a new app could not be yet another burden. Enabling’s Adoption and Organizational Change team, led by Gabrielle Manuel, stepped in. "It was important to provide students, faculty, and staff with appropriate support materials and advance messaging to prepare them to begin using the app," she said. "The custom messaging, user guides, and videos provided clear instructions and expectations."

Tuskegee’s James advised, "One of the biggest issues that isn’t as well highlighted about the pandemic is the anxiety created when you have to engage in public spaces during this pandemic. Having a tool like this assists us to bring that anxiety level down to a manageable level and bring our students and faculty back to an environment we can call as safe as possible."

The results


The implemented solution through Microsoft’s Power Platform improved the safety of Tuskegee's students, faculty, and staff. The Return to School solution helped decrease the time from exposure to a confirmed PCR lab result to five hours. It also helped Tuskegee assure parents and students that there is a system to monitor trends every day, since RTS also publishes results to their dashboard. Tuskegee published the number of cases on campus, the number of tests conducted, and the percentage of positive cases, reducing the mental stress on potentially exposed individuals.

"Microsoft’s release was just in time," said Chris Stegh, CTO, Enabling Technologies. "The fact that the app could be activated in the existing Microsoft 365 tenant made the decision simple. Azure API for FHIR allowed the app to integrate with the university’s COVID-19 testing lab."


What’s next?


While Ms. James is optimistic, she’s also realistic. "While I know the rest of the world would like for us just call it over, that's not how pandemics work. We realize that COVID-19 will still be an issue that needs to be addressed on our campus. Because we are in an area that does not have a hospital and access to health care is very sparse, we want to continue to monitor the prevalence of COVID-19 on our campus. The app will help us pivot should another wave start around the country."

Source: microsoft.com

Thursday, 15 September 2022

Microsoft shares what's next in machine learning at NVIDIA GTC


Finding scalable solutions for today’s global challenges requires forward-thinking, transformative tools. As environmental, economic, and public health concerns mount, Microsoft Azure is addressing these challenges head on with high-performance computing (HPC), AI, and machine learning. The behind-the-scenes power for everything from MRI scans to energy management and financial services, these technologies are equipping customers and developers with innovative solutions that break through the boundaries of what’s possible in data and compute, paving the way for growth opportunities that span industries and applications around the world.

Microsoft Azure is committed to unlocking these new opportunities for our customers, providing the broadest range of NVIDIA GPUs at the edge, on-premises, in the cloud, and for hybrid environments.

At NVIDIA GTC we will demonstrate this commitment by showing how Azure’s advanced HPC capabilities and AI/machine learning in the cloud are driving transformation and making an impact together with NVIDIA’s latest technology.

Microsoft Azure’s collaboration with NVIDIA was developed with our customers in mind and focused on opening new doors to innovation with graphics processing unit (GPU) acceleration in the cloud.

Learn more by registering today for NVIDIA GTC, a free, online event running September 19 to 22, 2022.


Get a chance to win an NVIDIA Jetson Nano or swag box


In both of our sessions you have a chance to win a swag box complete with an HPC t-shirt and mug, or a Jetson Nano. Attend these sessions and don’t forget to look for the special link to enter!

Microsoft Sessions at NVIDIA GTC



Bala Venkataraman, Principal Program Manager, Microsoft.

Video on demand

Azure Machine Learning is committed to simplifying the adoption of its platform for training and production. In 2022, we announced the general availability of Azure Machine Learning CLI v2 and the preview of Azure Machine Learning Python SDK v2. Both launches demonstrate our continued focus on making workflows easier and managing their entire lifecycle, from training single jobs to pipelines and model deployments. In this session, learn about the key improvements in usability and productivity, and the new features that come with the command-line interface (CLI) and software development kit (SDK) v2 of Azure Machine Learning.
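The CLI v2 mentioned above drives jobs from declarative specs. The sketch below builds the shape of a command-job spec as a plain dict for illustration (it would normally be written as YAML and submitted with `az ml job create -f job.yml`); the compute target, environment name, and script path are placeholders, not values from this article:

```python
import json

# Illustrative Azure ML CLI v2 command-job spec; names marked below
# are placeholders you would replace with your own workspace assets.
job_spec = {
    "$schema": "https://azuremlschemas.azureedge.net/latest/commandJob.schema.json",
    "command": "python train.py --epochs 10",   # placeholder training script
    "code": "./src",                            # placeholder source folder
    "environment": "azureml:my-pytorch-env:1",  # placeholder environment
    "compute": "azureml:gpu-cluster",           # placeholder compute target
    "experiment_name": "train-demo",
}
print(json.dumps(job_spec, indent=2))
```

The same spec keys carry over from single jobs to pipeline steps, which is the lifecycle continuity the session description highlights.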



Sharmeelee Bijlani, Program Manager Azure Machine Learning, Microsoft; Razvan Tanase, Principal Engineering Manager Azure Machine Learning, Microsoft.

Wednesday, September 21, 10:00 to 10:50 AM PDT (1:00 to 1:50 PM EDT, 7:00 to 7:50 AM CEST)

In recent years, deep learning models have grown exponentially in size, demonstrating an acute need for customers to train and fine-tune them using large-scale data infrastructure, advanced GPUs, and an immense amount of memory. Fortunately, developers can now use simple training pipelines on Azure Machine Learning to train large models running on the latest multi-node NVIDIA GPUs. This session will describe these software innovations to customers through Azure Machine Learning (including a fully optimized PyTorch environment) that offers great performance and an easy-to-use interface for large-scale training. We’ll also highlight the power of Azure Machine Learning through experiments using 1,024 A100 Tensor Core GPUs to scale the training of a two-trillion-parameter model with a streamlined user experience at 1,000-plus-GPU scale.
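To put the 1,024-GPU figure above in perspective: A100 systems are typically deployed as 8-GPU nodes, so such a run spans 128 nodes, and distributed training frameworks identify each worker by a global rank derived from its node and local GPU index. A quick sketch of that arithmetic (node count and GPUs-per-node are an assumed, conventional layout):

```python
# Assumed conventional layout: 128 nodes x 8 A100 GPUs per node.
GPUS_PER_NODE = 8
NUM_NODES = 128

def global_rank(node_rank, local_rank):
    """Global worker rank as used by distributed training launchers."""
    return node_rank * GPUS_PER_NODE + local_rank

world_size = NUM_NODES * GPUS_PER_NODE
print(world_size)           # → 1024
print(global_rank(0, 0))    # → 0    (first GPU on first node)
print(global_rank(127, 7))  # → 1023 (last GPU on last node)
```

This rank/world-size bookkeeping is what a managed training service hides behind the "streamlined user experience" the session describes.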



Mary Howell, NVIDIA.

Wednesday, September 21, 3:00 to 3:30 PM PDT

In this GTC Watch Party, we will be replaying our Operationalize Large-Model Training on Azure Machine Learning using Multi-Node NVIDIA A100 GPUs session. Participants will be joined by experts from across Microsoft and NVIDIA who bring fresh insights and experiences to the table, taking the session to a whole new level of understanding. Interaction is core to our GTC Watch Parties, and we encourage you to join the discussion with any comments or questions. 



Gabrielle Davelaar, AI Technical Specialist, Microsoft; Maxim Salnikov, Senior Azure GTM Manager, Microsoft; Henk Boelman, Senior Cloud Advocate–AI and Machine Learning, Microsoft; Alexander Young, Technical Marketing Engineer, NVIDIA; Ulrich Knechtel, Microsoft Partner Manager (EMEA), NVIDIA.

Thursday, September 22, 2:00 to 3:30 PM CEST (5:00 to 6:30 AM PDT, 8:00 to 9:30 AM EDT)

In this GTC Watch Party, we will be replaying our Operationalize Large-Model Training on Azure Machine Learning using Multi-Node NVIDIA A100 GPUs session. Participants will be joined by experts from across Microsoft and NVIDIA who bring fresh insights and experiences to the table, taking the session to a whole new level of understanding. Interaction is core to our GTC Watch Parties, and we encourage you to join the discussion with any comments or questions.


Microsoft is helping customers across industries step up, transforming AI and machine learning at the edge


Nuance’s Dragon Ambient eXperience helps doctors document care faster with AI on Azure

Nuance developed an AI-based clinical solution that automatically turns doctor-patient conversations into accurate medical notes. Built with Azure and PyTorch, this solution saves doctors transcription time, reducing administrative burdens and helping them conduct more focused, higher-quality interactions with their patients.

Energy utility Elva builds a highly secure DevOps platform with Azure infrastructure and network security services

Elva looked to build a secure, cloud-first DevOps platform that could meet Norway’s data residency and compliance requirements, delivering automated services that would help develop network grid technology. Using Azure DDoS Protection, Azure Web Application Firewall, and Azure Kubernetes Service, Elva realized its goal, enhancing its in-house development and data integration capabilities. 

The Royal Bank of Canada creates personalized offers while protecting data privacy with Azure confidential computing

The Royal Bank of Canada (RBC) partnered with Microsoft to create a privacy-preserving multi-party data sharing platform built on Azure confidential computing. Called VCR, this solution enables RBC to personalize offerings and protect privacy at the same time, creating exceptional digital experiences that clients can trust.  


Recapping 2022 moments with Azure and NVIDIA technologies


Azure NC A100 v4-series

At Microsoft, our NC series virtual machines allow customers access to almost limitless AI hardware infrastructure so they can be productive quickly. Last summer, we leveled up, announcing the general availability of Azure NC A100 v4 series virtual machines. Powered by NVIDIA A100 80GB PCIe Tensor Core GPUs and 3rd Gen AMD EPYC™ processors, these virtual machines help our customers gain insights faster, innovate with speed, and do more with less, making them the most performant and cost-competitive NC series offering for a diverse set of workloads.

DeepSpeed on Azure

Azure Machine Learning uses large fleets of the latest NVIDIA GPUs powered by NVIDIA Quantum InfiniBand interconnects to tackle large-scale AI training and tuning. Last July, we announced a breakthrough in our software stack, using DeepSpeed and 1,024 NVIDIA A100 GPUs to scale the training of a two trillion parameter model with a streamlined user experience at 1,000 plus GPU scale. We are bringing these software innovations to you through Azure Machine Learning (including a fully optimized PyTorch environment) that offers great performance and an easy-to-use interface for large-scale training.

NVads A10 v5 virtual machines

Traditionally, graphics-heavy visualization workloads that run in the cloud require virtual machines with full GPUs that are both costly and inflexible. To combat this, we introduced the first GPU-partitioned (GPU-P) virtual machine offering in the cloud, and just last July, we announced the general availability of NVads A10 v5 GPU accelerated virtual machines. Azure is the first public cloud to offer GPU partitioning on NVIDIA GPUs, and our new NVads A10 v5 virtual machines are designed to offer the right choice for any workload and provide optimum configurations for both single-user and multi-session environments. Dig into our latest virtual machine innovation.

NVIDIA Jetson AGX Orin-powered edge AI devices now available


Microsoft is pleased to announce that the NVIDIA Jetson AGX Orin SoM is now powering Azure Certified edge devices from industry-leading device builders including AAEON, Advantech, and AVerMedia, along with the NVIDIA Jetson AGX Orin developer kit.

Developers and solution builders can now leverage powerful NVIDIA Jetson AGX Orin devkits and production modules with Microsoft Azure to create, deploy, and operate powerful AI solutions at the edge, accelerating product development and deployment at scale. The NVIDIA Orin Nano modules have set a new baseline for entry-level edge AI and robotics, building on the momentum behind the Jetson Orin platform worldwide. Stay tuned for new Jetson Orin NX and Orin Nano partner products launching to meet customer needs in AI solution development.

NVIDIA DLI training powered by Azure


We’re proud to host NVIDIA Deep Learning Institute (DLI) training at NVIDIA GTC again this year, with instructor-led workshops around accelerated computing, accelerated data science, and deep learning. Hosted on Microsoft Azure, these sessions empower you to leverage NVIDIA GPUs on the Microsoft Azure platform to solve the world’s most interesting and relevant problems. Register for a DLI workshop today.

Join us at NVIDIA GTC


In collaboration with NVIDIA, Microsoft delivers purpose-built AI, machine learning, and HPC solutions in the cloud to meet even the most demanding real-world applications at scale. Join us at NVIDIA GTC September 19 to 22, to see how every enterprise can leverage the power of GPUs at the edge, on-premises, in the cloud, and for hybrid solutions.

Source: microsoft.com

Tuesday, 13 September 2022

Dive deep into NAT gateway’s SNAT port behavior

In our last blog, we examined how network address translation (NAT) gateway mitigates connection failures at the same destination endpoint with its randomized source network address translation (SNAT) port selection and reuse timers. In addition to handling these scenarios, NAT gateway’s unique SNAT port allocation benefits dynamic, scaling workloads connecting to several different destination endpoints over the internet. In this blog, let’s dive into the key aspects of NAT gateway’s SNAT port behavior that make it the preferred solution for different outbound scenarios in Azure.

Why SNAT ports are important to outbound connectivity


For anyone working in a virtual cloud space, it is likely that you will encounter internet connection failures at some point. One of the most common reasons for connection failures is SNAT port exhaustion, which happens when the source endpoint of a connection runs out of SNAT ports to make new connections over the internet.

Source endpoints use ports through a process called SNAT, which allows destination endpoints to identify where traffic was sent and where to send return traffic. NAT gateway SNATs the private IPs and ports of virtual machines (VMs) within a subnet to NAT gateway’s public IP address and ports before connecting outbound, providing a scalable and secure means of outbound connectivity.


Figure 1: Source network address translation by NAT gateway: connections going to the same destination endpoint over the internet are differentiated by the use of different source ports.

With each new connection to the same destination IP and port, a new source port is used. A new source port is necessary so that each connection can be distinguished from one another. SNAT port exhaustion is an all too easy issue to encounter with recurring connections going to the same destination endpoint since a different source port must be used for each new connection.
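To make the per-connection bookkeeping concrete, here is a minimal Python sketch of a SNAT translation table. It is purely illustrative (the class, addresses, and port range are invented for the example) and is not how Azure implements NAT gateway:

```python
import random

class SnatTable:
    """Toy SNAT table: each distinct outbound flow gets its own
    public source port so return traffic can be matched back to
    the right private endpoint."""

    def __init__(self, public_ip, ports):
        self.public_ip = public_ip
        self.free_ports = set(ports)
        self.flows = {}  # (private_ip, private_port, dest) -> public port

    def translate(self, private_ip, private_port, dest):
        key = (private_ip, private_port, dest)
        if key not in self.flows:
            if not self.free_ports:
                raise RuntimeError("SNAT port exhaustion")
            port = random.choice(tuple(self.free_ports))
            self.free_ports.remove(port)
            self.flows[key] = port
        return (self.public_ip, self.flows[key])

snat = SnatTable("20.0.0.1", range(1024, 1034))
a = snat.translate("10.0.0.4", 50000, ("93.184.216.34", 443))
b = snat.translate("10.0.0.5", 50000, ("93.184.216.34", 443))
# Two VMs behind one public IP, same destination: same public IP,
# but distinct source ports distinguish the two connections.
assert a[0] == b[0] and a[1] != b[1]
```

Exhaustion falls out of the same model: once `free_ports` is empty, a new flow to the same destination has no port left to claim.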

How NAT gateway allocates SNAT ports


NAT gateway solves the problem of SNAT port exhaustion by providing a dynamic pool of SNAT ports, consumable by all virtual machines in its associated subnets. This means that customers don’t need to know the traffic patterns of their individual virtual machines, since ports are not pre-allocated in fixed amounts to each virtual machine. By providing SNAT ports on-demand to virtual machines, the risk of SNAT exhaustion is significantly reduced, which in turn helps prevent connection failures.


Figure 2: SNAT ports are allocated on-demand by NAT gateway, which alleviates the risk of SNAT port exhaustion. 

Customers can ensure that they have enough SNAT ports for connecting outbound by scaling their NAT gateway with public IP addresses. Each NAT gateway public IP address provides 64,512 SNAT ports, and NAT gateway can scale to use up to 16 public IP addresses. This means that NAT gateway can provide over one million SNAT ports for connecting outbound.
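The "over one million" claim is simple arithmetic on the two figures above (64,512 ports per public IP, up to 16 public IPs), which a quick sanity check confirms:

```python
# Scale arithmetic from the article: each NAT gateway public IP
# contributes 64,512 SNAT ports, and a NAT gateway can scale to
# use up to 16 public IP addresses.
PORTS_PER_IP = 64_512
MAX_PUBLIC_IPS = 16

total_ports = PORTS_PER_IP * MAX_PUBLIC_IPS
print(f"{total_ports:,}")  # -> 1,032,192
assert total_ports > 1_000_000  # "over one million SNAT ports"
```

(64,512 itself is the 65,536 ports of the TCP/UDP port space minus the first 1,024 reserved ports.)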

How NAT gateway selects and reuses SNAT ports


Another key component of NAT gateway’s SNAT port behavior that helps prevent outbound connectivity failures is how it selects SNAT ports. Whether connecting to the same or different destination endpoints over the internet, NAT gateway selects a SNAT port at random from its available inventory.


Figure 3: NAT gateway randomly selects SNAT ports from its available inventory to make new outbound connections.

A SNAT port can be reused to connect to the same destination endpoint. However, before doing so, NAT gateway places a reuse cooldown timer on that port after the initial connection closes.

NAT gateway’s SNAT port reuse cooldown timer helps prevent ports from being selected too quickly for connecting to the same destination endpoint. This is advantageous when destination endpoints have their own source port reuse cooldown timers in place.


Figure 4: SNAT port 111 is released and placed in a cooldown period before it can connect to the same destination endpoint again. In the meantime, port 106 (dotted outline) is selected at random from the available inventory of ports to connect to the destination endpoint. The destination endpoint has a firewall with its own source port cooldown timer. There is no issue getting past the on-premises destination’s firewall since the connection from source port 106 is new.

What happens then when all SNAT ports are in use? When NAT gateway cannot find any available SNAT ports to make new outbound connections, it can reuse a SNAT port that is currently in use so long as that SNAT port connects to a different destination endpoint. This specific behavior is beneficial to any customer who is making outbound connections to multiple destination endpoints with NAT gateway.
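Putting the random selection, same-destination cooldown, and cross-destination reuse rules together, the behavior can be sketched as a toy simulation in Python. The class, port range, and cooldown value are invented for illustration; this is a model of the rules described above, not Azure's implementation:

```python
import random

class NatGatewaySim:
    """Toy model of NAT gateway SNAT port behavior: random choice
    from the free inventory, a cooldown before a port is reused
    toward the same destination, and reuse of an in-use port toward
    a *different* destination when the inventory is exhausted."""

    def __init__(self, ports, cooldown=4):
        self.free = set(ports)
        self.cooldown = cooldown  # ticks before same-destination reuse
        self.cooling = {}         # (port, dest) -> ticks remaining
        self.in_use = {}          # port -> set of destinations

    def tick(self):
        # Advance time: expire cooldown timers.
        self.cooling = {k: t - 1 for k, t in self.cooling.items() if t > 1}

    def connect(self, dest):
        # Prefer a free port that is not cooling down for this destination.
        candidates = [p for p in self.free if (p, dest) not in self.cooling]
        if candidates:
            port = random.choice(candidates)
            self.free.discard(port)
        else:
            # All ports busy: reuse one not already talking to this dest.
            reusable = [p for p in self.in_use if dest not in self.in_use[p]]
            if not reusable:
                raise RuntimeError("SNAT port exhaustion")
            port = random.choice(reusable)
        self.in_use.setdefault(port, set()).add(dest)
        return port

    def close(self, port, dest):
        self.in_use[port].discard(dest)
        if not self.in_use[port]:
            del self.in_use[port]
            self.free.add(port)
        self.cooling[(port, dest)] = self.cooldown  # same-dest cooldown

gw = NatGatewaySim(ports=range(1024, 1026))
p1 = gw.connect("dest1")
p2 = gw.connect("dest1")   # both ports now in use toward dest1
p3 = gw.connect("dest2")   # inventory exhausted: reuse toward dest2
assert p1 != p2 and p3 in (p1, p2)
```

In the final line, the gateway has no free ports left, yet the connection to `dest2` still succeeds by sharing a port already in use toward `dest1`, mirroring Figure 5.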


Figure 5: When all SNAT ports are in use, NAT gateway can reuse a SNAT port to connect outbound so long as the port actively in use goes to a different destination endpoint. Ports in use by destination 1 are shown in blue. Port connecting to destination 2 is shown in yellow. Port 111 is yellow with a blue outline to show it is connected to destinations 1 and 2 simultaneously.

What have we learned about NAT gateway’s SNAT port behavior?


In this blog, we explored how NAT gateway allocates, selects, and reuses SNAT ports for connecting outbound. To summarize:

◉ SNAT port capacity: up to 16 public IP addresses, each providing 64,512 SNAT ports. Benefit: easy to scale for large and variable workloads.
◉ SNAT port allocation: dynamic and on-demand. Benefit: great for flexible, unknown, and large-scale workloads.
◉ SNAT port selection: randomized. Benefit: reduces the risk of connection failures to the same destination endpoint.
◉ SNAT port reuse: reuse to a different destination happens immediately; reuse to the same destination is set on a cooldown timer. Benefit: reduces the risk of connection failures when destination endpoints enforce source port reuse cooldown timers.

Deploy NAT gateway today


Whether your outbound scenario requires you to make many connections to the same or to several different destination endpoints, NAT gateway provides a highly scalable and reliable way to make these connections over the internet.

NAT gateway is easy to use and can be deployed to your virtual network with just a few clicks. Deploy NAT gateway today by following along with Create a NAT gateway using the Azure portal.

Source: microsoft.com

Sunday, 11 September 2022

Gain deeper insights with Microsoft Intelligent Data Platform

Data is foundational to any digital transformation strategy, yet many organizations struggle to understand what data they have, how to extract insights from it, and how to govern it—according to a 2022 Evanta survey, over half of Chief Data Officers (CDOs) struggle with siloed operating models when it comes to data sharing and democratization. According to Harvard Business Review, organizations that have embraced their data as a strategic asset have been better positioned to drive strategic differentiation and grow their revenue, but the fragmentation that exists today between databases, analytics, and governance is a common barrier to success.

The Microsoft Intelligent Data Platform empowers organizations to invest more time creating value rather than integrating and managing their data estate. It integrates best-in-class solutions across Microsoft’s technology stack, breaking down data silos and enabling organizations to extract real-time insights with the data governance needed to run the business safely.

“Shifting from a legacy on-premises data warehouse to Azure Synapse, supported by Datometry, has allowed us to virtualize the vast majority of our code without needing to repoint it. We have gained speed, performance, and agility while reducing costs and taken a big step forward in modernizing our enterprise data storage and management.”—Charlotte Lock, Director of Data, Digital & Loyalty at Co-op.


Added security and analytics features for the Azure data portfolio


The Microsoft Intelligent Data Platform features everything already available in the Azure data portfolio (Azure Data Factory, Azure Data Explorer, Azure SQL, Azure Cosmos DB, and more), as well as new products and features, including SQL Server 2022, Azure Synapse Link for SQL, Microsoft Purview Data Estate Insights, and Datamart in Power BI:


◉ SQL Server 2022, currently in preview, is the most secure database of the last decade, and is now integrated with Microsoft Purview and Azure Synapse Link, allowing for richer insights and governance from data at scale. SQL Server 2022 also comes with new features including AWS S3 support, Azure Active Directory authentication, and Query Store hints, as well as security improvements over SQL Server 2019.

◉ Azure Synapse Link for SQL, now in preview, offers real-time analytics for data stored in Azure Synapse Analytics and Azure SQL. It is an automated system that allows for replication of data from transactional databases (both SQL Server 2022 and Azure SQL Database) to a dedicated SQL pool in Azure Synapse Analytics. Azure Synapse Link features near real-time analytics, low-code/no-code solutions for replicating data, as well as minimal operational impact on source systems.

◉ Purview Data Estate Insights is an application that provides Chief Data Officers and other strategic leaders with a summary of their data estate and the risk associated with that data. Purview provides insights on data stewardship, inventory, curation, and governance through automatically generated reports which can be easily shared with stakeholders.

◉ Lastly, Datamart in Power BI allows analysts to access richer insights from their data sets through datamarts. Datamarts are self-service analytic solutions that help bridge the gap between business users and IT through a simple and optionally no-code experience. With datamarts, you can easily ingest and prepare data, add business semantics to data, manage and govern data, and build and share reports.

Real-world applications for businesses through real-time data


Let’s explore one example of how the Microsoft Intelligent Data Platform helped navigate supply chain issues:

Many operations companies conduct daily batch runs, where they must manually track their inventory levels and input the data at least once a day. With this method, these organizations cannot accurately predict how much product to sell and must err on the side of selling less to avoid running out of inventory. In times when supply chains are uncertain, this means companies miss out on even more sales.


With the Microsoft Intelligent Data Platform, companies can get real-time information on current inventory levels, rather than a daily report. They can also extract AI-driven insights based on demand spikes, shipping delays, and factory status that predict how many units will be available in a week’s time. This information is supported by the upgraded SQL Server 2022 as well as Azure Synapse Link for SQL, which allows more on-premises data to be extended to the cloud, analyzed, and used for decision making.


But what about using data for customer-facing solutions? The Microsoft Intelligent Data Platform leverages Azure Cosmos DB, providing consumers with recommendations for the best product based on real-time availability of units, delivery time, and compatibility with their needs. Consumers also have access to a support number powered by Power Virtual Agents; through conversational AI, consumers can get intelligent updates on their order status so they can get the information they need quickly.

Source: microsoft.com

Saturday, 10 September 2022

Azure private MEC—A thriving partner ecosystem

In June of 2021, Microsoft introduced Azure private multi-access edge compute (MEC), a solution that enables operators and system integrators (SIs) to deliver high-performance, low-latency connectivity, together with IoT applications at the edge, helping to deliver the next wave of digital transformation. This is made possible because Azure private MEC combines network functions, applications, and edge-optimized Azure services to deliver high-performance, ultra-low latency solutions.


This year at Mobile World Congress, Microsoft announced Azure Private 5G Core, which is available as part of Azure private MEC. Azure Private 5G Core enables deployment and management of 5G core network functions on an Azure Stack Edge (ASE) device, as part of an on-premises private mobile network for enterprises. The 5G core network functions connect with 4G and 5G radio access networks (RANs) and SIM solutions from a growing list of solution partners, providing secure, high performance, low latency connectivity. In this way, Azure Private 5G Core dramatically simplifies the deployment and operation of private networks for a variety of use cases.

With just a few clicks, organizations can easily deploy a customized set of selectable 5G core functions, and a variety of applications on a small edge-compute platform, at tens to hundreds of locations, within minutes. RAN deployments can also be managed from the cloud with common observability across the network. SIMs can be securely configured and deployed from the cloud without exposing them to manual operations. In this article, we highlight a few partners from within our ecosystem, and the solutions they’re developing across a variety of industries.

Microsoft partner ecosystem


As enterprises shift to the cloud, the demand for cloud solutions and services is at an all-time high. As a result, the partner ecosystem is integral to the Azure private MEC strategy. Microsoft’s technology partners represent a tremendous opportunity for collaboration while driving growth.

“A recent study shows that for every dollar our cloud generates in revenue in the local region, our local partner ecosystem generates as much as nine additional dollars. Our customers are using the power of the cloud to generate even more revenue. That means even more local economic success. That’s at the core of what we do.”— Satya Nadella, CEO of Microsoft

Operators and system integrator (SI) partners building certified Azure private MEC offerings get full access to our ecosystem of Industry 4.0 and network ISVs to develop complete solutions for a wide range of customer outcomes, simplifying procurement, deployment, management, and monetization of 5G edge for enterprise customers. Application ISVs benefit from a consistent onboarding experience with established tools and resources, while working with a fully featured platform to develop ultra-low latency applications. Partner solutions are then distributed via Azure Marketplace, easily scaling distribution.

Azure private MEC partner solutions


Our GTM partners are actively developing solutions that utilize their proven technology on Azure private MEC to help enterprises meet digital transformation goals. Below are a few recent examples of innovative solutions that have been created for the manufacturing, transportation, and logistics industries.

Video analytics enables simultaneous real-time quality inspection of 100 iterations

Accenture has developed a solution that improves quality monitoring processes for the manufacturing industry by leveraging AI and video analytics over 5G, using cameras and sensors to increase quality assurance testing throughput, improve defect detection, and boost overall productivity. 5G’s ultra-low latency and high throughput make it possible to have rapid access to high volumes of data, enabling manufacturers to use analytics to help identify defects and dynamically test based on real-time conditions, improving scrap rates and total cost of quality in the process.

“A large car manufacturing company wanted to take advantage of 5G technology to enhance their production quality, meet compliance effectively, and increase manufacturing plant efficiency. Partnering with Microsoft, Accenture leveraged video analytics asset Solution.AI for processing to build a solution that was developed using Azure Private MEC. One of the priority use cases was quality inspection. In this scenario, the solution captures images of the moving vehicles on the assembly line, processes the images in real-time using 5G network and solution, and alerts the monitoring team and allows the inspection team to analyze the data in real-time to facilitate faster remediation. The solution has enabled the automobile manufacturer to perform immediate quality inspection boosting their quality accuracy, enhanced safety, compliance, and production efficiencies. As a result, they can simultaneously visualize more than 100 iterations of the same model. It gives them a more flexible and efficient production line to better serve customers with their unique needs.”—Mark Rogers, Accenture Microsoft BG Communication & Media Lead

Protecting frontline workers’ safety with real-time video analytics

Avanade, a leading provider of digital, cloud, and advisory services works closely with enterprise clients and Microsoft to develop private network solutions to protect workers in hazardous environments, using real-time analytics. By leveraging 5G’s wide coverage and reliability characteristics, Avanade’s private networking solution provides the foundation for industries to transform their operations, addressing today’s production challenges around worker safety and enhanced operational processes.

“As we collaborate with our clients in preparing for 5G and emerging opportunities with next-generation connectivity and edge, a focus area has been worker effectiveness and safety. Many of our clients have frontline workers in mission-critical and dangerous locations like warehouses, ports, logistics facilities, and frontline mining, utility, and energy field sites and we have been working closely with Microsoft using private MEC services to bring worker safety as a service that leverages public and private wireless networks. Designed to enable reliable coverage in difficult areas, the solution provides a dashboard and operational tools for near real-time health and safety monitoring of workers, all delivered over a secure network from Microsoft's cloud to edge.”—Aaron Reich, Executive and Emerging Technologies lead at Avanade

Digital twins offer a bird’s eye view of how to improve worker safety

Cognizant, a company whose offerings help make buildings smarter, was striving to create a comprehensive, robust, and cost-effective solution to improve worker safety. Its holistic approach to enterprise private 5G networks includes design, implementation, and management, as well as helping customers implement innovative digital solutions using technologies such as AGV/AMR, Vision AI, AR/VR, and digital twins to solve their operational challenges. Private 5G provides the ultra-low latency, device scale, and reliability these applications need, which Wi-Fi networks cannot guarantee.

“Cognizant’s solution for worker safety for oil and gas vessels and manufacturing environments uses off-the-shelf cameras and innovative visual behavior sensing software from its partner to create a digital twin of physical space. The solution provides a bird eye view of the space including location of workers using multi-camera field of view and raises alerts when a worker is in close proximity of a pre-defined hazard zone. In addition, the solution will provide information such as workers entering into unauthorized locations, or workers crowding into a space which can trigger alerts to the facility’s safety officer. The solution is to be available on Azure Stack Edge for deployment and will operate in a standalone mode in case of no connectivity between the vessel and Azure cloud.”—Anjali Deo, AVP & CTO Industry+, Cognizant

Reducing manufacturing costs with private networks

With 5G, it is possible to have critical communications over private networks in real time, allowing these applications to operate reliably and securely. 5G opens up new fields of application in industries where the performance of other wireless technologies, such as Wi-Fi, previously fell short. Capgemini views this new communications standard as key to reducing costs and enhancing manufacturing processes in automated factory operations.

“By integrating and deploying 5G use cases based on private MEC, Capgemini is working closely with Microsoft to bring the end-to-end solutions facilitating the rapid deployment of private networks. Capgemini is innovating by providing customized Industry 4.0 use cases and business process modernization to support Intelligent Industries. Capgemini has a rich repository of industrial use cases leveraging its unique microservices library, which enables the acceleration of tailored use cases for its clients. As an example, Capgemini has developed Remote Assistance and guided procedure use case to support and help field operations and technicians for complex maintenance and repair in real-time. Using high definition 5G video calls, collaborative tools, and 3D procedures, the solution is integrating AR/VR and Object detection Software Components available off the shelf."—Fotis Karonis, Executive Vice President, Capgemini Group Leader 5G & Edge

Private 5G powers biosecurity to improve airport safety

Successfully restarting air passenger travel while restoring confidence in safety is vital to enabling the global economy’s recovery from COVID-19. HCL is taking advantage of private 5G to bring biosecurity to airports, speeding up passenger checkpoints like luggage handling and security screening. Biosecurity applications will limit necessary human interaction, helping airlines manage worker shortages, high travel demand, and increased health screening requirements.

“Our comprehensive approach for biosecurity at airports involves people tracking, crowd management, and location awareness using the Microsoft private MEC and HCL’s homegrown solution, UnlocSafe. This implementation has proven how 5G addresses integration and operational challenges in the Airport 4.0 roadmap.”—Srinivas Yerramilli, Director, HCL Technologies

Container condition monitoring creates smarter ports via 5G networks

Tech Mahindra’s “PORTNxT” portfolio of 5G and edge solutions leverages in-house expertise and partner platforms for MEC, analytics, and AI, along with solutions on Private LTE (CBRS), 5G and SDN/NFV to drive the next generation of shipping ports.

“Tech Mahindra’s ContainerSight, an AI-based video analytics solution deployed over cellular networks, helps logistics firms and port authorities generate real-time insight into a container's condition at the time of entry and exit from their facilities, eliminating false claims and avoiding reputation loss.”—Ranga Thittai, Vice President & Head Enterprise Network Service at Tech Mahindra

Looking ahead


As we move forward with Azure private MEC, we continue to work with our partners, helping them identify and validate industry use cases, while developing and distributing solutions to enterprises via Azure Marketplace and sales motions. This effort remains at the core of our strategy, with the above use cases validating the importance of our partner ecosystem and its ability to positively impact a broad array of industries. The insights gained from our partners are pivotal to evolving the private MEC solution to meet the needs of our partners and enterprises.

It’s easy to get started with Azure private MEC


As innovative use cases for private wireless networks continue to develop and Industry 4.0 transformation accelerates, we welcome ISVs, platform partners, operators, and SIs to learn more about Azure private MEC.

Application ISVs interested in deploying their industry or horizontal solutions on Azure should begin by onboarding their applications to Azure Marketplace.

Platform partners, operators, and SIs interested in partnering with Microsoft to deploy or integrate with private MEC can get started by reaching out to the Azure private MEC Team.

Source: microsoft.com