
Saturday, 24 August 2024

Announcing mandatory multi-factor authentication for Azure sign-in


As cyberattacks become increasingly frequent, sophisticated, and damaging, safeguarding your digital assets has never been more critical. As part of Microsoft’s $20 billion investment in security over the next five years and our commitment to enhancing security in our services in 2024, we are introducing mandatory multifactor authentication (MFA) for all Azure sign-ins.

The need for enhanced security


One of the pillars of Microsoft’s Secure Future Initiative (SFI) is dedicated to protecting identities and secrets—we want to reduce the risk of unauthorized access by implementing and enforcing best-in-class standards across all identity and secrets infrastructure, and user and application authentication and authorization. As part of this important priority, we are taking the following actions:

  • Protect identity infrastructure signing and platform keys with rapid and automatic rotation with hardware storage and protection (for example, hardware security module (HSM) and confidential compute).
  • Strengthen identity standards and drive their adoption through use of standard SDKs across 100% of applications.
  • Ensure 100% of user accounts are protected with securely managed, phishing-resistant multifactor authentication.
  • Ensure 100% of applications are protected with system-managed credentials (for example, Managed Identity and Managed Certificates).
  • Ensure 100% of identity tokens are protected with stateful and durable validation.
  • Adopt more fine-grained partitioning of identity signing keys and platform keys.
  • Ensure identity and public key infrastructure (PKI) systems are ready for a post-quantum cryptography world.

Ensuring Azure accounts are protected with securely managed, phishing-resistant multifactor authentication is a key action we are taking. Recent Microsoft research shows that MFA can block more than 99.2% of account compromise attacks, making it one of the most effective security measures available, and today’s announcement brings us all one step closer to a more secure future.

In May 2024, we talked about implementing automatic enforcement of multifactor authentication by default across more than one million Microsoft Entra ID tenants within Microsoft, including tenants for development, testing, demos, and production. We are extending this best practice of enforcing MFA to our customers by making it required to access Azure. In doing so, we will not only reduce the risk of account compromise and data breach for our customers, but also help organizations comply with several security standards and regulations, such as Payment Card Industry Data Security Standard (PCI DSS), Health Insurance Portability and Accountability Act (HIPAA), General Data Protection Regulation (GDPR), and National Institute of Standards and Technology (NIST).

Preparing for mandatory Azure MFA


Required MFA for all Azure users will be rolled out in phases starting in the second half of calendar year 2024, giving our customers time to plan their implementation:

  • Phase 1: Starting in October, MFA will be required to sign in to the Azure portal, Microsoft Entra admin center, and Intune admin center. The enforcement will gradually roll out to all tenants worldwide. This phase will not impact other Azure clients such as the Azure Command-Line Interface (CLI), Azure PowerShell, the Azure mobile app, and Infrastructure as Code (IaC) tools.
  • Phase 2: Beginning in early 2025, gradual enforcement for MFA at sign-in for Azure CLI, Azure PowerShell, Azure mobile app, and Infrastructure as Code (IaC) tools will commence.

Beginning today, Microsoft will send a 60-day advance notice to all Entra global admins by email and through Azure Service Health Notifications, communicating the enforcement start date and the actions required. Additional notifications will be sent through the Azure portal, the Entra admin center, and the M365 message center.

For customers who need additional time to prepare for mandatory Azure MFA, Microsoft will review extended timeframes for customers with complex environments or technical barriers.

How to use Microsoft Entra for flexible MFA


Organizations have multiple ways to enable their users to utilize MFA through Microsoft Entra:

  • Microsoft Authenticator allows users to approve sign-ins from a mobile app using push notifications, biometrics, or one-time passcodes. Augment or replace passwords with two-step verification and boost the security of your accounts from your mobile device.
  • FIDO2 security keys provide access by signing in without a username or password using an external USB, near-field communication (NFC), or other external security key that supports Fast Identity Online (FIDO) standards in place of a password.
  • Certificate-based authentication enforces phishing-resistant MFA using personal identity verification (PIV) and common access card (CAC). Authenticate using X.509 certificates on smart cards or devices directly against Microsoft Entra ID for browser and application sign-in.
  • Passkeys allow for phishing-resistant authentication using Microsoft Authenticator.
  • Finally, and this is the least secure version of MFA, you can also use SMS or voice approval as described in this documentation.

External multifactor authentication solutions and federated identity providers will continue to be supported and will meet the MFA requirement if they are configured to send an MFA claim.
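
For scripts and automation affected by Phase 2, the interactive credential flows in the Azure SDKs already satisfy MFA. Below is a minimal Python sketch, assuming the azure-identity and azure-mgmt-resource packages and a placeholder subscription ID, in which DeviceCodeCredential walks the user through an MFA-capable sign-in before any management call is made:

from azure.identity import DeviceCodeCredential
from azure.mgmt.resource import ResourceManagementClient

# DeviceCodeCredential prints a verification URL and one-time code; completing
# that browser sign-in satisfies any MFA requirement on the account.
credential = DeviceCodeCredential()

# Subsequent management-plane calls run under the MFA-verified token.
client = ResourceManagementClient(credential, "<subscription-id>")
for group in client.resource_groups.list():
    print(group.name)

Fully unattended workloads should instead use workload identities such as managed identities or service principals, which are not user accounts and are therefore not subject to this user MFA requirement.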

Moving forward


At Microsoft, your security is our top priority. By enforcing MFA for Azure sign-ins, we aim to provide you with the best protection against cyber threats. We appreciate your cooperation and commitment to enhancing the security of your Azure resources.

Our goal is to deliver a low-friction experience for legitimate customers while ensuring robust security measures are in place. We encourage all customers to begin planning for compliance as soon as possible to avoid any business interruptions.

Source: azure.microsoft.com

Thursday, 22 August 2024

Announcing a new OpenAI feature for developers on Azure


We are thrilled to announce the launch of OpenAI’s latest model on Azure. This new model, officially named GPT-4o-2024-08-06, brings innovative features designed to elevate developer experiences on Azure. Specifically, the new model focuses on enhancing productivity through Structured Outputs, like JSON Schemas, for the new GPT-4o and GPT-4o mini models.

A focus on Structured Outputs


GPT-4o was first announced in May 2024 as OpenAI’s new multimodal model, followed by GPT-4o mini in July 2024. Today’s version is designed with a specific use case in mind: simplifying the process of generating well-defined, structured outputs from AI models. This is particularly valuable for developers, who often face challenges validating and formatting AI outputs into well-defined structures like JSON Schemas.

Structured Outputs addresses this by allowing developers to specify the desired output format directly from the AI model. This feature enables developers to define a JSON Schema for text outputs, simplifying the process of generating data payloads that can seamlessly integrate with other systems or enhance user experiences. 

Use cases for JSON


JSON Schema is essential for defining the structure and constraints of JSON documents, ensuring they follow specific formats with mandatory properties and value types. It enhances data understandability through semantic annotation and serves as a domain-specific language for optimized application requirements. Development teams use JSON Schema to maintain consistency across platforms, drive model-driven UI constraints, and automatically generate user interfaces. It aids in data serialization, security testing, and partial validation in technical scenarios. JSON Schema also supports automated testing, Schema inference, and machine-readable web profiles, improving data interoperability. It standardizes validation interfaces and reporting, handles external validation, and ensures data consistency within and across documents. It can also help teams respond to customer support requests in a consistent, timely format.
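
To make the validation use case concrete, here is a small illustration, with a hypothetical payload, using the Python jsonschema package:

from jsonschema import ValidationError, validate

schema = {
    "type": "object",
    "properties": {
        "responseText": {"type": "string"},
        "confidenceScore": {"type": "number"},
    },
    "required": ["responseText", "confidenceScore"],
}

payload = {"responseText": "Your ticket has been escalated.", "confidenceScore": 0.93}

try:
    validate(instance=payload, schema=schema)  # raises if the payload violates the schema
    print("payload conforms to the schema")
except ValidationError as err:
    print(f"schema violation: {err.message}")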

Two flavors of Structured Outputs


Structured Outputs is available in two forms: 

1. User-defined JSON Schema: This option allows developers to specify the exact JSON Schema they want the AI to follow, supported by both GPT-4o-2024-08-06 and GPT-4o-mini-2024-07-18.
2. More Accurate Tool Output (“Strict Mode”): This limited version lets developers define specific function signatures for tool use, supported by all models that support function calling, including GPT-3.5 Turbo, GPT-4, GPT-4 Turbo, and GPT-4o models from June 2023 onwards. 
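
As a hedged sketch of the second flavor, the following Python call defines a strict function signature through the openai package’s Azure client; the endpoint, API version, deployment name, and function name are placeholders for your environment:

import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-08-01-preview",  # assumed; use a version that supports Structured Outputs
)

tools = [{
    "type": "function",
    "function": {
        "name": "log_support_ticket",  # hypothetical function name
        "strict": True,  # strict mode: arguments must match the schema exactly
        "parameters": {
            "type": "object",
            "properties": {
                "intent": {"type": "string"},
                "priority": {"type": "string", "enum": ["low", "medium", "high"]},
            },
            "required": ["intent", "priority"],
            "additionalProperties": False,  # required by strict mode
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",  # your deployment name
    messages=[{"role": "user", "content": "My invoice is wrong, please help."}],
    tools=tools,
    tool_choice={"type": "function", "function": {"name": "log_support_ticket"}},
)
print(response.choices[0].message.tool_calls[0].function.arguments)

With strict mode, the returned arguments are guaranteed to parse against the declared parameters schema.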

Technical guidance on using Structured Outputs


To help you get started with Structured Outputs, we recommend the following approach. 

Getting started with Structured Outputs 

1. Define Your JSON Schema: Determine the structure you want your AI outputs to follow. This can include required fields, data types, and other constraints. 
2. Configure the AI model: Use the Structured Outputs feature to specify your JSON Schema within the API call. This ensures that the AI output adheres to your defined structure. 
3. Integration and testing: Integrate the output into your application or system, and test thoroughly to ensure compliance with your JSON Schema. 

Example use case: Customer support automation


Imagine you’re developing a customer support chatbot that needs to generate responses in a specific format for logging and analytics. By using Structured Outputs, you can define a JSON Schema that includes fields like responseText, intent, confidenceScore, and timestamp. This ensures that every response generated by the chatbot is formatted correctly, making it easier to log, analyze, and act upon. 

Example API call

Here’s an example API call to illustrate how to use Structured Outputs:

{
  "model": "gpt-4o-2024-08-06",
  "prompt": "Generate a customer support response",
  "structured_output": {
    "schema": {
      "type": "object",
      "properties": {
        "responseText": { "type": "string" },
        "intent": { "type": "string" },
        "confidenceScore": { "type": "number" },
        "timestamp": { "type": "string", "format": "date-time" }
      },
      "required": ["responseText", "intent", "confidenceScore", "timestamp"]
    }
  }
}
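
The same schema can be supplied through the openai Python package’s Azure client. This is a minimal sketch in which the endpoint, API version, and deployment name are assumptions about your environment; strict schemas also require additionalProperties to be false, and the date-time format keyword is omitted because strict mode supports only a subset of JSON Schema:

import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-08-01-preview",  # assumed; use a version that supports Structured Outputs
)

response = client.chat.completions.create(
    model="gpt-4o",  # your GPT-4o-2024-08-06 deployment name
    messages=[{"role": "user", "content": "Generate a customer support response"}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "support_response",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "responseText": {"type": "string"},
                    "intent": {"type": "string"},
                    "confidenceScore": {"type": "number"},
                    "timestamp": {"type": "string"},
                },
                "required": ["responseText", "intent", "confidenceScore", "timestamp"],
                "additionalProperties": False,
            },
        },
    },
)
print(response.choices[0].message.content)  # a JSON string matching the schema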

Pricing


We will make pricing for this feature available soon. Please bookmark the Azure OpenAI Service pricing page for updates.

Learn more about the future of AI


We’ve been rolling out several new models recently, and we understand it can be a lot to keep up with. This flurry of activity is all about empowering developer innovation. Each new model brings unique capabilities and enhancements, helping you build even more powerful and versatile applications. 

The launch of this new model feature for GPT-4o and GPT-4o mini marks a significant milestone in our ongoing efforts to push the boundaries of AI capabilities. We’re excited to see how developers will leverage these new features to create innovative and impactful applications.

Source: microsoft.com

Saturday, 9 March 2024

Tackle large volumes of data with new solution from SLB for Azure Data Manager for Energy


As the energy industry continues to innovate, companies find themselves balancing the ever-growing demand for energy with the desire to work toward more efficient, sustainable operations. Decision makers spread across the globe rely on accurate data to improve confidence and minimize risk, so harnessing the power of data has become a central tenet of energy companies’ success as they push to evolve. However, the different types of data and the variety of file types that energy companies manage daily make it difficult to access and analyze the data efficiently.

That is why energy companies around the world are using Microsoft technologies, including Azure Data Manager for Energy. This cloud-based OSDU® Data Platform maximizes customer performance by being truly interoperable across the exploration and production data landscape. It can integrate with industry datasets, applications, and other cloud services, which makes it easier for employees throughout an organization to access the data they need quickly and efficiently. These sources may include other data platforms for asset performance, the internet of things (IoT), or production, but also databases that may still be deployed on-premises before migration to the cloud can take place. Data silos slow productivity and complicate workflows; optimizing access to high-quality data is crucial for success.

The Microsoft partner ecosystem is a key component in how Microsoft delivers technology, services, and cloud-to-edge solutions for our customers. SLB—one of Microsoft’s strategic partners in the energy sector—designed the Enterprise Data Solution to work seamlessly with Azure Data Manager for Energy, easing data ingestion, management, and discoverability for domain applications.

Enterprise Data Solution from SLB facilitates day-to-day workflows


Designed to reduce the friction of dealing with large volumes of data traditionally stored in different silos, Enterprise Data Solution from SLB features a user-friendly, inclusive interface that streamlines data ingestion, curation, and accessibility. Enterprise Data Solution from SLB allows customers to unlock data with a single click instead of running multiple steps to get the same results, making it easier for a variety of end users to ingest different data types and formats and access data faster. In addition, curation along this pipeline is aided by several AI-enhanced tools, spanning streamlined unstructured data ingestion, data quality control, contextualization, and data mastering, reducing the need for manual intervention.


Both traditional and new energy technical workflows are optimized when data and application interoperability are achieved. For example, handling large volumes of seismic data (access, ingestion, and streaming) can be dramatically improved with Enterprise Data Solution from SLB. The typical cost for handling these workloads is also reduced significantly, speeding up workflows and allowing users to incorporate as much diverse domain data as possible. Geoscientists and engineers can run more scenarios in different applications, compare results with their colleagues, and ultimately liberate the most accurate data quickly.

“Having the Enterprise Data Solution from SLB working with ADME is transformative. Workflow efficiency is vastly increased, and decision-making accelerated to levels that are impossible to achieve without it. Pairing the two technologies enables AI to work at full effect in the energy sector. AI plays a dual role: it enhances data quality, accessibility, and utilization, which in turn enables it to drive innovation across the entire planning-to-operations life cycle.”

—Jamie Cruise, Product Leader, SLB

Key features of Enterprise Data Solution from SLB


The key features of Enterprise Data Solution from SLB include:

Ingest: Collect, ingest, and stage data easily from desktop or other file sources for document transcription and parsing. For example, you can use a cloud-style file drop for changes to files and get automated translation into the correct schema. This is increasingly aided by AI to reduce human workload requirements.

Curate: Data quality is at the heart of many of the challenges in the industry and causes a lot of wasted time. The Enterprise Data Solution from SLB accelerates curation and refining of datasets into trusted data products, liberating them with streamlined tools for data quality so they’re ready for consumption. These tools include AI for quality control and mastering.

Discover: Ready-to-use data products enable quick data-asset analysis and screening online for quick-to-create collections and consumption from domain workflows. Locating and accessing quality data is critical for decision-making. Enterprise Data Solution from SLB enables workflows to access the right data for stakeholders to easily discover, visualize, and use.

Consume: The Enterprise Data Solution from SLB makes it easy for analysts and others to access trusted data products from within their familiar applications and workflows, whether that is the Delfi™ digital platform, Petrel™ subsurface software, or Techlog™ wellbore software. With an intuitive user interface that’s accessible from a web browser, full data lineage, and project versioning are always available in an optimized format. Data can be shared and consumed in workflows such as seismic processing, machine learning seismic interpretation, and machine learning property modeling. In addition, this data is now easily consumable in machine learning workbenches such as Delfi Data Science or tooling available in Microsoft Azure. This cuts time to decision and action, critical components for a smooth production workflow.

Unlock downstream advantages within the Microsoft ecosystem


Once data is in Azure Data Manager for Energy, it conforms to the OSDU® Technical Standard and is a reliable platform for other applications to consume and re-ingest data iteratively. The interoperability of Azure Data Manager for Energy improves accessibility, traceability, and validity of data, allowing domain users and data scientists to deliver business outcomes faster.

Enterprise Data Solution from SLB helps customers take full advantage of Azure Data Manager for Energy by making it more efficient to ingest large amounts of quality, trusted, and traceable data into the platform. Ultimately, Azure Data Manager for Energy’s interoperability empowers customers by harmonizing data that can be leveraged across the Microsoft ecosystem.

Once this data is in the platform, there are many opportunities to take advantage of Microsoft Azure OpenAI Service to drive additional insights and efficiencies further downstream in Microsoft 365 applications. For example, end users can extract tables directly from ingested documents and generate Open XML documents that are ready to use in Microsoft Excel, where they can be more easily visualized.
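
As an illustrative sketch of that last step, with hypothetical extracted rows, the openpyxl Python package can write such a table as an Open XML workbook that opens directly in Excel:

from openpyxl import Workbook

rows = [
    ("Well", "Depth (m)", "Status"),  # header row extracted from a document
    ("A-12", 2450, "Producing"),
    ("B-07", 3120, "Shut-in"),
]

wb = Workbook()
ws = wb.active
ws.title = "Extracted table"
for row in rows:
    ws.append(row)  # one extracted row per worksheet row

wb.save("extracted_table.xlsx")  # Open XML format, ready for Excel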

Source: microsoft.com

Thursday, 29 February 2024

Azure Decoded: Insider Tips for Optimal Performance


Understanding Azure Architecture


Azure, Microsoft's cloud computing platform, offers a plethora of services to businesses, ranging from virtual machines to AI-powered analytics. To unlock its full potential, one must grasp its architecture intricately. At its core, Azure operates on a global network of data centers, ensuring low-latency access worldwide. Understanding this architecture is paramount for optimizing performance.

Leveraging Azure's Scalability


Scalability is one of Azure's most compelling features. With Azure, businesses can effortlessly scale resources up or down based on demand. Utilizing Azure Autoscale, businesses can automate this process, ensuring optimal resource allocation at all times. By leveraging Azure's scalability, businesses can handle fluctuations in traffic seamlessly, without compromising performance.

Enhancing Security with Azure


Security is a top priority for businesses migrating to the cloud. Azure offers a robust set of security tools and features to safeguard data and applications. Azure Security Center provides advanced threat detection and security recommendations, ensuring comprehensive protection against cyber threats. Additionally, Azure Active Directory enables businesses to manage user identities and access effectively, enhancing overall security posture.

Optimizing Performance with Azure Monitoring


Monitoring is crucial for maintaining optimal performance in Azure. Azure Monitor provides comprehensive insights into the performance and health of applications and infrastructure. By leveraging metrics, logs, and alerts, businesses can identify performance bottlenecks and optimize resource utilization effectively. Coupled with Azure Application Insights, businesses gain deep visibility into application performance, enabling proactive optimization strategies.
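
As a hedged sketch of the kind of metric retrieval described here, with a placeholder resource URI, the azure-monitor-query Python package can pull a platform metric for analysis:

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient

client = MetricsQueryClient(DefaultAzureCredential())

resource_uri = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Compute/virtualMachines/<vm-name>"
)

# Average CPU over the last hour, in 5-minute buckets.
result = client.query_resource(
    resource_uri,
    metric_names=["Percentage CPU"],
    timespan=timedelta(hours=1),
    granularity=timedelta(minutes=5),
    aggregations=["Average"],
)

for metric in result.metrics:
    for series in metric.timeseries:
        for point in series.data:
            print(point.timestamp, point.average)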

Maximizing Efficiency with Azure Cost Management


Cost management is a key consideration for businesses operating in the cloud. Azure offers robust tools for optimizing costs and maximizing efficiency. Azure Cost Management + Billing provides detailed insights into resource usage and spending patterns, allowing businesses to identify cost-saving opportunities. By implementing budget alerts and resource tagging, businesses can effectively manage costs and allocate resources judiciously.
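
For example, applying cost-allocation tags so spend can be grouped by team or environment is a single call; here is a minimal sketch with the azure-mgmt-resource Python package, where the subscription ID, group name, and tag values are placeholders:

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Tags flow through to cost reports, letting spend be grouped per tag value.
client.resource_groups.create_or_update(
    "rg-web-prod",
    {
        "location": "eastus",
        "tags": {"costCenter": "CC-1042", "environment": "production"},
    },
)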

Accelerating Development with Azure DevOps


In today's fast-paced digital landscape, rapid software development is essential for staying ahead of the competition. Azure DevOps streamlines the development process, enabling teams to collaborate seamlessly and deliver high-quality software faster. With features like Azure Pipelines for continuous integration and deployment, businesses can automate the software delivery lifecycle, reducing manual overhead and accelerating time to market.

Conclusion

Azure offers a myriad of capabilities to businesses seeking to harness the power of the cloud. By understanding Azure's architecture, leveraging its scalability, enhancing security measures, optimizing performance monitoring, maximizing cost efficiency, and accelerating development with Azure DevOps, businesses can unlock unparalleled performance and agility in the cloud.

Thursday, 14 December 2023

Microsoft and Oracle announce that Oracle Database@Azure is now generally available


The Microsoft and Oracle partnership is focused on giving customers choice and removing the hurdles faced when migrating mission-critical workloads to the public cloud where they can access the rich set of technology needed to accelerate innovation and compete more effectively. In September, Satya Nadella and Larry Ellison announced Oracle Database@Azure, a new offering that delivers Oracle database services running on Oracle Cloud Infrastructure (OCI) inside Azure datacenters and gives customers more flexibility in where they run their workloads.

We are excited to share that Oracle Database@Azure is generally available in the Azure East United States region, with more regions coming in 2024. You can now use Oracle database services running on OCI hardware deployed in Azure data centers. This streamlines the migration of Oracle database workloads to Azure, where you can integrate and innovate with the breadth of Microsoft Cloud services like Azure AI and our application platform and developer services. You’ll be able to purchase this Oracle-managed service, running on OCI inside Azure datacenters, from the Azure Marketplace with the same features and pricing as Oracle Exadata cloud service on OCI. And, you can use your existing Azure and Oracle commitments, program benefits, and licenses to do so.

Exadata Database Service is the first service to be available along with support for Oracle Real Application Clusters (RAC), Oracle GoldenGate, and Oracle Data Guard technologies. We plan to offer Autonomous Database Service in the near future. Microsoft Azure is the only hyperscaler other than OCI to offer these Oracle database services.

Many customers have shown strong interest in the product, and the Oracle and Microsoft product teams are working diligently to expand the availability of more regions in 2024, beginning with Germany Central, Australia East, France Central, Canada Central, Brazil South, Japan East, United Kingdom South, Central United States, and South Central United States. We will continue to add more regions based on customer demand, and we encourage customers to share their requirements for additional regions with their Oracle or Microsoft account teams.

The benefits of Oracle databases in Azure


Oracle Database@Azure provides a host of benefits:

  • Optimized performance and latency: Oracle database services in Azure are designed to offer high levels of performance, scale, security, and availability. By running OCI databases in Azure datacenters alongside other services in Azure, you can optimize latency and performance.
  • Streamlined migration: You can use familiar and proven Oracle solutions like Oracle Zero Downtime Migration (ZDM) to migrate Oracle databases as is with no need for refactoring applications or databases.
  • Application modernization: You can use the rich set of application services, development tools, and frameworks available in Azure to modernize your workloads and run cloud native applications with direct access to data in Oracle Database@Azure, with the same security and compliance across the entire solution.
  • Data and AI innovation: You can unlock data insights and innovation when you apply Azure data analytics and AI services across your Oracle and non-Oracle data and applications, and handle even mission-critical applications.
  • Simplified operations: You can deploy and manage Oracle Database@Azure infrastructure through the Azure portal and use native Azure tools to view metrics, events, and logs for all Oracle Database@Azure databases. You can also do database provisioning and management via OCI interfaces along with access to familiar tooling like Data Guard, GoldenGate, and others.
  • Simplified purchasing: You can purchase Oracle Database@Azure through the Azure Marketplace and take advantage of existing Azure and Oracle commitments and licenses. You’ll also be eligible for Oracle Support Rewards, a program where you can earn rewards that you can use to reduce your Oracle technology license support bill.
  • Unified customer support: You can contact either your Oracle or Microsoft support team for a unified support process.  

Bringing the power of Oracle Exadata to Azure customers


Leading organizations in every industry have long relied on Oracle Exadata and Oracle Real Application Clusters (RAC) to make the most of their data and to power mission critical applications. Oracle Database@Azure customers will have access to Oracle RAC for high availability running on OCI hardware. The service will be deployable in multiple availability zones to ensure regional high availability and in cross-region pairs to support cross-geography disaster recovery scenarios.

Trusted migration and resilience


Migrating and deploying production-grade environments is straightforward when moving to Oracle Database@Azure. Oracle provides proven database migration strategies, including automated migration solutions like Zero-Downtime Migration (ZDM). Oracle Database@Azure also supports the Oracle Maximum Availability Architecture (MAA), which provides trusted high-availability architectures and configurations to meet a range of resiliency requirements.

Source: microsoft.com

Saturday, 2 December 2023

Building resilience to your business requirements with Azure


At Microsoft, we understand the trust customers put in us by running their most critical workloads on Microsoft Azure. Whether they are retailers with their online stores, healthcare providers running vital services, financial institutions processing essential transactions, or technology partners offering their solutions to other enterprise customers—any downtime or impact could lead to business loss, social services interruptions, and events that could damage their reputation and affect the end-user confidence. In this blog post, we will discuss some of the design principles and characteristics that we see among the customer leaders we work with closely to enhance their critical workload availability according to their specific business needs.

A commitment to reliability with Azure


As we continue making investments that drive platform reliability and quality, there remains a need for customers to evaluate their technical and business requirements against the options Azure provides to meet availability goals through architecture and configuration. These processes, along with support from Microsoft technical teams, ensure you are prepared and ready in the event of an incident. As part of the shared responsibility model, Azure offers customers various options to enhance reliability. These options involve choices and tradeoffs, such as possible higher operational and consumption costs. You can use the flexibility of cloud services to enable or disable some of these features if your needs change. In addition to technical configuration, it is essential to regularly check your team’s technical and process readiness.

“We serve customers of all sizes in an effort to maximize their return on investment, while offering support on their migration and innovation journey. After a major incident, we participated in executive discussions with customers to provide clear contextual explanations as to the cause and reassurances on actions to prevent similar issues. As product quality, stability, and support experience are important focus areas, a common outcome of these conversations is an enhancement of cooperation between customer and cloud provider for the possibility of future incidents. I’ve asked Director of Executive Customer Engagement, Bryan Tang, from the Customer Support and Service team to share more about the types of support you should seek from your technical Microsoft team & partners.”—Mark Russinovich, CTO, Azure.

Design principles


Key elements of building a reliable workload begin with establishing an agreed availability target with your business stakeholders, as that target will influence your design and configuration choices. As you continue to measure uptime against your baseline, it is critical to be ready to adopt any new services or features that can benefit your workload availability, given the pace of cloud innovation. Finally, adopt a continuous validation approach to verify that your system behaves as designed when incidents occur and to identify weak points early, and ensure your team is ready to partner with Microsoft during major incidents to minimize business disruption. We will go into more detail on these design principles:

  • Know and measure against your targets
  • Continuously assess and optimize
  • Test, simulate, and be ready

Know and measure against your targets


Azure customers may have outdated availability targets, or workloads that don’t have targets defined with business stakeholders. To cover the targets mentioned more extensively, you can refer to the business metrics to design resilient Azure applications guide. Application owners should revisit their availability targets with respective business stakeholders to confirm those targets, then assess if their current Azure architecture is designed to support such metrics, including SLA, Recovery Time Objective (RTO), and Recovery Point Objective (RPO). Different Azure services, along with different configurations or SKU levels, carry different SLAs. You need to ensure that your design does, at a minimum, reflect: 

- Defined SLA versus Composite SLA: Your workload architecture is a collection of Azure services. You can run your entire workload on infrastructure as a service (IaaS) virtual machines (VMs) with Storage and Networking across all tiers and microservices, or you can mix in PaaS offerings such as Azure App Service and Azure Database for PostgreSQL; each provides a different SLA depending on the SKUs and configurations you select. When we asked customers about their SLA while assessing their workload architecture, we found that some had no SLA, some had an outdated SLA, and some had unrealistic SLAs. The key is to get a confirmed SLA from your business owners and calculate the Composite SLA based on your workload resources, as in the worked example below. This shows you how well you meet your business availability objectives.
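
As a worked example with illustrative SLA figures (placeholders, not quotes from Azure SLA documents), a Composite SLA for serially dependent services is the product of the individual SLAs:

# Illustrative SLA figures only; substitute the SLAs for your actual SKUs.
slas = {
    "Azure App Service": 0.9995,
    "Azure Database for PostgreSQL": 0.9999,
    "Azure Cache for Redis": 0.999,
}

composite = 1.0
for service, sla in slas.items():
    composite *= sla  # every dependency must be up for the workload to be up

print(f"Composite SLA: {composite:.4%}")  # -> 99.8401%

The composite figure is lower than any single component’s SLA, which is why confirming the business target before choosing SKUs matters.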

Continuously assess options and be ready to optimize


One of the most significant drivers for cloud migration is the financial benefit, such as shifting from capital expenditure to operating expenditure and taking advantage of the economies of scale at which cloud providers operate. However, one often-overlooked benefit is our continued investment and innovation in the newest hardware, services, and features.

Many customers have moved their workloads from on-premises to Azure in a quick and simple way, replicating their on-premises workload architecture in Azure without using the extra options and features Azure offers to improve availability and performance. We also see customers treating their cloud architecture as pets rather than cattle, instead of seeing resources as components that work together and can be swapped for better options when they become available. We fully understand customer preference, habit, and perhaps concerns about black-box services as opposed to managing your own VMs, where you perform your own maintenance and security scans. However, our ongoing innovation and commitment to providing platform as a service (PaaS) and software as a service (SaaS) give you opportunities to focus your limited resources and effort on the functions that make your business stand out.

◉ Architecture reliability recommendations and adoption:

- We make every effort to ensure you have the most specific and latest recommendations through various channels. Our flagship channel is Azure Advisor, which now also supports the Reliability Workbook, and we partner closely with engineering so that any additional recommendations that take time to reach the workbook and Azure Advisor are available for your consideration through the Azure Proactive Resiliency Library (APRL). Together, these provide a comprehensive list of documented recommendations for the Azure services you use.

◉ Security and data resilience:

- While the previous point focuses on configurations and options for the Azure components that make up your application architecture, it is just as critical to ensure your most critical asset, your data, is protected and replicated. Architecture gives you a solid foundation to withstand cloud service-level failures, but it is equally important to protect your data and resources from accidental or malicious deletion. Azure offers options such as Resource Locks and soft delete on your storage accounts. Your architecture is only as solid as the security and identity access management applied to it as an overall protection.

◉ Assess your options and adopt:

- While there are many recommendations that can be made, implementation ultimately remains your decision. Understandably, changing your architecture is not just a matter of modifying your deployment template: you want to ensure your test cases are comprehensive, and the change may involve time, effort, and cost to run your workloads. Our field is prepared to help you explore options and tradeoffs, but the decision to enhance availability to meet the business requirements of your stakeholders is ultimately yours. This mentality toward change is not limited to reliability; it extends to other aspects of the Well-Architected Framework, such as Cost Optimization.

Test, simulate, and be ready


Testing is a continuous process at both the technical and the process level, with automation as a key component. In addition to paper-based exercises that ensure you select the right SKUs and configurations of cloud resources to achieve the right Composite SLA, applying Chaos Engineering to your testing helps you find weaknesses and verify readiness. It is critical to monitor your application so you can detect disruptions and react quickly to recover, and finally, knowing how to engage Microsoft support effectively, when needed, helps set proper expectations for your stakeholders and end users in the event of an incident.

◉ Continuous validation through Chaos Engineering: When operating a distributed application with microservices and dependencies between centralized services and workloads, a chaos mindset inspires confidence in your resilient architecture design by proactively finding weak points and validating your mitigation strategy. For customers striving for DevOps success through automation, continuous validation (CV) has become a critical component of reliability, alongside continuous integration (CI) and continuous delivery (CD). Simulating failure also helps you understand how your application behaves under partial failure, how your design responds to infrastructure issues, and the overall level of impact on end users. Azure Chaos Studio is now generally available to assist you further with this ongoing validation.

◉ Detect and react: Ensure your workload is monitored at both the application and component level for a comprehensive health view. For instance, Azure Monitor helps you collect, analyze, and respond to monitoring data from your cloud and on-premises environments. Azure also offers a suite of experiences to keep you informed about the health of your cloud resources: Azure Status informs you of Azure service outages, Service Health provides service-impacting communications such as planned maintenance, and Resource Health reports on the health of individual resources such as a VM.

◉ Incident response plan: Partner closely with our technical support teams to jointly develop an incident response plan. The plan is essential to developing shared accountability between you and Microsoft as we work toward resolution of your incident, covering the basics of who does what, and when, so both sides can partner on a quick resolution. Our teams are ready to run test drills with you as well to validate this response plan for our joint success.

Ultimately, your desired reliability is an outcome you can achieve only by taking all of these approaches into account and maintaining a mindset of continuous optimization. Building application resilience is not a single feature or phase, but a muscle that your teams will build, learn, and strengthen over time.

Source: microsoft.com

Saturday, 18 November 2023

Azure sets a scale record in large language model training


Azure empowers intelligent services like Microsoft Copilot, Bing, and Azure OpenAI Service that have captured our imagination in recent times. These services, which bring generative AI to applications like Microsoft Office 365, chatbots, and search engines, owe their magic to large language models (LLMs). While the latest LLMs are transcendental, bringing a generational change in how we apply artificial intelligence in our daily lives and reason about its evolution, we have merely scratched the surface. Creating more capable, fair, foundational LLMs that consume and present information more accurately is necessary.

How Microsoft maximizes the power of LLMs


However, creating new LLMs or improving the accuracy of existing ones is no easy feat. Creating and training improved versions of LLMs requires supercomputers with massive computational capabilities, and it is paramount that both the hardware and the software in these supercomputers are utilized efficiently at scale, leaving no performance on the table. This is where the sheer scale of the supercomputing infrastructure in the Azure cloud shines, and why setting a new scale record in LLM training matters.

Figure 1: Scale records on the model GPT-3 (175 billion parameters) from MLPerf Training v3.0 in June 2023 (3.0-2003) and Azure on MLPerf Training v3.1 in November 2023 (3.1-2002). 

Customers need reliable and performant infrastructure to bring the most sophisticated AI use cases to market in record time. Our objective is to build state-of-the-art infrastructure and meet these demands. The latest MLPerf™ 3.1 Training results are a testament to our unwavering commitment to building high-quality and high-performance systems in the cloud to achieve unparalleled efficiency in training LLMs at scale. The idea here is to use massive workloads to stress every component of the system and accelerate our build process to achieve high quality.

The GPT-3 LLM model and its 175 billion parameters were trained to completion in four minutes on 1,344 ND H100 v5 virtual machines (VMs), which represent 10,752 NVIDIA H100 Tensor Core GPUs, connected by the NVIDIA Quantum-2 InfiniBand networking platform (as shown in Figure 1). This training workload uses close-to-real-world datasets and restarts from 2.4 terabytes of checkpoints, closely mirroring a production LLM training scenario. The workload stresses the H100 GPUs’ Tensor Cores, direct-attached Non-Volatile Memory Express disks, and the NVLink interconnect that provides fast communication to the high-bandwidth memory in the GPUs, as well as the cross-node 400 Gb/s InfiniBand fabric.

“Azure’s submission, the largest in the history of MLPerf Training, demonstrates the extraordinary progress we have made in optimizing the scale of training. MLCommons’ benchmarks showcase the prowess of modern AI infrastructure and software, underlining the continuous advancements that have been achieved, ultimately propelling us toward even more powerful and efficient AI systems.”—David Kanter, Executive Director of MLCommons 

Microsoft’s commitment to performance


In March 2023, Microsoft introduced the ND H100 v5-series which completed training a 350 million parameter Bidirectional Encoder Representations from Transformers (BERT) language model in 5.4 minutes, beating our existing record. This resulted in a four times improvement in time to train BERT within just 18 months, highlighting our continuous endeavor to bring the best performance to our users.

Figure 2: Relative size of the models BERT (350 million parameters) and GPT-3 (175 billion parameters) from MLPerf Training v3.1.  

Today’s results are with GPT-3, a large language model in the MLPerf Training benchmarking suite, featuring 175 billion parameters, a remarkable 500 times larger than the previously benchmarked BERT model (figure 2). The latest training time from Azure reached a 2.7x improvement compared to the previous record from MLPerf Training v3.0. The v3.1 submission underscores the ability to decrease training time and cost by optimizing a model that accurately represents current AI workloads.

The power of virtualization


NVIDIA’s submission to the MLPerf Training v3.1 LLM benchmark on 10,752 NVIDIA H100 Tensor Core GPUs achieved a training time of 3.92 minutes. This amounts to just a 2 percent increase in training time on Azure VMs compared to the NVIDIA bare-metal submission, demonstrating best-in-class virtual machine performance across all HPC instance offerings in the cloud (figure 3).

Figure 3: Relative training times on the model GPT-3 (175 billion parameters) from MLPerf Training v3.1 between the NVIDIA submission on the bare-metal platform (3.1-2007) and Azure on virtual machines (3.1-2002). 
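
The roughly 2 percent overhead follows directly from the two published training times; a quick check:

# Reproducing the virtualization-overhead arithmetic from the results above.
bare_metal_minutes = 3.92  # NVIDIA bare-metal submission (3.1-2007)
azure_vm_minutes = 4.0     # Azure ND H100 v5 virtual machines (3.1-2002)

overhead = azure_vm_minutes / bare_metal_minutes - 1
print(f"Virtualization overhead: {overhead:.1%}")  # -> about 2.0%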

The latest AI inferencing results on Azure ND H100 v5 VMs show leadership as well, as reported in MLPerf Inference v3.1. The ND H100 v5-series delivered 0.99x-1.05x relative performance compared to the bare-metal submissions on the same NVIDIA H100 Tensor Core GPUs (figure 4), echoing the efficiency of virtual machines.

Figure 4: Performance of the ND H100 v5-series (3.1-0003) compared to on-premises and bare metal offerings of the same NVIDIA H100 Tensor Core GPUs (3.1-0107 and 3.1-0121). All the results were obtained with the GPT-J benchmark from MLPerf Inference v3.1, scenarios: Offline and Server, accuracy: 99 percent.

In conclusion, created for performance, scalability, and adaptability, the Azure ND H100 v5-series offers exceptional throughput and minimal latency for both training and inferencing tasks in the cloud and offers the highest quality infrastructure for AI.

Source: microsoft.com

Saturday, 7 October 2023

Microsoft Azure achieves HITRUST CSF v11 certification


The healthcare industry is undergoing a rapid transformation, driven by the increasing need for cloud computing to improve patient outcomes, capture cost efficiencies, and make it easier to coordinate care, especially for patients in remote areas. Cloud computing enables healthcare organizations to leverage advanced technologies such as artificial intelligence, machine learning, big data analytics, and Internet of Things to enhance their services and operations. However, cloud computing also brings new challenges and risks for securing and protecting sensitive healthcare data, such as electronic health records, medical images, genomic data, and personal health information. Healthcare organizations need to ensure that their cloud service providers meet the highest standards of security and compliance, as well as adhere to the complex and evolving regulations and frameworks that govern the healthcare industry.

Microsoft Azure committed to security and compliance in the healthcare industry


One of the most widely adopted and recognized frameworks for information protection in the healthcare industry is the HITRUST Common Security Framework (CSF). The HITRUST CSF is a comprehensive and scalable framework that integrates multiple authoritative sources, such as HIPAA, NIST, ISO, PCI, and COBIT, into a single set of harmonized controls. The HITRUST CSF provides a prescriptive and flexible approach for assessing and certifying the security and compliance posture of cloud service providers and their customers. Achieving HITRUST CSF certification demonstrates that a cloud service provider has implemented the best practices and controls to safeguard sensitive healthcare data in the cloud.

As healthcare organizations converge on the Dallas area for the HITRUST Collaborate 2023 event, providing secure and compliant cloud services for the healthcare industry is more important than ever. Microsoft Azure is committed to being a trusted partner for healthcare organizations in their digital transformation journey. Azure provides a comprehensive portfolio of cloud services that enable healthcare organizations to build innovative solutions that improve the entire healthcare experience. Azure also offers a range of capabilities that make it easier for healthcare organizations to achieve and maintain security and compliance in the cloud.

We are therefore proud to announce that Microsoft Azure has achieved HITRUST CSF v11.0.1 certification across 162 Azure services and 115 Azure Government services. All GA Azure regions across Azure and Azure Government clouds are included within this certification. This achievement reflects the continuous efforts by Azure to enhance its security and compliance offerings for customers in the healthcare industry.

HITRUST CSF v11.0.1 is the latest version of the framework and incorporates new requirements and updates from various authoritative sources, such as NIST SP 800-53 Rev 5, NIST Cybersecurity Framework v1.1, PCI DSS v3.2.1, FedRAMP High Baseline Rev 5, CSA CCM v3.0.1, GDPR, and CCPA. HITRUST CSF v11.0.1 also introduces new features and enhancements, such as a maturity scoring model, risk factor analysis, an expanded inheritance program, an improved assessment scoping tool, and more. Achieving HITRUST CSF v11.0.1 certification demonstrates the increasing commitment Azure has to providing secure and compliant cloud services for customers in the healthcare industry.

The HITRUST CSF v11.0.1 r2 Validated Assessment for Azure was performed by an independent third-party audit firm licensed under the HITRUST External Assessor program. The audit firm evaluated Azure for security policies, procedures, processes, and controls against the HITRUST CSF requirements applicable to cloud service providers. The audit firm also verified that security controls for Azure are implemented effectively and operate as intended. Azure customers can obtain the HITRUST CSF Letter of Certification, which contains the full scope of certified Azure offerings and regions, at the Service Trust Portal.

Microsoft Azure partners with HITRUST Alliance


In addition to today’s certification, Azure has also partnered in the past with HITRUST Alliance to release the HITRUST Shared Responsibility Matrix for Azure, which provides clarity around security and privacy responsibilities between Azure and its customers, making it easier for organizations to achieve their own HITRUST CSF certification. The matrix outlines which HITRUST CSF controls are fully managed by Azure, which are shared between Azure and customers, and which are solely the customers’ responsibility. The matrix also provides guidance on how customers can leverage the capabilities in Azure to meet their own security and compliance obligations.

Azure also supports the HITRUST Inheritance Program, which significantly reduces compliance cost and burden by enabling customers to externally inherit requirements from the Azure HITRUST CSF certification. The program allows customers to inherit up to 75 percent of applicable HITRUST CSF controls from the Azure certification scope without additional testing or validation by an external assessor. This reduces the time, effort, and resources required for customers to obtain their own HITRUST CSF certification or report on their compliance status using other frameworks or standards based on the HITRUST CSF. Azure has reviewed over 23,450 inheritance requests from customers since the program’s inception.

Azure has maintained the HITRUST CSF certification since November 2016. Azure was one of the first cloud service providers to achieve HITRUST CSF certification and has been continuously expanding its scope of certified services and regions. Azure is also one of the few cloud service providers that offer HITRUST CSF certified services in both public and government clouds. The Azure HITRUST CSF v11.0.1 certification is backward compatible with HITRUST CSF v9.1, v9.2, v9.3, v9.4, v9.5, and v9.6 certifications, offering support to a wide range of customers.

Azure HITRUST CSF certification


Azure is dedicated to helping healthcare organizations accelerate their digital transformation while ensuring security and compliance in the cloud. Azure provides a secure and compliant cloud platform that enables healthcare organizations to build innovative solutions that improve patient care, operational efficiency, and business agility. Azure also offers a variety of tools and resources that make it easier for healthcare organizations to achieve and maintain security and compliance in the cloud. The Azure HITRUST CSF certification is a testament to the commitment Azure has to be a trusted partner for healthcare organizations in their cloud journey.

Source: microsoft.com

Tuesday, 12 September 2023

Microsoft Azure Certifications: Your Path to Lucrative Tech Careers


In today's rapidly evolving digital landscape, the demand for skilled IT professionals is skyrocketing. Businesses across the globe are increasingly relying on cloud technologies to streamline their operations, making cloud expertise a hot commodity in the job market. Among the major players in the cloud computing sphere, Microsoft Azure stands out as a leader, offering a range of certifications that can pave the way to lucrative tech careers. In this comprehensive guide, we will delve into the world of Microsoft Azure certifications and explore how they can open doors to exciting opportunities in the tech industry.

Why Microsoft Azure?


Before we dive into the specifics of Microsoft Azure certifications, let's take a moment to understand why Azure is such a dominant force in cloud computing. Microsoft Azure is a cloud platform that provides a wide array of services, including computing, networking, storage, databases, and more. What sets Azure apart is its scalability, security, and integration with other Microsoft products such as Windows Server and SQL Server. This robust ecosystem makes it a preferred choice for businesses of all sizes.

The Azure Certification Pathway


Microsoft offers a well-structured certification pathway that caters to individuals with varying levels of expertise, from beginners to seasoned professionals. These certifications are categorized into three main levels:

1. Fundamental Certifications

These certifications are designed for those who are new to Azure and want to build a solid foundation in cloud technology.

Microsoft Certified: Azure Fundamentals: This entry-level certification is ideal for individuals who want to understand the basics of Azure, including its services, solutions, and pricing.

2. Associate Certifications

Associate-level certifications are tailored for professionals with some experience in Azure and are ready to take their skills to the next level.

Microsoft Certified: Azure Administrator Associate: This certification validates your ability to manage Azure resources, configure virtual networks, and ensure security and compliance within the Azure environment.

Microsoft Certified: Azure Developer Associate: For developers, this certification focuses on designing, building, testing, and maintaining applications and services on Azure.

3. Expert Certifications

Expert-level certifications are intended for Azure experts who are proficient in implementing advanced solutions.

Microsoft Certified: Azure Solutions Architect Expert: This certification is for architects who design solutions that run on Azure. It covers topics such as infrastructure, security, and governance.

Microsoft Certified: Azure DevOps Engineer Expert: DevOps professionals can validate their skills in implementing continuous integration and continuous delivery (CI/CD) pipelines on Azure.

The Advantages of Azure Certifications


Now that we've outlined the certification pathway, let's explore the tangible benefits of becoming Azure-certified:

1. High Demand for Azure Skills

The tech industry is experiencing a shortage of Azure-skilled professionals. This high demand translates into attractive salary packages and job security for Azure-certified individuals.

2. Diverse Career Opportunities

Azure certifications open doors to a wide range of career opportunities, including cloud architect, Azure administrator, cloud developer, and more. You can choose a career path that aligns with your interests and strengths.

3. Enhanced Skillset

Through the certification process, you'll gain in-depth knowledge of Azure's features and capabilities. This not only makes you more marketable but also equips you with the skills to tackle real-world challenges effectively.

4. Industry Recognition

Microsoft Azure certifications are globally recognized and respected by employers. Holding an Azure certification demonstrates your commitment to excellence and expertise in cloud technology.

5. Stay Current with Technology Trends

The tech industry evolves rapidly, and Azure certifications require candidates to stay updated with the latest Azure services and best practices. This commitment to ongoing learning keeps your skills relevant.

How to Prepare for Azure Certifications


Achieving an Azure certification requires dedication and thorough preparation. Here are some steps to help you succeed:

1. Choose the Right Certification

Select a certification that aligns with your career goals and current skill level. Start with the fundamentals if you're new to Azure and work your way up.

2. Study Resources

Utilize official Microsoft learning paths, online courses, and practice exams to enhance your knowledge. Consider investing in books and joining Azure communities for additional support.

3. Hands-On Experience

Practical experience is invaluable. Set up an Azure account, create virtual machines, and experiment with different Azure services to solidify your understanding.
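
As a minimal starter sketch for that experimentation, assuming a prior az login and a placeholder subscription ID, the Azure SDK for Python can enumerate the virtual machines your practice sessions create:

from azure.identity import AzureCliCredential
from azure.mgmt.compute import ComputeManagementClient

# Reuses the token from `az login`, so no extra sign-in prompt is needed.
credential = AzureCliCredential()
compute = ComputeManagementClient(credential, "<subscription-id>")

for vm in compute.virtual_machines.list_all():
    print(vm.name, vm.location, vm.hardware_profile.vm_size)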

4. Practice Exams

Take practice exams to assess your readiness. These exams simulate the real certification tests and help you identify areas where you need improvement.

5. Stay Informed

Keep up-to-date with Azure updates and industry trends by following Azure blogs, attending webinars, and participating in forums.
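
As a concrete taste of the hands-on practice recommended in step 3, the short sketch below uses the Azure SDK for Python to list the virtual machines in a subscription. It is a minimal example, assuming the azure-identity and azure-mgmt-compute packages are installed, you have signed in with the Azure CLI, and your subscription ID is exported as AZURE_SUBSCRIPTION_ID:

    import os

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient

    # DefaultAzureCredential transparently picks up Azure CLI, environment,
    # or managed identity credentials.
    credential = DefaultAzureCredential()
    subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]

    compute_client = ComputeManagementClient(credential, subscription_id)

    # Print every VM in the subscription with its region and size.
    for vm in compute_client.virtual_machines.list_all():
        print(vm.name, vm.location, vm.hardware_profile.vm_size)

Small scripts like this double as exam practice: the resource hierarchy you traverse in code is the same one the certification exams test you on.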

Conclusion

In the competitive tech industry, having Microsoft Azure certifications on your resume can be a game-changer. Whether you're a newcomer looking to break into the field or an experienced professional aiming to advance your career, Azure certifications offer a clear pathway to success. Embrace the opportunity to enhance your skills, boost your earning potential, and become a sought-after IT expert in the ever-expanding world of cloud computing.

Tuesday, 8 August 2023

Scale generative AI with new Azure AI infrastructure advancements and availability


Generative AI is a powerful and transformational technology that has the potential to advance a wide range of industries from manufacturing to retail, and financial services to healthcare. Our early investments in hardware and AI infrastructure are helping customers to realize the efficiency and innovation generative AI can deliver. Our Azure AI infrastructure is the backbone of how we scale our offerings, with Azure OpenAI Service at the forefront of this transformation, providing developers with the systems, tools, and resources they need to build next-generation, AI-powered applications on the Azure platform. With generative AI, users can create richer user experiences, fuel innovation, and boost productivity for their businesses.

As part of our commitment to bringing the transformative power of AI to our customers, today we’re announcing updates to how we’re empowering businesses with Azure AI infrastructure and applications. With the global expansion of Azure OpenAI Service, we are making OpenAI’s most advanced models, GPT-4 and GPT-35-Turbo, available in multiple new regions, providing businesses worldwide with unparalleled generative AI capabilities. Our Azure AI infrastructure, which we continue to invest in and expand, is what powers this scalability. We’re also delivering the general availability of the ND H100 v5 Virtual Machine series, equipped with NVIDIA H100 Tensor Core graphics processing units (GPUs) and low-latency networking, propelling businesses into a new era of AI applications.

Here’s how these advancements extend Microsoft’s unified approach to AI across the stack.  

General availability of ND H100 v5 Virtual Machine series: Unprecedented AI processing and scale


Today marks the general availability of our Azure ND H100 v5 Virtual Machine (VM) series, featuring the latest NVIDIA H100 Tensor Core GPUs and NVIDIA Quantum-2 InfiniBand networking. This VM series is meticulously engineered with Microsoft’s extensive experience in delivering supercomputing performance and scale to tackle the exponentially increasing complexity of cutting-edge AI workloads. As part of our deep and ongoing investment in generative AI, we are leveraging an AI-optimized cluster of 4,000 GPUs and will be ramping to hundreds of thousands of the latest GPUs over the next year.

The ND H100 v5 is now available in the East United States and South Central United States Azure regions. Enterprises can register their interest in accessing the new VMs or review technical details on the ND H100 v5 VM series at Microsoft Learn.
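
For readers who want to check SKU availability programmatically, here is a small sketch using the azure-mgmt-compute SDK. It assumes a subscription ID is at hand and that the ND H100 v5 size carries its current name, Standard_ND96isr_H100_v5; verify both against the Azure documentation:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient

    compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # List the VM SKUs offered in East US and look for the ND H100 v5 series.
    # "H100" in the size name is used as a simple filter; the exact size name
    # at the time of writing is Standard_ND96isr_H100_v5.
    for sku in compute.resource_skus.list(filter="location eq 'eastus'"):
        if sku.resource_type == "virtualMachines" and "H100" in sku.name:
            print(sku.name, sku.locations, sku.restrictions)

The restrictions field is useful here: an empty list means the size is actually deployable in your subscription, not merely defined in the region.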

The ND H100 v5 VMs include the following features today: 

  • AI supercomputing GPUs: Equipped with eight NVIDIA H100 Tensor Core GPUs, these VMs promise significantly faster AI model performance than previous generations, empowering businesses with unmatched computational power.
  • Next-generation central processing unit (CPU): Understanding the criticality of CPU performance for AI training and inference, we have chosen the 4th Gen Intel Xeon Scalable processors as the foundation of these VMs, ensuring optimal processing speed.
  • Low-latency networking: The inclusion of NVIDIA Quantum-2 ConnectX-7 InfiniBand, with 400 Gb/s per GPU and 3.2 Tb/s of cross-node bandwidth per VM, ensures seamless performance across the GPUs, matching the capabilities of top-performing supercomputers globally.
  • Optimized host-to-GPU performance: With PCIe Gen5 providing 64 GB/s of bandwidth per GPU, Azure achieves significant performance gains in data transfer between CPU and GPU.
  • Large scale memory and memory bandwidth: DDR5 memory is at the core of these VMs, delivering greater data transfer speeds and efficiency, making them ideal for workloads with larger datasets.

These VMs have already proven their performance prowess, delivering up to a sixfold speedup in matrix multiplication operations when using the new 8-bit FP8 floating-point data type compared with FP16 on previous generations. The ND H100 v5 VMs also achieve up to a twofold speedup in end-to-end inference of large language models such as BLOOM 175B, demonstrating their potential to further optimize AI applications.
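
To ground those numbers, speedups of this kind are typically measured with simple throughput benchmarks. The sketch below times large FP16 matrix multiplications with PyTorch on any CUDA-capable VM; note that exercising the FP8 path discussed above additionally requires H100-class GPUs and a library such as NVIDIA’s Transformer Engine, which is not shown here:

    import time
    import torch

    # Time a batch of large FP16 matrix multiplications on the GPU.
    # Re-running the same measurement with a different data type or on a
    # different VM generation is how speedups like those quoted above are
    # usually compared.
    n = 8192
    a = torch.randn(n, n, device="cuda", dtype=torch.float16)
    b = torch.randn(n, n, device="cuda", dtype=torch.float16)

    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(10):
        c = a @ b
    torch.cuda.synchronize()
    elapsed = time.perf_counter() - start

    # Each matmul performs roughly 2 * n**3 floating point operations.
    tflops = (2 * n**3 * 10) / elapsed / 1e12
    print(f"{elapsed:.3f} s for 10 matmuls, ~{tflops:.1f} TFLOPS")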

Azure OpenAI Service goes global: Expanding cutting-edge models worldwide


We are thrilled to announce the global expansion of Azure OpenAI Service, bringing OpenAI’s cutting-edge models, including GPT-4 and GPT-35-Turbo, to a wider audience worldwide. Our new live regions in Australia East, Canada East, East United States 2, Japan East, and United Kingdom South extend our reach and support for organizations seeking powerful generative AI capabilities. With the addition of these regions, Azure OpenAI Service is now available in even more locations, complementing our existing availability in East United States, France Central, South Central United States, and West Europe. The response to Azure OpenAI Service has been phenomenal, with our customer base nearly tripling since our last disclosure. We now proudly serve over 11,000 customers, attracting an average of 100 new customers daily this quarter. This remarkable growth is a testament to the value our service brings to businesses eager to harness the potential of AI for their unique needs.

As part of this expansion, we are increasing the availability of GPT-4, Azure OpenAI’s most advanced generative AI model, across the new regions. This enhancement allows more customers to leverage GPT-4’s capabilities for content generation, document intelligence, customer service, and beyond. With Azure OpenAI Service, organizations can propel their operations to new heights, driving innovation and transformation across various industries.
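
As a minimal sketch of what calling a deployed model looks like from Python with the openai package (the pre-1.0 API surface current at the time of writing), assuming an Azure OpenAI resource with a chat deployment named gpt-35-turbo and the endpoint and key exported as environment variables:

    import os
    import openai

    # Point the openai package at an Azure OpenAI resource rather than the
    # public OpenAI endpoint.
    openai.api_type = "azure"
    openai.api_base = os.environ["AZURE_OPENAI_ENDPOINT"]  # https://<resource>.openai.azure.com/
    openai.api_key = os.environ["AZURE_OPENAI_KEY"]
    openai.api_version = "2023-05-15"

    # "engine" is the deployment name chosen in the Azure portal; the name
    # gpt-35-turbo here is an assumption for illustration.
    response = openai.ChatCompletion.create(
        engine="gpt-35-turbo",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Draft a short product description for a smart thermostat."},
        ],
    )
    print(response.choices[0].message.content)

Because the deployment lives in a specific region, choosing one of the newly added regions close to your users also reduces round-trip latency for interactive scenarios.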

A responsible approach to developing generative AI


Microsoft’s commitment to responsible AI is at the core of Azure AI and Machine Learning. The AI platform incorporates robust safety systems and leverages human feedback mechanisms to handle harmful inputs responsibly, ensuring the utmost protection for users and end consumers. Businesses can apply for access to Azure OpenAI Service and unlock the full potential of generative AI to propel their operations to new heights.

We invite businesses and developers worldwide to join us in this transformative journey as we lead the way in AI innovation. Azure OpenAI Service stands as a testament to Microsoft’s dedication to making AI accessible, scalable, and impactful for businesses of all sizes. Together, let’s embrace the power of generative AI and Microsoft’s commitment to responsible AI practices to drive positive impact and growth worldwide.

Customer inspiration


Generative AI is revolutionizing various industries, including content creation and design, accelerated automation, personalized marketing, customer service, chatbots, product and service innovation, language translation, autonomous driving, fraud detection, and predictive analytics. We are inspired by the way our customers are innovating with generative AI and look forward to seeing how customers around the world build upon these technologies.

Mercedes-Benz is innovating its in-car experience for drivers, powered by Azure OpenAI Service. The upgraded “Hey Mercedes” feature is more intuitive and conversational than ever before. KPMG, a global professional services firm, leverages our service to improve its service delivery model, achieve intelligent automation, and enhance the coding lifecycle. Wayve trains large-scale foundational neural networks for autonomous driving using Azure Machine Learning and Azure’s AI infrastructure. Microsoft partner SymphonyAI launched Sensa Copilot to empower financial crime investigators to combat the burden of illegal activity on the economy and organizations. By automating the collection, collation, and summarization of financial and third-party information, Sensa Copilot identifies money-laundering behaviors and facilitates quick and efficient analysis for investigators.

Source: microsoft.com

Thursday, 13 April 2023

Azure Space technologies advance digital transformation across government agencies


Since its launch, Microsoft Azure Space has been committed to enabling people to achieve more, both on and off the planet. This mission extends across various industries, including agriculture, finance, insurance, and healthcare.

The announcements we’ve made thus far have helped showcase how our mission encompasses not only commercial industries but also empowers government missions through recent contract wins. By adopting new commercial technologies such as Microsoft 365, Azure Government Cloud, and Azure Orbital, government agencies are increasing the speed, flexibility, and agility of their missions. Today, we are announcing additional momentum, including:

◉ Viasat RTE integration with Azure Orbital Ground Station, bringing high-rate, low-latency downlink data streaming from spacecraft directly to Azure.

◉ A partnership with Ball Aerospace and Loft Federal on the Space Development Agency’s (SDA) National Defense Space Architecture Experimental Testbed (NExT) program, which will bring 10 satellites with experimental payloads into orbit and provide the associated ground infrastructure.

◉ Advancements on the Hybrid Space Architecture for the Defense Innovation Unit, U.S. Space Force, and Air Force Research Lab, with new partners and demonstrations that showcase the power, flexibility, and agility of commercial hybrid systems delivering resilient, cloud-enabled capabilities across multiple paths, orbits, and vendors.

◉ Azure powers the Space Information Sharing and Analysis Center (ISAC) to deliver space cybersecurity and threat intelligence operating capabilities. The watch center’s collaborative environment provides visualization of environmental conditions and threat information to rapidly detect, assess, and respond to space weather events, vulnerabilities, incidents, and threats to space systems.

Viasat Real-Time Earth general availability on Azure Orbital Ground Station



Microsoft has partnered with Viasat Real-Time Earth (RTE) to offer customers new real-time capabilities for managing spacecraft and missions with Azure Orbital Ground Station as a service. This includes the ability to view, schedule, and modify passes at Viasat RTE sites for downlinking data to Azure, and to bring real-time streaming directly into Azure across the secure Microsoft WAN.
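
As a hypothetical sketch of what scheduling such a pass could look like with the azure-mgmt-orbital management SDK: the resource model (spacecraft, contact profile, contact) is real, but the operation and field names below are assumptions based on the ARM resource types and should be verified against the SDK reference:

    from datetime import datetime, timedelta, timezone

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.orbital import AzureOrbital

    client = AzureOrbital(DefaultAzureCredential(), "<subscription-id>")

    # Reserve a ten-minute downlink window at a ground station for a
    # registered spacecraft. Names and parameters here are illustrative
    # assumptions, not a verified call sequence.
    start = datetime.now(timezone.utc) + timedelta(hours=2)
    poller = client.contacts.begin_create(
        resource_group_name="space-rg",
        spacecraft_name="my-satellite",
        contact_name="downlink-pass-001",
        parameters={
            "properties": {
                "groundStationName": "<viasat-rte-site>",
                "reservationStartTime": start.isoformat(),
                "reservationEndTime": (start + timedelta(minutes=10)).isoformat(),
                "contactProfile": {"id": "<contact-profile-resource-id>"},
            }
        },
    )
    print(poller.result())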

As commercial satellite operators require increasingly higher downlink rates to bring mission data such as hyperspectral imagery or synthetic aperture radar into Azure, this partnership with Viasat increases the opportunity to access an established global network of Ka-band antennas. This unlocks new business opportunities for missions that require fast time to insight while also maintaining a high level of security.

“Viasat Real-Time Earth is enabling remote sensing satellite operators who are pushing the envelope of high-rate downlinks. Our strong relationship with Azure Orbital enables those same customers, through increased access to our ground service over the Azure Orbital marketplace and a dependable, high-speed terrestrial network, to reduce the time it takes to downlink and deliver massive amounts of data.”—John Williams, Vice President Viasat Real-Time Earth.

True Anomaly


True Anomaly delivers a fully integrated technology platform that combines training and simulation tools, advanced spacecraft manufacturing infrastructure, and autonomy systems to revolutionize space security and sustainability.

True Anomaly is using the Viasat RTE integration with Azure Orbital Ground Station via Microsoft APIs today to advance its business with the government.

"Together, True Anomaly, Viasat, and Microsoft will employ cutting-edge modeling, simulation, and visualization tools available to train Space Force guardians and other operators. Our partnership will extend to satellite control, leveraging Microsoft Azure Orbital to provide seamless and efficient satellite management solutions for our fleet of Autonomous Orbital Vehicles. By joining forces, we unlock a path to disrupt space operations and training for years to come."— Even Rogers, Co-founder and CEO of True Anomaly.

This partnership combines True Anomaly's innovative Mission Control System with Microsoft’s Azure Orbital and Viasat, offering a seamless satellite management solution for space security operations and training.

Microsoft, Loft Federal, and Ball Aerospace partner on Space Development Agency NExT


The Space Development Agency is charged with creating and sustaining effective and affordable military space capabilities that provide persistent, resilient, global, low-latency surveillance. The National Defense Space Architecture Experimental Testbed (NExT) program will carry 10 satellites with experimental payloads into orbit.

SDA NExT builds upon Microsoft’s Azure Space products and partnerships. Central to Microsoft’s solution for NExT is the combination of Azure Orbital Ground Station and Azure Government air-gapped clouds, which will allow SDA to do its mission work in a secure cloud environment.

Through NExT, the SDA and the US Space Force will together securely operate a government-owned satellite constellation with Azure Orbital Ground Station’s global network for the first time. Additionally, Microsoft 365 will provide personnel with productivity tools to share information, helping to ensure a coordinated response.

The Microsoft Azure Government cloud will enable SDA to extract insights from spaceborne data, from the cloud to the ultimate edge, to scale innovation faster, to better meet the critical needs of the Guardians, and to strengthen national security.

New advancements and partnerships for Hybrid Space Architecture


Last year, we announced our contract supporting the Department of Defense’s (DoD) Defense Innovation Unit, in partnership with the United States Space Force and Air Force Research Lab, on the Hybrid Space Architecture (HSA). The goal of the program is to bring our advanced, trusted cloud and innovative Azure Space capabilities, alongside a space partner ecosystem, to serve as the foundation for realizing the DoD’s Hybrid Space Architecture vision.

This year, Microsoft completed the first demonstration for the program, focused on resilient communication and data paths, which showcased:

◉ Multi-orbit, multi-vendor, resilient, edge-to-cloud connectivity, including use of Azure Orbital Cloud Access through satellite communications partners SpaceX and SES.

◉ SpatioTemporal Asset Catalog (STAC) standards for operating a private Planetary Computer to efficiently manage large geospatial datasets and enable space vehicle tasking across multiple providers (see the sketch after this list).

◉ AI-enabled field user application to allow users to rapidly and easily discover and task satellite collection through an intuitive chat interface.
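
To make the STAC bullet concrete, here is a small sketch that searches a public STAC API with the pystac-client package; it uses Microsoft’s public Planetary Computer endpoint for illustration, whereas a private catalog like the one described above would expose the same search interface at its own URL:

    from pystac_client import Client

    # Open a STAC API endpoint. A private Planetary Computer deployment
    # would serve the same STAC search interface at its own URL.
    catalog = Client.open("https://planetarycomputer.microsoft.com/api/stac/v1")

    # Find recent, low-cloud Sentinel-2 scenes over a small bounding box.
    search = catalog.search(
        collections=["sentinel-2-l2a"],
        bbox=[-122.5, 47.4, -122.2, 47.7],
        datetime="2023-01-01/2023-03-31",
        query={"eo:cloud_cover": {"lt": 20}},
        max_items=5,
    )
    for item in search.items():
        print(item.id, item.datetime, item.properties.get("eo:cloud_cover"))

Because every provider’s imagery is described with the same STAC item schema, discovery and tasking across multiple vendors reduces to queries like this one against a shared catalog.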

Microsoft is committed to a strong and growing partner ecosystem. As part of this first demonstration, the Hybrid Space Architecture ecosystem included capabilities from Umbra and BlackSky.

Future demonstrations will incorporate all Azure Space capabilities including Azure Orbital Cloud Access, Azure Orbital Ground Station, Azure Orbital Space SDK, our leading security solutions, and vast threat intelligence, as well as multiple leading space partners.

Azure powers ISAC to deliver Space cybersecurity and threat intelligence operating capabilities


As a society, we increasingly rely on space-based systems across the commercial, government, and critical infrastructure sectors, which underscores the importance of sharing threat intelligence to safeguard the space infrastructure that supports billions of people globally.

The Space Information Sharing and Analysis Center (ISAC) was established several years ago, with Microsoft as a founding member, to facilitate timely collaboration across the global space industry to enhance the ability to prepare for and respond to vulnerabilities, incidents, and cybersecurity threats.

On March 30, 2023, the Space ISAC’s Operational Watch Center reached initial operational capability, hosted in Azure. The watch center’s collaborative environment provides visualization of environmental conditions and threat information to rapidly detect, assess, and respond to space weather events, vulnerabilities, incidents, and threats to space systems. The Watch Center is supported by a dedicated team of 10 in-person analysts, with additional virtual support enabled by Azure cloud architecture.

Operating one of the largest cloud infrastructures in the world has given Microsoft an exceptional vantage point and unique experience in what it takes to secure cloud workloads and containers. Microsoft has a distinctive view into emerging threats based on analysis of over 65 trillion threat signals daily across more than 200 global consumer and commercial services, and it shares this insight with the Space ISAC community.

Working with the Space ISAC Watch Center, we can rapidly share threat intelligence with the space community. In addition, the new Microsoft Security Copilot capability will be available to our Space ISAC partners, to enable cyber defense at machine speed and scale.

Source: microsoft.com