Microsoft has always been a developer-first organization, and we strive to make tools and platforms that serve developers. Open Source on Azure gives developers the freedom to build next-generation intelligent applications where they want, and when they want. Thanks to the power of open source, developers can now innovate faster and more securely on Azure. Microsoft helps you write more secure code, respond quickly to vulnerabilities in your software supply chain, and adopt the latest best practices to secure your development environments—all with a commitment to open source and support for all languages and frameworks.
By combining the most beloved coding and collaboration tools—Linux, GitHub, and Visual Studio Code—with the Azure platform, the open-source ecosystem on Azure boosts developer productivity with best-in-class tools for code-to-cloud development.
Azure Open Source Day highlights Microsoft’s commitment to open source and focuses on how open source technologies can be used to build intelligent apps faster and with more flexibility.
Opening panel: How open source and AI impact software development in the cloud
We are kicking off our show with a panel of thought leaders from GitHub, HashiCorp, Microsoft, and Redis to discuss how open source continues to evolve software development, its impact on the software supply chain and security, and how new AI capabilities may affect the future.
How Open Source on Azure builds an intelligent app
Today, we are excited to showcase a brand-new, intelligent, cloud-native application that reunites owners with their lost pets using a fine-tuned machine learning model. Instead of printing posters, an owner fine-tunes an advanced image classification model with the photos on their camera roll. When a pet is found, the finder can instantly snap a photo; the model matches it and connects the finder to the owner.
We leverage open source technologies throughout to ensure the application is built on the latest and greatest tools in the ecosystem.
The app's frontend is a .NET Blazor app, with a Python backend. The frontend and backend communicate through the Distributed Application Runtime (Dapr), which provides application programming interfaces (APIs) that simplify microservice connectivity. The backend uses a pre-built vision model from Hugging Face, fine-tuned directly through Azure Machine Learning for model training and prediction. The whole app is deployed using Bicep templates and runs on Azure Kubernetes Service. Kubernetes Event-driven Autoscaling (KEDA) provides autoscaling based on the number of messages being sent through Dapr.
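With Dapr, the frontend never addresses the backend by hostname; it calls its own sidecar, which routes the request by app ID. A minimal sketch of that pattern, using only the standard library—the app ID `backend` and method name `classify` are hypothetical stand-ins for whatever the demo app actually registers:

```python
import json
import urllib.request

DAPR_PORT = 3500  # default HTTP port of the local Dapr sidecar


def dapr_invoke_url(app_id: str, method: str, port: int = DAPR_PORT) -> str:
    """Build the Dapr service-invocation URL: the caller only needs the
    target's Dapr app ID, not its network location."""
    return f"http://localhost:{port}/v1.0/invoke/{app_id}/method/{method}"


def classify_photo(image_b64: str) -> bytes:
    """Send a base64-encoded photo to a hypothetical 'classify' method on
    the backend. Requires a running Dapr sidecar, so it is not executed here."""
    req = urllib.request.Request(
        dapr_invoke_url("backend", "classify"),
        data=json.dumps({"image": image_b64}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


print(dapr_invoke_url("backend", "classify"))
```

Because every service is reached the same way, swapping a backend implementation or moving it to another node requires no frontend changes.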
The app’s data layer was built with Azure Cosmos DB and takes advantage of the autoscale feature that matches database capacity with traffic demands. So as the app grows, the database automatically grows with it. With autoscale, the risk of rate-limiting and over-provisioning is eliminated, app performance is maintained, and the developer doesn’t have to monitor and manage database capacity manually. Developers using MySQL will soon enjoy similar benefits, with the general availability of IOPS Autoscaling in Azure Database for MySQL—flexible server, Business Critical tier coming next month. This feature will save time and money by resizing input/output operations per second (IOPS) depending on workload needs. We’ve also made it easier to develop low-code apps and visualize MySQL data, with the Power Apps and Power BI Desktop integrations for Azure Database for MySQL—flexible server, now in public preview and general availability, respectively.
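The economics of autoscale follow from one rule: provisioned throughput floats between 10 percent of the configured maximum and the maximum itself, so quiet periods cost less while spikes are absorbed. A toy model of that behavior (the 4,000 RU/s maximum is just an example setting, not from the demo app):

```python
def autoscale_rus(demand_rus: float, max_rus: int = 4000) -> float:
    """Illustrative model of Azure Cosmos DB autoscale: throughput tracks
    demand, but never drops below 10% of the configured maximum and never
    exceeds the maximum."""
    floor = max_rus / 10  # autoscale's lower bound is 10% of max
    return min(max(demand_rus, floor), max_rus)


print(autoscale_rus(150))   # quiet traffic -> held at the 400 RU/s floor
print(autoscale_rus(3200))  # busy traffic -> scales up with demand
print(autoscale_rus(9000))  # spike beyond max -> capped at 4000 RU/s
```

The cap is what eliminates rate-limiting surprises, and the floor is what bounds the idle-time cost, so nobody has to watch a capacity dashboard.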
Developers using Azure Cache for Redis Enterprise can now use the RedisJSON module on active, geo-replicated caches. Currently in preview, this feature simplifies development and minimizes downtime by enabling a single operation to read, write, and store JSON documents while simultaneously syncing data across all active regions.
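RedisJSON lets a document be written and queried as JSON in a single command rather than serialized into a plain string. A sketch of the command shapes a Redis client would send—`JSON.SET` and `JSON.GET` with the `$` JSONPath root—built here with only the standard library (the `pet:42` key and document are hypothetical):

```python
import json


def json_set_command(key: str, doc: dict) -> list:
    """Build the RedisJSON JSON.SET argument list; '$' is the JSONPath
    root, meaning the whole document is stored at the key."""
    return ["JSON.SET", key, "$", json.dumps(doc, separators=(",", ":"))]


def json_get_command(key: str, path: str = "$") -> list:
    """Build the matching JSON.GET argument list for a JSONPath read."""
    return ["JSON.GET", key, path]


# Hypothetical lost-pet record keyed by a pet ID.
print(json_set_command("pet:42", {"name": "Biscuit", "species": "dog"}))
print(json_get_command("pet:42"))
```

On an active geo-replicated cache, the same single write is what gets synchronized across all active regions.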
PostgreSQL users can now apply enhanced security to their databases, with the general availability of Microsoft Azure Active Directory and customer-managed keys in Azure Database for PostgreSQL—flexible server. Database user identities and access, along with encryption key permissions and life cycles, can now be centrally managed to make it easier to build secure apps.
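In practice, centrally managed identity means the database password is replaced by a short-lived Azure AD access token. A hedged sketch of assembling a libpq-style connection string for a flexible server with a token in the password slot—the server name, user, and token here are placeholders, and the token itself would come from Azure AD (for example via `az account get-access-token`):

```python
def aad_conninfo(server: str, user: str, token: str, db: str = "postgres") -> str:
    """Build a libpq-style connection string for Azure Database for
    PostgreSQL flexible server using an Azure AD token as the password.
    TLS is required, hence sslmode=require."""
    host = f"{server}.postgres.database.azure.com"
    return f"host={host} dbname={db} user={user} password={token} sslmode=require"


print(aad_conninfo("petfinder", "appuser@contoso.com", "<aad-token>"))
```

Because tokens expire, revoking a user in Azure AD cuts off database access without touching the server's local role configuration.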
Compute scaling in the demo uses Azure Virtual Machine Scale Sets (VMSS) Flexible orchestration to deploy GitHub Actions self-hosted runners on new Arm-based virtual machines. VMSS Flex allows you to easily manage and mix different virtual machine sizes and SKUs, including both Spot and standard virtual machines. Recent additions to the Azure portfolio include next-generation burstable Arm-based Bpsv2 virtual machines, a low-cost option for workloads that typically run at a low to moderate baseline central processing unit (CPU) utilization, and Intel-based Dlsv5 virtual machines, which can deliver up to 30 percent increased performance and better price-performance than Fsv2 virtual machines. Both virtual machine series feature broad support for Linux distributions.
The app uses a pre-trained vision transformer model obtained from Hugging Face for image classification tasks. Developers and data scientists can now use foundation models in Azure Machine Learning to easily start their data science work, fine-tuning and deploying foundation models from Hugging Face with Azure Machine Learning components and pipelines. This feature, currently in preview, provides organizations with a comprehensive repository of popular large AI models from Hugging Face through the built-in Azure Machine Learning registries, supporting tasks such as classification, summarization, question answering, and translation. It simplifies data pre-processing and the adaptation of model training scripts, freeing data scientists from the overhead of setting up and managing the underlying infrastructure and environment dependencies.
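Models shared through Azure Machine Learning registries are referenced by a registry asset URI rather than a local path, so the same identifier works across workspaces. A small sketch of that URI shape—the registry name and the vision-transformer model name below are illustrative examples, not confirmed names from the demo:

```python
def registry_model_uri(registry: str, model: str, version: str) -> str:
    """Build an Azure ML registry asset URI for a model, the form used to
    reference shared models from components and pipelines."""
    return f"azureml://registries/{registry}/models/{model}/versions/{version}"


# e.g. a vision transformer checkpoint published through a built-in registry
print(registry_model_uri("HuggingFace", "google-vit-base-patch16-224", "1"))
```

A fine-tuning pipeline component can then take this URI as input, leaving the underlying storage and environment setup to the service.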
Unleashing the AI technology wave: Training large language models at scale
AI is changing every industry and is top of mind for developers. Most companies have leveraged AI to improve efficiency and reduce costs. Large AI applications leveraging natural language processing (NLP), automatic speech recognition (ASR), and text-to-speech (TTS) are becoming prevalent, but what powers these applications is the underlying infrastructure optimized for large AI workloads. As mentioned in the post announcing the general availability of Microsoft Azure OpenAI Service, Azure is the best place to build AI workloads. This session highlights the partnership between Microsoft and NVIDIA and how Azure’s AI infrastructure and Azure Machine Learning were built for speed.
Azure NDm A100 v4-series virtual machines are Azure’s flagship graphics processing unit (GPU) offering and were used to run the new NVIDIA NeMo Megatron framework and test the limits of the series. Microsoft ran a 530B-parameter benchmark on 175 virtual machines, achieving a training time per step as low as 55.7 seconds. The benchmark measures compute efficiency and how it scales by timing each training step after a steady state is reached, with a mini-batch size of one. InfiniBand HDR provided superior communication between nodes without added latency and was critical to this performance.
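To put the benchmark numbers in perspective, a quick back-of-the-envelope calculation—assuming each NDm A100 v4 virtual machine carries 8 NVIDIA A100 GPUs, which is the series' standard configuration:

```python
# Scale of the 530B-parameter benchmark described above.
VMS = 175
GPUS_PER_VM = 8          # assumption: 8x A100 per NDm A100 v4 VM
SECONDS_PER_STEP = 55.7  # reported steady-state training time per step

gpus = VMS * GPUS_PER_VM
steps_per_hour = 3600 / SECONDS_PER_STEP

print(gpus)                      # 1400 GPUs in the cluster
print(round(steps_per_hour, 1))  # ~64.6 training steps per hour
```

Keeping 1,400 GPUs synchronized at under a minute per step is precisely where the InfiniBand interconnect earns its keep.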
The open source future with Web3
Azure Open Source Day will conclude with a fascinating fireside chat between Kathleen Mitford, CVP, Azure Marketing, and Donovan Brown, Partner Program Manager, Azure Open Source Incubations, on the open source future with Web3. The open and decentralized nature of Web3 can be a natural fit with the open source philosophy, making it an exciting and developing space for new innovations.
Web3 refers to another evolution of the internet, one that may be more decentralized. It is built on blockchain, a distributed ledger technology that enables a secure and transparent way to transfer and store digital assets. Microsoft has first-party solutions and a rich partner ecosystem to help you build using Web3.
DevOps best practices are just as important, if not more important, in the Web3 world as they are in the Web2 world. Azure has the key tools a developer needs—from Azure Static Web Apps, purpose-built for your app, to GitHub Actions, Azure DevOps, and Visual Studio Code. In the future, many organizations may build solutions with Web3 and Web2 working together. Go deeper into Web3 with Donovan Brown’s 10-part blog series on how to build a DevOps pipeline for the Ethereum blockchain.
Also, with the power of AI, you can ask ChatGPT to create a "hello world" sample in any language and copy and paste the code into your project. When you go to modify the code, have GitHub Copilot help you make the changes using all the best practices. You can do all of this inside a GitHub Codespace configured with all your favorite tools, frameworks, and Visual Studio Code extensions installed. Then, you can use Azure DevOps or GitHub Actions to deploy the application to Azure. If you choose to build a Web3 application, Microsoft has the best tools and cloud services to support you.
Source: microsoft.com