Saturday, 30 November 2019

Bing delivers its largest improvement in search experience using Azure GPUs

Over the last couple of years, deep learning has become widely adopted across the Bing search stack and powers a vast number of our intelligent features. We use natural language models to improve our core search algorithm’s understanding of a user’s search intent and the related webpages so that Bing can deliver the most relevant search results to our users. We rely on deep learning computer vision techniques to enhance the discoverability of billions of images even if they don’t have accompanying text descriptions or summary metadata. We leverage machine-based reading comprehension models to retrieve captions within larger text bodies that directly answer the specific questions users have. All these enhancements lead toward more relevant, contextual results for web search queries.

Recently, there was a breakthrough in natural language understanding with a type of model called transformers (as popularized by Bidirectional Encoder Representations from Transformers, BERT). Unlike previous deep neural network (DNN) architectures that processed words individually in order, transformers understand the context and relationship between each word and all the words around it in a sentence. Starting in April of this year, we used large transformer models to deliver the largest quality improvements to our Bing customers in the past year. For example, in the query “what can aggravate a concussion”, the word “aggravate” indicates the user wants to learn about actions to be taken after a concussion and not about causes or symptoms. Our search powered by these models can now understand the user intent and deliver a more useful result. More importantly, these models are now applied to every Bing search query globally, making Bing results more relevant and intelligent.
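
To make the idea concrete, here is a minimal sketch, using the open-source Hugging Face transformers library rather than Bing's production models, of how a transformer assigns the same word different representations depending on its sentence context:

    # A minimal sketch (not Bing's production stack): BERT produces
    # context-dependent vectors, so the same word gets different
    # representations in different sentences.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased").eval()

    def word_vector(sentence: str, word: str) -> torch.Tensor:
        """Return the contextual embedding of `word` within `sentence`."""
        inputs = tokenizer(sentence, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**inputs).last_hidden_state[0]  # (tokens, 768)
        tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
        return hidden[tokens.index(word)]

    # Classic polysemy example: "bank" in two different contexts.
    v1 = word_vector("open a bank account", "bank")
    v2 = word_vector("sit on the river bank", "bank")
    print(torch.cosine_similarity(v1, v2, dim=0).item())  # < 1.0: context matters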


Deep learning at web-search scale can be prohibitively expensive


Bing customers expect an extremely fast search experience, and every millisecond of latency matters. Transformer-based models are pre-trained with up to billions of parameters, a sizable increase in parameter count and computation requirements compared to previous network architectures. Serving latency for a distilled three-layer BERT model on twenty CPU cores was initially benchmarked at 77ms per inference. However, since these models would need to run over millions of different queries and snippets per second to power web search, even 77ms per inference remained prohibitively expensive at web search scale, requiring tens of thousands of servers to ship just one search improvement.
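
For readers who want a feel for how such a number is measured, the sketch below times per-inference CPU latency; it uses an openly available distilled BERT, since the three-layer model Bing used is not public:

    # A rough sketch of the kind of latency benchmark described above.
    # DistilBERT (6 layers) stands in for Bing's non-public 3-layer model.
    import time
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
    model = AutoModel.from_pretrained("distilbert-base-uncased").eval()

    inputs = tokenizer("what can aggravate a concussion", return_tensors="pt")
    with torch.no_grad():
        model(**inputs)  # warm-up run

    runs = 100
    start = time.perf_counter()
    with torch.no_grad():
        for _ in range(runs):
            model(**inputs)
    print(f"{(time.perf_counter() - start) / runs * 1000:.1f} ms per inference")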


Leveraging Azure Virtual Machine GPUs to achieve 800x inference throughput


One of the major differences between transformers and previous DNN architectures is that they rely on massive parallel compute instead of sequential processing. Given that the graphics processing unit (GPU) architecture was designed for high-throughput parallel computing, Azure's N-series Virtual Machines (VMs) with built-in GPU accelerators were a natural fit to accelerate these transformer models. We decided to start with the NV6 Virtual Machine, primarily because of its lower cost and regional availability. Just by running the three-layer BERT model on that VM with a GPU, we observed a serving latency of 20ms (about a 3x improvement). To further improve serving efficiency, we partnered with NVIDIA to take full advantage of the GPU architecture and re-implemented the entire model using the TensorRT C++ APIs and the CUDA and cuBLAS libraries, including rewriting the embedding, transformer, and output layers. NVIDIA also contributed efficient CUDA transformer plugins, including softmax, GELU, normalization, and reduction.

We benchmarked the TensorRT-optimized GPU model on the same Azure NV6 Virtual Machine and were able to serve a batch of five inferences in 9ms, an 8x latency speedup and 43x throughput improvement compared to the model without GPU acceleration. We then leveraged Tensor Cores with mixed precision on an NC6s_v3 Virtual Machine to optimize performance even further, benchmarking a batch of 64 inferences at 6ms (roughly an 800x throughput improvement compared to CPU).
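
The TensorRT C++ reimplementation itself is not reproduced here, but the sketch below illustrates the two levers this paragraph describes, batching and FP16 mixed precision on a CUDA GPU, using plain PyTorch as a stand-in:

    # Illustrative only: batched FP16 inference in PyTorch, not the
    # TensorRT/CUDA rewrite the post describes.
    import time
    import torch
    from transformers import AutoModel, AutoTokenizer

    assert torch.cuda.is_available()
    tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
    model = AutoModel.from_pretrained("distilbert-base-uncased").half().cuda().eval()

    queries = ["what can aggravate a concussion"] * 64   # batch of 64
    inputs = tokenizer(queries, padding=True, return_tensors="pt").to("cuda")

    with torch.no_grad():
        model(**inputs)                 # warm-up
        torch.cuda.synchronize()
        start = time.perf_counter()
        model(**inputs)                 # one batched FP16 inference
        torch.cuda.synchronize()
    ms = (time.perf_counter() - start) * 1000
    print(f"{ms:.1f} ms for a batch of {len(queries)}")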

Transforming the Bing search experience worldwide using Azure’s global scale


With these GPU optimizations, we were able to use 2,000+ Azure GPU Virtual Machines across four regions to serve over 1 million BERT inferences per second worldwide. Azure N-series GPU VMs are critical in enabling transformative AI workloads and product quality improvements for Bing with high availability, agility, and significant cost savings, especially as deep learning models continue to grow in complexity. Our takeaway was very clear: even large organizations like Bing can accelerate their AI workloads by using N-series Virtual Machines on Azure with built-in GPU acceleration. Delivering this kind of global-scale AI inferencing without GPUs would have required a vastly higher number of CPU-based VMs, which ultimately would have become cost-prohibitive. Azure also provides customers with the agility to deploy across multiple types of GPUs immediately, which would have taken months if we had installed GPUs on-premises. The N-series Virtual Machines were essential to our ability to optimize and ship the advanced deep learning models that improve Bing search, available globally today.

N-series general availability


Azure provides a full portfolio of Virtual Machine capabilities across the NC, ND, and NV series product lines. These Virtual Machines are designed for application scenarios for which GPU acceleration is common, such as compute-intensive, graphics-intensive, and visualization workloads.

◉ NC-series Virtual Machines are optimized for compute-intensive and network-intensive applications.

◉ ND-series Virtual Machines are optimized for training and inferencing scenarios for deep learning.

◉ NV-series Virtual Machines are optimized for visualization, streaming, gaming, encoding, and VDI scenarios.

Thursday, 28 November 2019

Multi-protocol access on Data Lake Storage now generally available

We are excited to announce the general availability of multi-protocol access for Azure Data Lake Storage. Azure Data Lake Storage is a unique cloud storage solution for analytics that offers multi-protocol access to the same data. This is a no-compromise solution that allows both the Azure Blob Storage API and the Azure Data Lake Storage API to access data on a single storage account. You can store all your different types of data in one place, which gives you the flexibility to make the best use of your data as your use case evolves. The general availability of multi-protocol access creates the foundation to enable object storage capabilities on Data Lake Storage. This brings together the best of both object storage and the Hadoop Distributed File System (HDFS) to enable scenarios that, until today, were not possible without copying data.
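
As a minimal illustration of what multi-protocol access means in practice (account, container, and path names below are placeholders), this sketch writes a file through the Data Lake Storage API and then reads the very same object back through the Blob API, with no copy in between:

    # Sketch: two APIs, one storage account, no data copy.
    from azure.storage.blob import BlobServiceClient
    from azure.storage.filedatalake import DataLakeServiceClient

    conn_str = "<your-storage-connection-string>"  # hierarchical namespace enabled

    # Write via the Data Lake Storage API (filesystem/directory/file semantics).
    dl = DataLakeServiceClient.from_connection_string(conn_str)
    fs = dl.get_file_system_client("analytics")
    file = fs.get_file_client("raw/events/2019-11-30.json")
    file.upload_data(b'{"event": "page_view"}', overwrite=True)

    # Read the same data back via the Blob API.
    blobs = BlobServiceClient.from_connection_string(conn_str)
    blob = blobs.get_blob_client(container="analytics",
                                 blob="raw/events/2019-11-30.json")
    print(blob.download_blob().readall())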


Broader ecosystem of applications and features


Multi-protocol access provides a powerful foundation for integrations and features on Data Lake Storage. Existing object storage applications and connectors can now be used to access data stored in Data Lake Storage with no changes. This vastly accelerates the integration of Azure services and the partner ecosystem with Data Lake Storage. We are also announcing the general availability of multiple Azure service integrations with Data Lake Storage, including Azure Stream Analytics, IoT Hub, Azure Event Hubs Capture, Azure Data Box, and Logic Apps. These Azure services now integrate seamlessly with Data Lake Storage. Real-time scenarios are now enabled by easily ingesting streaming data into Data Lake Storage via IoT Hub, Stream Analytics, and Event Hubs Capture.

Ecosystem partners have also embraced multi-protocol access for their applications. Here is what our partners are saying:

“Multi-protocol access is a massive paradigm shift that enables cloud analytics to run on a single account for both blob data and analytics data. We believe that multi-protocol access helps customers rapidly achieve integration with Azure Data Lake Storage using our existing blob connector. This brings tremendous value to customers without needing to do costly re-development efforts.” - Rob Cornell, Head of Cloud Alliances, Talend

Our customers are excited about how their existing blob applications and workloads “just work” with the multi-protocol capability. No changes are required for their existing blob applications, saving them precious development and validation resources. We have customers today running multiple workloads seamlessly against the same data, using both the blob connector and the Azure Data Lake Storage connector.

We are also making the ability to tier data between the hot and cool tiers of Data Lake Storage generally available. This is great for analytics customers who want to keep frequently used analytics data in the hot tier and move less frequently used data to cooler storage tiers for cost efficiency. As we continue our journey, we will be enabling more capabilities on Data Lake Storage in upcoming releases. Stay tuned for more announcements in the future!

Wednesday, 27 November 2019

Revealed: The Best Tips for Passing Microsoft Certification Exams

Technology is developing, and so are the skills needed to keep abreast of the changes. The web app developers of today are not the same as those of the early 2000s. Their roles have evolved as technology and business requirements change. If you want to validate your competence in modern web application development, consider earning a Microsoft certification and passing the necessary exams.
It should be mentioned that to satisfy the requirements of the changing world of IT, Microsoft has also reviewed its certification program. Initially, IT professionals and developers would take an MCP exam to prove their skills and become a Microsoft Certified Professional.

Today, these exams have been organized around related job roles, and one has to complete a role-based learning path to earn a certification and become an MCP. Although credentials are now awarded at three levels (fundamentals, associate, and expert), some pre-existing certifications at the MTA, MCSA, MCSE, and MCSD levels are still valid. The MCSA Web Applications is one of them.

The adage “fail to plan, plan to fail” could not be more accurate when it comes to preparing for a Microsoft certification exam. Microsoft's exams are known to be challenging, even for experts.

Technical topics like Microsoft SQL Server and Microsoft Azure are very hard, and taking a Microsoft exam without studying is an excellent way to set yourself up for failure.

Why Should You Earn a Microsoft Certification?

While there are many platforms and fields of study to consider when starting or advancing a technical career, one name stands out: Microsoft.

A Microsoft certification assures current and potential employers that you possess the core technical skills they need. You recognize and understand specific tools and applications, and Microsoft has validated your knowledge.

Microsoft Certified Professionals:

  • Are often hired more quickly by employers.
  • Have more stability and more opportunities for advancement.
  • Have more in-depth knowledge of technologies, creating time savings and higher productivity.
  • Earn around $80,000 per year on average.

Tips for Passing Microsoft Certification Exams

1. Consult the Microsoft Virtual Academy

  • The Microsoft Virtual Academy should be the first place you begin your exam preparation. You will learn everything you need to know about Microsoft on this platform. If you need to learn anything about Microsoft certification tests, MVA is the place to check.
  • There are several resource materials, including instructor-led training, online training courses, self-paced study, practice tests, and lots of other tools to aid you in developing competence in the exam content. When you sign up on the site, you will have access to a plethora of videos, tips, guides, and other resources focused on your particular exam.
  • You can also access many blog posts and a community of other test takers and professionals who come together to help each other achieve success in the certification test.

2. Use the Official Exam Guide

  • You can only study effectively if you understand what the exam is all about. The Microsoft official guide will provide you with the skills you require to pass your certification test. You should go through the guide in detail and focus on the test contents, as highlighted in the book. You will learn all about the exam structure and pattern and the kinds of questions you can expect in the exam.

3. Join a Community

  • Remember, you are not the only one taking the test. Many other candidates have taken the exam before you. Becoming a part of a community of test takers offers you the opportunity to learn from others. Members of the community who have taken the exam share their success stories at the community forum, and you can ask them questions about how they prepared for the exam.
  • People who have failed also share their stories of what they experienced while studying and during the exam. With the information you gather from the community, you can plan a learning path for yourself and avoid the common pitfalls that other candidates have fallen into. When using advice from an online community, it is suggested that you not pay attention to people with a negative perspective about the exam. No matter how hard the test is, many people have passed it successfully. Therefore, you can pass the exam as well.

4. Take Practice Tests

  • One of the most important things you can do to prepare for your final exam is to take practice exams first. Use the practice exams like real exams so that you understand what your approach should be when the big day comes. Expect to fail your first practice exam. Then, go through each wrong answer to see your areas of weakness.
  • You build your confidence by taking practice tests. The more you work on the questions, the more you learn about the exam topics. AnalyticsExam.com and EduSum.com are some of the platforms where you can obtain up-to-date practice tests for training. There are many benefits you gain when you take them. First of all, you become familiar with the exam questions. Secondly, you build capability and skills in the exam content.

5. Build a Study Routine

  • Give yourself some structure when it comes to studying, and prioritize your study time. If you do not, it will be way too easy to delay or ignore what you need to do.
  • Figure out when you best learn new material. When will you have the fewest distractions? If you have kids, for example, it might be easiest to study after they have gone to bed or before they get up in the morning.

6. Review Topics Covered on the Exam

  • Microsoft certification exams are very complicated, so much so that even very experienced professionals have failed. The questions are meant to challenge your knowledge, so it is not advisable to guess and hope for the best.
  • While you cannot know everything, you can get ahead by studying the objectives and sub-objectives of each exam. Microsoft publishes the exam specifications for all exams and lists their objectives. Each sub-objective will also have a set of skills that you are going to be tested on, so review this page as much as possible until you’re comfortable.

7. Find Study Materials

  • Get study materials from reliable sources that are approved by Microsoft. Microsoft publishes its companion content online, and you can seek out online resources and forums from people who have previously taken the exam.

8. Take Notes While Studying

  • If in doubt, write it out. Write down every technical detail you see while studying for any Microsoft certification exam. Since Microsoft’s exams are quite involved, this could be a lifesaver. While what you write down might not be on the exam, it is better to be safe than sorry.

9. Practice Using the Technology

  • Hands-on experience will be valuable as you take the exam. Instead of attempting to answer questions about a practice you have only read about, you will have real-life knowledge of what a task does. But not everyone has access to technology like a sandbox data center to experiment with.
  • Taking a course from a Microsoft-certified training company should let you test your understanding of the product in a lab environment. You can also sign up for a trial account to become comfortable with most Microsoft products, including Azure, SQL Server, Windows, and Dynamics 365.

10. Relax on Exam Day

  • When it is time for the exam, the best thing you can do is relax, knowing that you have prepared to the best of your ability. While you need to take every question thoughtfully, going into the exam stressed and confused will increase your chance of failure. Perfection is not the goal; completion is. Let yourself off the hook by accepting that it is okay to miss a question, have to guess, or be confused by a question’s wording. You can always retake the exam if needed.

11. Think Microsoft

  • There are multiple ways of approaching a problem, but in the examination, the Microsoft way is the right way. You may already be an expert in the topic, knowing all the shortcuts and tips and tricks, but when it comes to examinations, follow the procedure given in the textbooks, as that is what is going to be evaluated.

12. It’s Your D-Day:

  • And so, you are on the battlefield now. Do not panic!
  • If you feel you are going to miss a question, it is ok. In all the years of exams, probably no one has answered every single question perfectly, including the supervisors.
  • At times, a question may be outside your field; let it go. When you do not know the answer, try to narrow down the options by a process of elimination.
  • Still, you need to take every question seriously and answer it sincerely.
  • Most of the approaches mentioned here have been tried and tested by us for years. They still work today.
  • You are here to win, and nothing less. Just shake off the psychological hang-up of taking exams again after years, or as a working professional.

In a Nutshell

The opportunities always come to professionals who continuously sharpen their knowledge in their area of expertise. The essential thing for you as an IT professional is to get the relevant training that will enable you to pass your certification exam successfully.

Get Microsoft Certified and become the next qualified software developer!

I hope this blog provides you some useful tips on how to take a Microsoft Certification exam.

Tuesday, 26 November 2019

Multi-language identification and transcription in Video Indexer

Multi-language speech transcription was recently introduced into Microsoft Video Indexer at the International Broadcasters Conference (IBC). It is available as a preview capability and customers can already start experiencing it in our portal.

Multi-language videos are common media assets in a globalized world: global political summits, economic forums, and sports press conferences are examples of venues where speakers use their native language to convey their own statements. These videos pose a unique challenge for companies that need to provide automatic transcription for large volumes of video archives. Automatic transcription technologies expect users to explicitly specify the video language in advance to convert speech to text. This manual step becomes a scalability obstacle when transcribing multi-language content, as one would have to manually tag audio segments with the appropriate language.

Microsoft Video Indexer provides a unique capability of automatic spoken-language identification for multi-language content. This solution allows users to easily transcribe multi-language content without going through tedious manual preparation steps beforehand. In doing so, it can save anyone with a large archive of videos both time and money, and enable discoverability and accessibility scenarios.

Multi-language audio transcription in Video Indexer


The multi-language transcription capability is available as part of the Video Indexer portal. Currently, it supports four languages (English, French, German, and Spanish) and expects up to three different languages in an input media asset. While uploading a new media asset, you can select the “Auto-detect multi-language” option as shown below.


Our application programming interface (API) supports this capability as well, by enabling users to specify 'multi' as the language in the upload API. Once the indexing process is complete, the index JSON (JavaScript Object Notation) will include the underlying languages.
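
As a rough sketch of that upload call (the endpoint shape and parameter names follow the Video Indexer API documentation; all values below are placeholders):

    # Illustrative sketch of the Video Indexer upload API with
    # language=multi. Account ID, token, and URLs are placeholders.
    import requests

    location, account_id = "trial", "<account-id>"
    access_token = "<video-indexer-access-token>"

    resp = requests.post(
        f"https://api.videoindexer.ai/{location}/Accounts/{account_id}/Videos",
        params={
            "name": "press-conference",
            "language": "multi",          # enable multi-language identification
            "videoUrl": "https://example.com/press-conference.mp4",
            "accessToken": access_token,
        },
    )
    video_id = resp.json()["id"]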

Additionally, each instance in the transcription section will include the language in which it was transcribed.


Customers can view the transcript and identified languages by time, jump to specific places in the video for each language, and even see the multi-language transcription as video captions. The resulting transcription is also available as closed caption files (VTT, TTML, SRT, TXT, and CSV).


Methodology


Language identification from an audio signal is a complex task. Acoustic environment, speaker gender, and speaker age are among a variety of factors that affect this process. We represent the audio signal with a visual representation, such as a spectrogram, assuming that different languages induce unique visual patterns which can be learned using deep neural networks.

Our solution has two main stages to determine the languages used in multi-language media content. First, it employs a deep neural network to classify audio segments at very high granularity, in other words, segments of only a few seconds. While a good model will successfully identify the underlying language, it can still misidentify some segments due to similarities between languages. Therefore, we apply a second stage that examines these misses and smooths the results accordingly.
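
The production smoothing logic is not public, but a toy sketch of the second-stage idea is a sliding majority vote over the per-segment labels, so isolated misidentifications get corrected by their neighbors:

    # Illustrative sketch only (not the production algorithm): smooth
    # per-segment language labels with a sliding majority vote.
    from collections import Counter

    def smooth_labels(labels, window=2):
        """Replace each segment's label with the majority label among
        its +/- `window` neighboring segments."""
        smoothed = []
        for i in range(len(labels)):
            neighborhood = labels[max(0, i - window): i + window + 1]
            smoothed.append(Counter(neighborhood).most_common(1)[0][0])
        return smoothed

    # A stray "de" early in an English/German interview gets corrected:
    print(smooth_labels(["en", "en", "de", "en", "de", "de", "de"]))
    # ['en', 'en', 'en', 'de', 'de', 'de', 'de']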


Sunday, 24 November 2019

How AI can supercharge content understanding for businesses

Organizations face challenges when it comes to extracting insights, finding meaning, and uncovering new opportunities in the vast troves of content at their disposal. In fact, 82 percent of organizations surveyed in the latest Harvard Business Review (HBR) Analytic Services report say that exploring and understanding their content in a timely manner is a significant challenge. This is exacerbated because content is not only spread over multiple systems but also stored in multiple formats, such as PDF, JPEG, spreadsheets, and audio files.

The first wave of artificial intelligence (AI) was designed for narrow applications, training a single model to address a specific task such as handwriting recognition. What’s been challenging, however, is that these models individually can’t capture all the different attributes hidden in various types of content. This means developers must painfully stitch together disparate components to fully understand their content.

Instead, organizations need a solution that spans vision, speech, and language to fully unlock insights from all content types. We are heavily investing in this new category of AI, called knowledge mining, to enable organizations to maximize the value of their content.

Knowledge mining with Azure Cognitive Search


Organizations can take advantage of knowledge mining today with Azure Cognitive Search. Organizations can now easily glean insights from all their content through web applications, bots, and Power BI visualizations. With Azure Cognitive Search, organizations can not only benefit from the industry’s most comprehensive domain-specific models but also integrate their own custom models. What used to take months to accomplish can be realized in mere hours without needing data science expertise.
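
As a small hedged example of what this looks like in code (the service name, index name, and field names below are placeholders), querying an enriched index with the azure-search-documents Python library:

    # Sketch: querying an Azure Cognitive Search index that a
    # knowledge-mining skillset has enriched. All names are placeholders.
    from azure.core.credentials import AzureKeyCredential
    from azure.search.documents import SearchClient

    client = SearchClient(
        endpoint="https://<your-service>.search.windows.net",
        index_name="contracts",
        credential=AzureKeyCredential("<query-key>"),
    )

    # Search text extracted from PDFs, images, and audio transcripts.
    for doc in client.search(search_text="termination clause", top=5):
        print(doc["metadata_storage_name"], doc["@search.score"])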


Delivering real business impact


The same Harvard Business Review report describes how our customers across industries are benefiting from knowledge mining in ways that were previously unimaginable.

◉ Financial Services: “The return on investment (ROI) for knowledge mining at a small fund with one or two analysts is 30 percent to 58 percent. For much larger funds, with 50 or more analysts, it is over 500 percent.”—Subra Bose, CEO of Financial Fabric.

◉ Healthcare: “A reliable knowledge mining platform can drive roughly a third of the costs out of the medical claims process.” —Ram Swaminathan, CEO at BUDDI Health.

◉ Manufacturing: “Unlocking this potential will significantly change the way we do business with our customers and how we service their equipment.” —Chris van Ravenswaay, global business solutions manager for Howden.

◉ Legal: “AI tells you what is inside the contract. It also tells you what the relationship of the contract is with the outside world.” —Monish Darda, CTO of Icertis.

And we’re just getting started. You can expect even deeper integration and more great knowledge mining experiences built with Azure Cognitive Search as we continue this journey. I encourage you to take a look at Harvard Business Review’s survey and findings and hear their perspective on the landscape of knowledge mining.

Saturday, 23 November 2019

Azure Backup support for SQL Server 2019 and Restore as files

As SQL Server 2019 continues to push the boundaries of availability, performance, and data intelligence, a centrally managed, enterprise-scale backup solution is imperative to ensure the protection of all that data. This is especially true if you are running SQL Server in the cloud to leverage the benefits of dynamic scale and don't want to continue using legacy backup methods that are tedious, infrastructure-heavy, and difficult to scale.

We are excited to share support for native backup of SQL Server 2019 running in Azure Virtual Machines. This is a key addition to the general availability of Azure Backup for SQL Server Virtual Machines, announced earlier this year. Azure Backup is a zero-infrastructure solution that protects standalone SQL Server and SQL Always On configurations in Azure Virtual Machines without the need to deploy and manage any backup infrastructure. While it offers long-term retention and central monitoring capabilities to help IT admins govern and meet their compliance requirements, it lets SQL admins continue to exercise the power of self-service backup and restore for operational recoveries.

In addition to this, we are also sharing Azure Backup general availability for:

◉ SQL Server 2008 and 2008 R2 migrating to Azure as SQL Server running in virtual machines.
◉ SQL Server running on Windows 2019


Restore as files:

Adding to the list of enhancements is the key capability of Restore as Files: you can now restore anywhere by recovering the backed-up data as .bak files. Move these backup files across subscriptions, regions, or even to on-premises SQL Servers and trigger a database restore wherever you want. Besides aiding cross-subscription and cross-region restore scenarios, this feature helps users stay compliant by giving them greater control over storing and recovering backup data to any destination of their choice.


Getting started:


Under the Restore operation, you will see the newly introduced option Restore as files. Specify the destination server (this server should be a SQL Server Virtual Machine registered to the vault) and a path on that server. The service will dump all the .bak files specific to the recovery point you have chosen to this path. Typically, specifying a network share path, or the path of a mounted Azure Files share, as the destination enables easier access to these files by other machines in the same network or with the same Azure Files share mounted on them.

Once the restore operation is completed, you can move these files to any machine across subscriptions or locations and restore them as a database using SQL Server Management Studio.
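
As an illustrative sketch (server name, database name, and share path are placeholders), the same restore can also be scripted with T-SQL via pyodbc rather than clicked through in SQL Server Management Studio:

    # Sketch: restore one of the recovered .bak files with T-SQL.
    # Paths, names, and driver version are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=target-sql-vm;DATABASE=master;Trusted_Connection=yes;",
        autocommit=True,  # RESTORE cannot run inside a transaction
    )
    conn.execute(
        r"RESTORE DATABASE [SalesDB] "
        r"FROM DISK = N'\\fileshare\restores\SalesDB.bak' "
        r"WITH RECOVERY"
    )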


Friday, 22 November 2019

A Brief Overview of Microsoft MB2-716 Certification


The Microsoft MB2-716 certification examination is also known as Microsoft Dynamics 365 Customization & Configuration. This exam is designed to help an individual acquire expertise in the field of Microsoft Dynamics 365 Customization & Configuration. It is also intended to qualify the applicants with the skills required to develop new ideas in the IT field. There is no denying the fact that improving one’s skills is an important part of the career objective in the Information Technology field. This Microsoft certification exam, therefore, qualifies the applicants for the challenges in the IT industry.

The Microsoft MB2-716 exam is one of the exams that must be passed to achieve the Microsoft Certified Solutions Associate in Dynamics 365. The MCSA: Microsoft Dynamics 365 certification is an associate-level certification covered under the Microsoft Certified Professional program. There are six different certification paths under this program: productivity, cloud platform & infrastructure, mobility, app builder, business applications, and data management & analytics. The Microsoft Dynamics 365 certification falls into the Business Applications competency category, and it is intended to assess the expertise of professionals in specific areas of the Microsoft Dynamics technologies.

Microsoft MB2-716 Exam as a part of Microsoft Dynamics 365

The Microsoft MB2-716 certification exam is available in an online proctored format, where applicants can take the examination from anywhere in the world using a computer and an Internet connection. The proctored exam is usually monitored through a microphone and webcam by the examiner. The MB2-716 examination is geared towards assessing the candidate's ability to analyze a customer's business requirements and the skills to configure and customize applications.

The Microsoft Dynamics 365 certification itself is designed to confirm a professional's skill in implementing Universal Windows Platform applications across many devices. After completing the certification exam, successful applicants will be able to hold significant jobs in Information Technology organizations. Some of the positions you can take up after earning this certification are System Administrator, Technical Support Engineer, and Implementation Consultant.

The Microsoft Dynamics 365 certification, under which the Microsoft MB2-716 exam falls, is suitable for professionals in the sales field who want to prove foundational competency in Dynamics Customer Relationship Management for software solutions consulting. There is no prerequisite for appearing for this certification exam. However, the applicant should have practical experience with the product.

Microsoft regularly reviews and refreshes its certification exams with each major release of its CRM products. The Microsoft MB2-716 examination is, therefore, an essential method of confirming your proficiency in the CRM products. It is also aimed at assessing your proficiency level on Dynamics solutions.

MCSA: Microsoft Dynamics 365 Certification

There are four certification exams that are available for the Enterprise Edition of the Microsoft Dynamics 365 Customer Relationship Management (CRM). These exams are as follows:
  1. Microsoft MB2-715 Exam – It includes the deployment of Dynamics 365 Customer Engagement Online.
  2. Microsoft MB2-716 Exam – It includes Microsoft Dynamics 365 Customization & Configuration.
  3. Microsoft MB2-717 Exam – It concentrates on Dynamics 365 created for sales.
  4. Microsoft MB2-718 Exam – It addresses Dynamics 365 meant for customer service.

A certification is granted after completing each stage of the examination path toward the Microsoft certification. The certifications given after the exams are MCSA and MCSE.

Conclusion

The Microsoft MB2-716 certification examination has the potential to take your career to the next level. Although preparing for and passing the exam is difficult, the result is worth every minute you spend preparing for it.

Thursday, 21 November 2019

Introducing Azure Cost Management for partners

As a partner, you play a critical role in successfully planning and managing long-term cloud implementations for your customers. While the cloud grants the flexibility to scale infrastructure to changing needs, it can be challenging to control spend when cloud costs fluctuate dramatically with demand. This is where Azure Cost Management comes in: it helps you track and control cloud costs, prevent overspending, and increase the predictability of your cloud costs.

We are announcing the general availability of Azure Cost Management for all Cloud Solution Provider (CSP) partners who have onboarded their customers to the new Microsoft Customer Agreement. With this update, partners and their customers can take advantage of the Azure Cost Management tools available to manage cloud spend, similar to the cost management capabilities available to pay-as-you-go (PAYG) and enterprise customers today.

This is the first of a series of periodic updates to cost management support for partners, enabling partners to understand, analyze, and manage costs across all their customers and invoices.

With this update, CSPs can use Azure Cost Management to:

◉ Understand invoiced costs and associate the costs to the customer, subscriptions, resource groups, and services.

◉ Get an intuitive view of Azure costs in cost analysis with capabilities to analyze costs by customer, subscription, resource group, resource, meter, service, and many other dimensions.

◉ View resource costs that have Partner Earned Credit (PEC) applied in Cost Analysis.

◉ Set up notifications and automation using programmatic budgets and alerts when costs exceed budgets.

◉ Enable the Azure Resource Manager policy that provides customer access to Cost Management data. Customers can then view consumption cost data for their subscriptions using pay-as-you-go rates.

Analyze costs by customer, subscription, tags, resource group or resource using cost analysis


Using cost analysis, partners can group and filter costs by customer, subscription, tags, resource group, resource, and reseller Microsoft Partner Network identifier (MPN ID), giving them increased visibility into costs for better cost control. Partners can also view and manage costs in the billing currency and in US dollars for billing scopes.


Reconcile cost to an invoice


Partners can reconcile costs by invoice across their customers and their subscriptions to understand the pre-tax costs that contributed to the invoice.


You can analyze Azure spend for the customers you support, along with their subscriptions and resources. With this enhanced visibility into your customers' costs, you can use spending patterns to enforce cost control mechanisms, like budgets and alerts, to manage costs with continued and increased accountability.

Enable cost management at retail rates for your customers


With this update, a partner can also enable cost management features, initially at pay-as-you-go rates, for customers and resellers who have access to subscriptions in the customer's tenant. As a partner, if you decide to enable cost management for users with access to a subscription, they will have the same capabilities to analyze the services they consume and to set budgets that control costs, computed at pay-as-you-go prices for Azure consumed services. This is just the first of the updates; we have features planned for the first half of 2020 to enable cost management for customers at prices that the partner can set by applying a markup on the pay-as-you-go prices.

Partners can set a policy to enable cost management for users with access to an Azure subscription to view costs at retail rates for a specific customer.


If the policy is enabled for subscriptions in the customer’s tenant, users with role-based access control (RBAC) access to the subscription can now manage Azure consumption costs at retail prices.


Set up programmatic budgets and alerts to automate and notify when costs exceed threshold


As a partner, you can set up budgets and alerts to send notifications to specified email recipients when the cost threshold is exceeded. In the partner tenant, you can set up budgets for costs as invoiced to the partner. You can also set up monthly, quarterly, or annual budgets across all your customers, or for a specific customer, and filter by subscription, resource, reseller MPN ID, or resource group.
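
As a hedged sketch of what a programmatic budget looks like (the scope, budget values, and api-version below are illustrative assumptions; check the current Budgets REST reference), using the Microsoft.Consumption budgets API:

    # Sketch: create a monthly budget via the Consumption REST API.
    # Scope, amounts, emails, and api-version are placeholders.
    import requests
    from azure.identity import DefaultAzureCredential

    token = DefaultAzureCredential().get_token(
        "https://management.azure.com/.default").token
    scope = "/providers/Microsoft.Billing/billingAccounts/<billing-account-id>"

    body = {
        "properties": {
            "category": "Cost",
            "amount": 5000,
            "timeGrain": "Monthly",
            "timePeriod": {"startDate": "2019-12-01T00:00:00Z",
                           "endDate": "2020-11-30T00:00:00Z"},
            "notifications": {
                "over-80-percent": {
                    "enabled": True,
                    "operator": "GreaterThan",
                    "threshold": 80,
                    "contactEmails": ["billing-admin@contoso.com"],
                }
            },
        }
    }
    resp = requests.put(
        f"https://management.azure.com{scope}"
        "/providers/Microsoft.Consumption/budgets/monthly-cap",
        params={"api-version": "2019-10-01"},
        headers={"Authorization": f"Bearer {token}"},
        json=body,
    )
    resp.raise_for_status()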


Any user with RBAC access to a subscription or resource group can also set up budgets and alerts for Azure consumption costs at retail rates in the customer tenant if the policy for cost visibility has been enabled for the customer.


When a budget is created for a subscription or resource group in the customer tenant, you can also configure it to call an action group. The action group can perform a variety of different actions when your budget threshold is met. 

All the experiences that we provide in Azure Cost Management natively are also available as REST APIs for enabling automated cost management experiences.

Coming soon


◉ We will be enabling cost recommendation and optimization suggestions, for better savings and efficiency in managing Azure costs.

◉ We will launch Azure Cost Management at retail rates for customers who are not on the Microsoft Customer Agreement and are supported by CSP partners.

◉ Showback features that enable partners to charge a markup on consumption costs are also being planned for 2020.

Try Azure Cost Management for partners today! It is natively available in the Azure portal for all partners who have onboarded customers to the new Microsoft Customer Agreement.

Thursday, 14 November 2019

Improving observability of your Kubernetes deployments with Azure Monitor for containers

Over the past few years, we’ve seen significant changes in how an application is thought of and developed, especially with the adoption of containers and the move from traditional monolithic applications to microservices applications. This shift also affects how we think about modern application monitoring, now with greater adoption of open source technologies and the introduction of observability concepts.

In the past, vendors owned the application and infrastructure, and as a result, they knew what metrics to monitor. With open source products growing in number, vendors no longer own all the metrics, and custom metrics have become essential to current monitoring tools. Unlike a monolithic application, which is a single deployment unit with a simple status of healthy or not, a modern application consists of dozens of different microservices with fractional n-states. This is due to sophisticated deployment strategies and rollbacks, where customers may be running different versions of the same services in production, especially on Kubernetes. Thus, embracing these shifts is essential in monitoring.


Custom metrics and open source technologies help improve the observability of specific components of your application, but you also need to monitor the full stack. Azure Monitor for containers embraces both observability through live data and collecting custom metrics using Prometheus, providing the full stack end-to-end monitoring from nodes to Kubernetes infrastructure to workloads.


Collecting Prometheus metrics and viewing using Grafana dashboards


By instrumenting the Prometheus SDK into your workloads, Azure Monitor for containers can scrape the metrics exposed from Prometheus endpoints so you can quickly gather failure rates, responses per second, and latency. You can also use Prometheus to collect some of the Kubernetes infrastructure metrics that are not provided out of the box by Azure Monitor, by configuring the containerized agent.
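
For illustration, a minimal workload instrumented with the Prometheus Python client might look like the sketch below (metric names are made up); the containerized agent can then be configured to scrape its /metrics endpoint:

    # Sketch: expose custom Prometheus metrics from a workload.
    import random
    import time

    from prometheus_client import Counter, Histogram, start_http_server

    REQUESTS = Counter("myapp_requests_total", "Total requests served")
    LATENCY = Histogram("myapp_request_latency_seconds", "Request latency")

    @LATENCY.time()
    def handle_request():
        REQUESTS.inc()
        time.sleep(random.uniform(0.01, 0.1))  # simulate work

    if __name__ == "__main__":
        start_http_server(8000)   # metrics exposed at :8000/metrics
        while True:
            handle_request()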

From Log Analytics, you can easily run a Kusto Query Language (KQL) query and create your custom dashboard in the Azure portal dashboard. For many customers using Grafana to support their dashboard requirements, you can visualize the container and Prometheus metrics in a Grafana dashboard.
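
As a small sketch of querying that same data programmatically (the workspace ID is a placeholder; this uses the azure-monitor-query client library, a newer alternative to running the query in the portal):

    # Sketch: run a KQL query against the Log Analytics workspace
    # behind Azure Monitor for containers.
    from datetime import timedelta

    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())

    query = """
    KubePodInventory
    | summarize pods = dcount(Name) by Namespace
    | order by pods desc
    """
    response = client.query_workspace("<workspace-id>", query,
                                      timespan=timedelta(hours=1))
    for row in response.tables[0].rows:
        print(row)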

Below is an example of a dashboard that provides an end-to-end Azure Kubernetes Service (AKS) cluster overview, node performances, Kubernetes infrastructure, and workloads.


If you would like to monitor or troubleshoot other scenarios, such as live views of all workloads or noisy-neighbor issues on a worker node, you can always switch from the Grafana dashboard to Azure Monitor for containers to view its included visualizations by clicking Azure Monitor – Container Insights in the top right-hand corner.


Azure Monitor for containers provides the live, real-time data of container logs and Kubernetes event logs to provide observability as seen above. You can see your deployments immediately and observe any anomalies using the live data.

Tuesday, 12 November 2019

Announcing the general availability of the new Azure HPC Cache service

If data-access challenges have been keeping you from running high-performance computing (HPC) jobs in Azure, we’ve got great news to report! The now-available Microsoft Azure HPC Cache service lets you run your most demanding workloads in Azure without the time and cost of rewriting applications and while storing data where you want to—in Azure or on your on-premises storage. By minimizing latency between compute and storage, the HPC Cache service seamlessly delivers the high-speed data access required to run your HPC applications in Azure.

Use Azure to expand analytic capacity—without worrying about data access


Most HPC teams recognize the potential for cloud bursting to expand analytic capacity. While many organizations would benefit from the capacity and scale advantages of running compute jobs in the cloud, users have been held back by the size of their datasets and the complexity of providing access to those datasets, typically stored on long-deployed network-attached storage (NAS) assets. These NAS environments often hold petabytes of data collected over a long period of time and represent significant infrastructure investment.

Here’s where the HPC Cache service can help. Think of the service as an edge cache that provides low-latency access to POSIX file data sourced from one or more locations, including on-premises NAS and data archived to Azure Blob storage. The HPC Cache makes it easy to use Azure to increase analytic throughput, even as the size and scope of your actionable data expands.

Keep up with the expanding size and scope of actionable data


The rate of new data acquisition in certain industries such as life sciences continues to drive up the size and scope of actionable data. Actionable data, in this case, could be datasets that require post-collection analysis and interpretation that in turn drive upstream activity. A sequenced genome can approach hundreds of gigabytes, for example. As the rate of sequencing activity increases and becomes more parallel, the amount of data to store and interpret also increases—and your infrastructure has to keep up. Your power to collect, process, and interpret actionable data—your analytic capacity—directly impacts your organization’s ability to meet the needs of customers and to take advantage of new business opportunities.

Some organizations address expanding analytic throughput requirements by continuing to deploy more robust on-premises HPC environments with high-speed networking and performant storage. But for many companies, expanding on-premises environments presents increasingly daunting and costly challenges. For example, how can you accurately forecast and more economically address new capacity requirements? How do you best juggle equipment lifecycles with bursts in demand? How can you ensure that storage keeps up (in terms of latency and throughput) with compute demands? And how can you manage all of it with limited budget and staffing resources?

Azure services can help you more easily and cost-effectively expand your analytic throughput beyond the capacity of existing HPC infrastructure. You can use tools like Azure CycleCloud and Azure Batch to orchestrate and schedule compute jobs on Azure virtual machines (VMs). More effectively manage cost and scale by using low-priority VMs, as well as Azure Virtual Machine Scale Sets. Use Azure’s latest H- and N-series Virtual Machines to meet performance requirements for your most complex workloads.

So how do you start? It’s straightforward. Connect your network to Azure via ExpressRoute, determine which VMs you will use, coordinate processes using CycleCloud or Batch, and voila, your burstable HPC environment is ready to go. All you need to do is feed it data. Ok, that’s the sticking point. This is where you need the HPC Cache service.

Use HPC Cache to ensure fast, consistent data access


Most organizations recognize the benefits of using cloud: a burstable HPC environment can give you more analytic capacity without forcing new capital investments. And Azure offers additional pluses, letting you take advantage of your current schedulers and other toolsets to ensure deployment consistency with your on-premises environment.

But here’s the catch when it comes to data. Your libraries, applications, and location of data may require the same consistency. In some circumstances, a local analytic pipeline may rely on POSIX paths that must be the same whether running in Azure or locally. Data may be linked between directories, and those links may need to be deployed in the same way in the cloud. The data itself may reside in multiple locations and must be aggregated. Above all else, the latency of access must be consistent with what can be realized in the local HPC environment.

To understand how the HPC Cache works to address these requirements, consider it an edge cache that provides low-latency access to POSIX file data sourced from one or more locations. For example, a local environment may contain a large HPC cluster connected to a commercial NAS solution. HPC Cache enables access from that NAS solution to Azure Virtual Machines, containers, or machine learning routines operating across a WAN link. The service accomplishes this by caching client requests (including from the virtual machines), and ensuring that subsequent accesses of that data are serviced by the cache rather than by re-accessing the on-premises NAS environment. This lets you run your HPC jobs at a similar performance level as you could in your own data center. HPC Cache also lets you build a namespace consisting of data located in multiple exports across multiple sources while displaying a single directory structure to client machines.

HPC Cache provides a Blob-backed cache (we call it Blob-as-POSIX) in Azure as well, facilitating migration of file-based pipelines without requiring that you rewrite applications. For example, a genetic research team can load reference genome data into the Blob environment to further optimize the performance of secondary-analysis workflows. This helps mitigate any latency concerns when you launch new jobs that rely on a static set of reference libraries or tools.

(Figure: Azure HPC Cache architecture)

HPC Cache Benefits


Caching throughput to match workload requirements

HPC Cache offers three SKUs: up to 2 gigabytes per second (GB/s), up to 4 GB/s, and up to 8 GB/s throughput. Each of these SKUs can service requests from tens to thousands of VMs, containers, and more. Furthermore, you choose the size of your cache disks to control your costs while ensuring the right capacity is available for caching.

Data bursting from your datacenter

HPC Cache fetches data from your NAS, wherever it is. Run your HPC workload today and figure out your data storage policies over the longer term.

High-availability connectivity

HPC Cache provides high-availability (HA) connectivity to clients, a key requirement for running compute jobs at larger scales.

Aggregated namespace

The HPC Cache aggregated namespace functionality lets you build a namespace out of various sources of data. This abstraction of sources makes it possible to run multiple HPC Cache environments with a consistent view of data.

Lower-cost storage, full POSIX compliance with Blob-as-POSIX

HPC Cache supports Blob-based, fully POSIX-compliant storage. HPC Cache, using the Blob-as-POSIX format, maintains full POSIX support including hard links. If you need this level of compliance, you’ll be able to get full POSIX at Blob price points.

Monday, 11 November 2019

Glimpses of Getting the Microsoft 70-466 Certification

This article glimpses the Implementing Data Models and Reports with Microsoft SQL Server (70-466) exam for Business Intelligence developers. Each profile is based on the testing experience of trainers or authors. You will not get exact questions or answers, but you will get a real feel for the exam.
Each profile describes question formats, trouble spots, tips for exam preparation, and recommendations for further study resources. Find out what you can expect from the exam and how you can adequately prepare for it.

Who Should Take the 70-466 Certification Exam?

The 70-466 exam is intended to be taken by Business Intelligence (BI) developers who are most likely to focus on hands-on work creating the BI solution, including implementing multidimensional data models, implementing and managing OLAP cubes, and creating information displays used in business decision making.
Microsoft says that the exam is meant to be used by those whose primary responsibilities may include:
  • Working with extensive data collections across multiple database systems
  • Cube/warehouse/views fundamentals
  • Data model decisions: Unified Dimension Model (UDM) versus 
  • Developing cubes and Multidimensional Expressions (MDX) queries to support analysts
  • Online Analytical Processing (OLAP) cube performance
  • Structure pivot tables from cubes
  • Business Intelligence Semantic Model (BISM)
  • Building PowerPivot solutions
  • Building reports using Crescent
  • Design and test report models, layouts and templates
  • Manage reporting system: configuration and subscriptions
  • Utilizing report builder to create reports
  • Develop complicated SQL queries for reports
After passing the Implementing Data Models and Reports with Microsoft SQL Server (70-466) certification exam, you complete the requirements for the MCSE: Business Intelligence certification.

Get Ready to Prepare As You Have Never Prepared Before

You are about to view a study guide that took hours of painstaking collection work, expert preparation, and constant feedback. That is why we know this exam prep will help you get that high score on your journey to certification.

Journey to Pass the 70-466 Certification Exam

Maybe this is your first step toward certification, or perhaps you are coming back for another round. We believe that this exam challenges you, teaches you, and prepares you to pass the 70-466. It could be the first step to a new high-paying job and an astonishing career. If you have been around the block a few times, consider taking a moment and answering some questions from newer techies.

What Should You Understand Before Studying The 70-466?

Every exam and certification has different requirements. If this is a serious venture, make sure to read the prerequisites before proceeding. Nothing is more frustrating than wasting months studying for an exam you cannot take, or passing an exam that will not help you get a certification!

What Is The 70-466 Focused On?

  • The 70-466, or as it is also known, Implementing Data Models and Reports with Microsoft SQL Server, like all tests, leaves Microsoft a bit of freedom to test a collection of subjects. That means knowing the bulk of the 70-466 content is needed, because the exam tests randomly across the many topics available.
  • Be aware, too, that experience requirements often exist because Microsoft has observed what the average candidate needs. You can always push through that to succeed with the 70-466, but it may take some extra work.

Always Check the Foundation

  • Some certifications have elements going back to older exams, while others use two or more tests to help someone pass. If you find the 70-466 is over your head, that is ok. It might make sense to see if a lower-level exam will give you some grounding.

If Offered, Read the 70-466 Exam Objectives

  • The exam objectives are different for every single exam and are usually provided by the certification provider. These typically tell the test taker what subjects are important, what they need to know, and why the exam covers these topics. It is essential to find them for your specific exam.
  • This can be found on almost every vendor website and dramatically informs the way you will study! Don’t worry, though, we have taken those goals into account and seek to create a testing experience comparable to an actual exam.

Remember That 70-466 Certification Is Quite Satisfying

  • It can be hard to maintain your focus on studying, but remember that the best jobs in the world are only a few tests away. Whether you join cyber security or do entry-level tech work, certification is a clear, learnable, and rewarding path to careers that pay a lot of money. They offer better work-life balance, and you will get in with some of the significant leaders in the business world. So do not give up; it is worth it, and all this work will pay off!

Take Online Practice Exam

  • Practicing for an exam like the 70-466 can be a full-time job. Certification is not simple and takes tremendous work. It needs time, practice, and the right focus. The team at AnalyticsExam has been in this industry for years, working in a space full of online practice tests, and that experience pushed them to make a positive change in the exam space.

Love to Say, Work Smarter, Not Harder!

  • To achieve the best result in the 70-466 exam, you need to experience the types of questions you will be asked and prepare for every topic on the 70-466 test.

Conclusion:

And best of all, there is a lot here to sharpen your skills. It is ok if you feel in over your head. We all did at some point, and this next step is about pushing through that fear and getting ready to tackle something as challenging as the 70-466. If you get lost, reach out. If you see others stuck, support them.