Saturday, 29 February 2020

Accelerate your cloud strategy with Skytap on Azure

Azure is the best cloud for existing Microsoft workloads, and we want to ensure all of our customers can take full advantage of Azure services. We work hard to understand the needs of those customers running Microsoft workloads on premises, including Windows Server, and help them to navigate a path to the cloud. But not all customers can take advantage of Azure services due to the diversity of their on-premises platforms, the complexity of their environments, and the mission-critical applications running in those environments.

Microsoft creates strategic partnerships with many partners to unlock the power of the cloud for customers relying on traditional on-premises application platforms. Azure currently offers several specialized application platforms and experiences, including Cray, SAP, and NetApp, and we continue to invest in additional options and platforms.

Allowing businesses to innovate faster with the cloud


Today we're pleased to share that we are enabling more customers to start on their journey to the cloud. Skytap has announced the availability of Skytap on Azure. The Skytap on Azure service simplifies cloud migration for traditional applications running on IBM Power while minimizing disruption to the business. Skytap has more than a decade of experience working with customers and offering extensible application environments that are compatible with on-premises data centers; Skytap’s environments simplify migration and provide self-service access to develop, deploy, and accelerate innovation for complex applications.

Brad Schick, Skytap CEO: “Today, we are thrilled to make the service generally available.  Enterprises and ISVs can now move their traditional applications from aging data centers and use all the benefits of Azure to innovate faster.”

Cloud migration remains a crucial component for any organization in the transformation of their business, and Microsoft continues to focus on how best to support customers in that journey. We often hear about the importance of enabling the easy movement of existing applications running on traditional on-premises platforms to the cloud and the desire to have those platforms be available on Azure.

The migration of applications running on IBM Power to the cloud is often seen as a difficult and challenging move involving re-platforming. For many businesses, these environments run traditional and, frequently, mission-critical applications. The idea of re-architecting or re-platforming these applications to be cloud native can be daunting. With Skytap on Azure, customers gain the ability to run native Power workloads, including AIX, IBM i, and Linux, on Azure. The Skytap service allows customers to unlock the benefits of the cloud faster and begin innovating across applications sooner by providing the ability to take advantage of and integrate with the breadth of Azure native services. All of this is possible with minimal changes to the way existing IBM Power applications are managed on-premises.

Application running on IBM Power and x86 in Skytap on Azure.

With Skytap on Azure, Microsoft brings the unique capabilities of IBM Power9 servers to Azure data centers, directly integrated with the Azure network, enabling Skytap to provide its platform with minimal connectivity latency to Azure native services such as Blob Storage, Azure NetApp Files, or Azure Virtual Machines.

Skytap on Azure is now available in the East US Azure region. Given the high level of interest we have seen already, we intend to expand availability to additional regions across Europe, the United States, and Asia Pacific. Stay tuned for more details on specific regional rollout availability.

Thursday, 27 February 2020

70-486 Exam: Ultimate Preparation Guide to Ace It


What is Microsoft 70-486 Exam?

The Microsoft 70-486 is an exam that measures the skills of professional web developers who use Microsoft Visual Studio 2017 and ASP.NET to design and develop web solutions. The exam tests a candidate in several areas, including troubleshooting and debugging web applications, developing the user experience, and designing the application architecture.

Who Should Take the Microsoft 70-486 exam?

  • Are you able to design and develop web applications using the ASP.NET MVC model?
  • Do you have experience with the full software development life cycle of web applications?
  • Can you design and develop asynchronous solutions?
  • Do you know how to develop and deploy multi-tier environments involving Azure?
  • Will you be able to plan and design user interaction solutions based on business needs?

If the answer is yes to all the above questions, then the Microsoft 70-486 exam is for you.

Professional developers who use Microsoft Visual Studio 2017 and ASP.NET to design and develop web solutions are eligible for the 70-486 exam. They should have three to five years of experience developing ASP.NET MVC-based solutions.

70-486 Exam Preparation Tips:

Know about the Syllabus:

Microsoft exams are not easy to crack, so the first step of preparation should be learning the complete syllabus. The syllabus is divided into six modules. Among them, Troubleshoot and Debug Web Applications carries the most weight, so a candidate can expect more questions from this area.

Set a Study Schedule:


Start studying at least six months before the exam. Divide your schedule across the complete syllabus. Allow more time for heavily weighted topics like Troubleshoot and Debug Web Applications, but don't neglect the other modules; they carry almost equal weight.

Take Official Training:

Microsoft provides training for the 70-486 exam, so taking the instructor-led training can be very helpful for passing it.

Read from a Standard Book:

Exam Ref 70-486 is the official guide book for the 70-486 exam. Register your copy of the book to get access to additional benefits.

Take Practice Exams to Crack the Exam:

Following a schedule and completing the whole syllabus are essential to acing the 70-486 exam, but evaluating yourself is just as important. You can easily do so by taking online practice exams, which show candidates their weaknesses and strengths. Striving for a full score on practice exams takes a candidate closer to acing the real one.

Don’t Follow Dumps to Study for 70-486 Exam:

Dumps are readily available for the 70-486 exam, but they don't give you the chance to evaluate yourself, so it is better to rely on other online resources.

Final Words:

Passing the 70-486 exam makes a candidate a Microsoft Certified Solutions Developer (MCSD). Microsoft certifications are recognized worldwide, so the certification helps grow a candidate's career in the web application field, and salaries typically increase accordingly.

Burst 4K encoding on Azure Kubernetes Service

Burst encoding in the cloud with Azure and Media Excel HERO platform.


Content creation has never been as in demand as it is today. Both professional and user-generated content have increased exponentially over the past years. This puts a lot of stress on media encoding and transcoding platforms. Add upcoming 4K and even 8K formats to the mix, and you need a platform that can scale with these variables. Azure cloud compute offers a flexible way to grow with your needs. Microsoft offers various tools and products to fully support on-premises, hybrid, or cloud-native workloads: Azure Stack supports hybrid scenarios for your computing needs, and Azure Arc helps you manage hybrid setups.

Finding a solution


Generally, 4K/UHD live encoding is done on dedicated hardware encoder units, which cannot be hosted in a public cloud like Azure. With dedicated hardware units hosted on-premises pushing 4K into an Azure data center, the immediate problem is the need for a high-bandwidth network connection between the on-premises encoder unit and the Azure data center. Moreover, it is generally best practice to ingest into multiple regions, which further increases the load on the network connection between the encoder and Azure.

How do we ingest 4K content reliably into the public cloud?

Alternatively, we can encode the content in the cloud. If we can run 4K/UHD live encoding in Azure, its output can be ingested into Azure Media Services over the intra-Azure network backbone which provides sufficient bandwidth and reliability.

How can we reliably run and scale 4K/UHD live encoding on the Azure cloud as a containerized solution? Let's explore below.

Azure Kubernetes Service


With Azure Kubernetes Service (AKS), Microsoft offers customers a managed Kubernetes platform: a hosted cluster without the configuration burden of networking, cluster masters, and OS patching of the cluster nodes. It also comes with pre-configured monitoring that integrates seamlessly with Azure Monitor and Log Analytics, while still offering the flexibility to integrate your own tools. Furthermore, it is plain vanilla Kubernetes and as such is fully compatible with any existing tooling you might have running on any other standard Kubernetes platform.

Media Excel encoding


Media Excel is an encoding and transcoding vendor offering physical appliance and software-based encoding solutions. Media Excel has been partnering with Microsoft for many years, engaging in Azure media customer projects, and is listed as a recommended and tested contribution encoder for fMP4 ingest into Azure Media Services. Media Excel and Microsoft have also worked together to integrate SCTE-35 timed metadata from the Media Excel encoder into an Azure Media Services origin, supporting Server-Side Ad Insertion (SSAI) workflows.

Networking challenge


With increasing picture quality like 4K and 8K, the burden on both compute and networking becomes a significant architecture challenge. In a recent engagement, we needed to architect a 4K live streaming platform with limited bandwidth capacity from the customer premises to one of our Azure data centers. We worked with Media Excel to build a scalable containerized encoding platform on AKS, utilizing cloud compute and minimizing network latency between the encoder and the Azure Media Services packager. Multiple bitrates of the same source, with a top bitrate of up to 4Kp60 at 20 Mbps, are generated in the cloud and ingested into the Azure Media Services platform for further processing, including dynamic encryption and packaging. This setup enables the following benefits:

◉ Instant scale to multiple AKS nodes

◉ Eliminate network constraints between customer and Azure Datacenter

◉ Automated workflow for containers and easy separation of concern with container technology

◉ Increased security for high-quality content from generation to distribution

◉ A highly redundant setup

◉ Flexibility to provide various types of Node pools for optimized media workloads

In this particular test, we proved that the intra-Azure network is extremely capable of shipping high-bandwidth, latency-sensitive 4K packets from a containerized encoder instance running in West Europe to both the East US and Hong Kong datacenter regions. This allows the customer to place the origin closer to them for further content conditioning.


Workflow:

1. An Azure Pipeline is triggered to deploy onto the AKS cluster. The YAML file (which you can find on GitHub) references the Media Excel container in Azure Container Registry (a minimal deployment sketch follows this list).

2. AKS starts the deployment and pulls the container from Azure Container Registry.

3. During container start, a custom PHP script is loaded and the container is added to the HMS (Hero Management Service) and placed into the correct device pool and job.

4. The encoder loads the source and (in this case) pushes a 4K live stream into Azure Media Services.

5. Media Services packages the live stream into multiple formats and applies DRM (digital rights management).

6. Azure Content Delivery Network scales the live stream.
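
As a concrete illustration of steps 1 and 2, here is a minimal, hypothetical deployment sketch using the official Kubernetes Python client. The image name, registry, and resource requests are placeholders, not Media Excel's actual artifacts; the real deployment is driven by the YAML files referenced above.

# Hypothetical sketch: creating an encoder Deployment on AKS with the official
# Kubernetes Python client (pip install kubernetes). Image name and resource
# requests are placeholders, not Media Excel's actual artifacts.
from kubernetes import client, config

config.load_kube_config()  # assumes credentials from `az aks get-credentials`

container = client.V1Container(
    name="hero-encoder",
    image="myregistry.azurecr.io/hero-encoder:latest",  # placeholder image
    resources=client.V1ResourceRequirements(
        requests={"cpu": "4", "memory": "8Gi"},  # 4K live encoding is CPU-heavy
    ),
)
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="hero-encoder"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "hero-encoder"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "hero-encoder"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)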

Scale through Azure Container Instances


With Azure Kubernetes Service you get the power of Azure Container Instances out of the box. Azure Container Instances are a way to instantly scale to pre-provisioned compute power at your disposal. When deploying Media Excel encoding instances to AKS, you can specify where these instances will be created. This offers the flexibility to work with variables like increased density on cheaper nodes for low-cost, low-priority encoding jobs, or more expensive nodes for high-throughput, high-priority jobs. With Azure Container Instances you can instantly move workloads to standby compute power without provisioning time, and you only pay for the compute time, offering full flexibility for customer demand and future changes in platform needs. With Media Excel's flexible live and file-based encoding roles you can easily move workloads across the different compute power offered by AKS and Azure Container Instances, as the scaling sketch below illustrates.
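
A hedged sketch of that burst, again with the Kubernetes Python client, scaling the hypothetical encoder deployment from the earlier sketch ahead of a live event. With ACI-backed virtual nodes enabled on the cluster, added replicas can land on Azure Container Instances without pre-provisioned VM capacity.

# Hypothetical sketch: bursting the encoder Deployment from 1 to 5 replicas.
from kubernetes import client, config

config.load_kube_config()
client.AppsV1Api().patch_namespaced_deployment_scale(
    name="hero-encoder",       # placeholder deployment from the earlier sketch
    namespace="default",
    body={"spec": {"replicas": 5}},  # instant scale-out for the live event
)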


Azure DevOps pipeline to bring it all together


All the general benefits of containerized workloads apply in the following case. For this particular proof-of-concept, we created an automated deployment pipeline in Azure DevOps for easy testing and deployment. With a deployment YAML and a pipeline YAML we can easily automate deployment, provisioning, and scaling of a Media Excel encoding container. Once DevOps pushes the deployment job onto AKS, a container image is pulled from Azure Container Registry. Although container images can be bulky, node-side caching of layers reduces any subsequent container pull to seconds. With the help of Media Excel, we created a YAML file containing pre- and post-container lifecycle logic that adds and removes a container from Media Excel's management portal. This offers easy single-pane-of-glass management of multiple instances across multiple node types, clusters, and regions.

This deployment pipeline offers full flexibility to provision certain multi-tenant customers or job priorities on specific node types. This unlocks the possibility of provisioning encoding jobs on GPU-enabled nodes for maximum throughput, or using cheaper generic nodes for low-priority jobs.


Azure Media Services and Azure Content Delivery Network


Finally, we push the 4K stream into Azure Media Services. Azure Media Services is a cloud-based platform that enables you to build solutions that achieve broadcast-quality video streaming, enhance accessibility and distribution, analyze content, and much more. Whether you're an app developer, a call center, a government agency, or an entertainment company, Media Services helps you create apps that deliver media experiences of outstanding quality to large audiences on today’s most popular mobile devices and browsers.

Azure Media Services is seamlessly integrated with Azure Content Delivery Network. With Azure Content Delivery Network we offer a true multi-CDN, with a choice of Azure Content Delivery Network from Microsoft, from Verizon, or from Akamai, all through a single Azure Content Delivery Network API for easy provisioning and management. As an added benefit, all CDN traffic between the Azure Media Services origin and the CDN edge is free of charge.

With this setup, we’ve demonstrated that Cloud encoding is ready to handle real-time 4K encoding across multiple clusters. Thanks to Azure services like AKS, Container Registry, Azure DevOps, Media Services, and Azure Content Delivery Network, we demonstrated how easy it is to create an architecture that is capable of meeting high throughput time-sensitive constraints.

Tuesday, 25 February 2020

Fileless attack detection for Linux in preview

Attackers are increasingly employing stealthier methods to avoid detection. Fileless attacks exploit software vulnerabilities, inject malicious payloads into benign system processes, and hide in memory. These techniques minimize or eliminate traces of malware on disk, and greatly reduce the chances of detection by disk-based malware scanning solutions.

To counter this threat, Azure Security Center released fileless attack detection for Windows in October 2018. Our blog post from 2018 explains how Security Center can detect shellcode, code injection, payload obfuscation techniques, and other fileless attack behaviors on Windows. Our research indicates the rise of fileless attacks on Linux workloads as well.

Today, Azure Security Center is happy to announce a preview for detecting fileless attacks on Linux.  In this post, we will describe a real-world fileless attack on Linux, introduce our fileless attack detection capabilities, and provide instructions for onboarding to the preview.

Real-world fileless attack on Linux


One common pattern we see is attackers injecting payloads from packed malware on disk into memory and deleting the original malicious file from the disk. Here is a recent example:

1. An attacker infects a Hadoop cluster by identifying the service running on a well-known port (8088) and exploits Hadoop YARN's unauthenticated remote command execution to achieve runtime access on the machine. Note that the owner of the subscription could have mitigated this stage of the attack by configuring Security Center's just-in-time (JIT) VM access.

2. The attacker copies a file containing packed malware into a temp directory and launches it.

3. The malicious process unpacks the file using shellcode to allocate a new dynamic executable region of memory in the process’s own memory space and injects an executable payload into the new memory region.

4. The malware then transfers execution to the injected ELF entry point.

5. The malicious process deletes the original packed malware from disk to cover its tracks.

6. The injected ELF payload contains shellcode that listens for incoming TCP connections carrying the attacker's instructions.

This attack is difficult for scanners to detect. The payload is hidden behind layers of obfuscation and only present on disk for a short time.  With the fileless attack detection preview, Security Center can now identify these kinds of payloads in memory and inform users of the payload’s capabilities.

Fileless attack detection capabilities


Like fileless attack detection for Windows, this feature scans the memory of all processes for evidence of fileless toolkits, techniques and behaviors. Over the course of the preview, we will be enabling and refining our analytics to detect the following behaviors of userland malware:

◉ Well-known toolkits and cryptomining software.

◉ Shellcode, injected ELF executables, and malicious code in executable regions of process memory.

◉ LD_PRELOAD-based rootkits that preload malicious libraries.

◉ Elevation of privilege of a process from non-root to root.

◉ Remote control of another process using ptrace.

In the event of a detection, you receive an alert in the Security alerts page. Alerts contain supplemental information such as the kind of techniques used, process metadata, and network activity. This enables analysts to have a greater understanding of the nature of the malware, differentiate between different attacks, and make more informed decisions when choosing remediation steps.


The scan is non-invasive and does not affect the other processes on the system. The vast majority of scans run in less than five seconds. The privacy of your data is protected throughout this procedure, as all memory analysis is performed on the host itself. Scan results contain only security-relevant metadata and details of suspicious payloads. The sketch below illustrates one simple heuristic of this kind.
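
To give a flavor of what in-memory scanning looks for (this is an illustration only, not Security Center's implementation), the sketch below flags anonymous executable memory regions, which is where injected shellcode and in-memory ELF payloads typically live. It reads /proc on Linux and needs root to see other users' processes.

# Illustration only: flag executable memory regions with no backing file, a
# common trait of injected shellcode and in-memory ELF payloads. This is NOT
# Security Center's implementation. Linux only; run as root for full coverage.
# Note: JIT runtimes also create such regions, so real scanners use far richer
# analysis to separate benign from malicious mappings.
import os

def suspicious_regions(pid):
    """Yield (address_range, permissions) for executable anonymous mappings."""
    try:
        with open(f"/proc/{pid}/maps") as maps:
            for line in maps:
                fields = line.split()
                addr, perms = fields[0], fields[1]
                backing = fields[5] if len(fields) > 5 else ""  # empty => anonymous
                if "x" in perms and not backing:
                    yield addr, perms
    except (FileNotFoundError, PermissionError, ProcessLookupError):
        pass  # process exited or we lack privileges

for pid in filter(str.isdigit, os.listdir("/proc")):
    for addr, perms in suspicious_regions(pid):
        print(f"pid {pid}: executable anonymous region {addr} ({perms})")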

Source: azure.microsoft.com

Friday, 21 February 2020

The Complete Guide to MCSE 70-764 Certification

When it comes to accomplishing your set goals, having an efficient strategy will help you achieve them. As for the Microsoft certification exams, the applicants need to ensure that the study materials they’re using will give the expected results.
The study resources must help them meet the real requirements of the exam they plan to pass. The exams require competent and reliable tools that cover all the exam topics. If you're aspiring to take this exam, then this article will be beneficial: you'll learn about the best tools to use when preparing for this Microsoft certification exam. Before that, let's look at the essential details.

Microsoft 70-764: Exam Details

The Microsoft 70-764 exam is associated with the MCSA certification. It's also considered the exam that strengthens your skills and expertise in SQL Server technologies. Because it concentrates on administering a SQL database infrastructure, the exam covers both on-premises and cloud-based databases.

It is the first of two Microsoft exams that you must pass to achieve the MCSA: SQL 2016 Database Administration certification. It's intended for database experts who have experience configuring, installing, and operating database systems. The second exam to pass is 70-765: Provisioning SQL Databases.

You also must be proficient in protecting databases from unauthorized access, which means you must understand the Microsoft 70-764 exam syllabus topics. The exam consists of 40-60 questions in different formats, for example, multiple-choice, case study, active screen, mark review, and short answer. You have to complete the exam within 120 minutes and score at least 700 points to pass. The Microsoft 70-764 exam costs $165.

The Microsoft 70-764 exam assesses one’s skill to carry out various technical tasks. These tasks include:


  • Configuring data access and auditing;
  • Managing backup and restoration of the database;
  • Monitoring and managing SQL Server instances;
  • Managing disaster recovery and high availability.

Microsoft 70-764 is ideal for IT professionals with database experience, though it's also open to passionate applicants who plan to become experienced database professionals.

Microsoft 70-764: Corresponding Certification

As we discussed earlier, passing the Microsoft 70-764 exam results in earning the MCSA: SQL 2016 Database Administration certification. This certification develops your skills in SQL Server technologies, focusing on the installation, configuration, and provisioning of advanced SQL databases.

Apart from knowing the appropriate certification, it's necessary to bear in mind that certifications may expire, and newer certifications regularly appear in place of older, irrelevant versions. But as for MCSA: SQL 2016 Database Administration, it is still valid, and many professionals can earn it.

Microsoft 70-764: Career Prospects

Once you clear the Microsoft 70-764 exam, you'll have the opportunity to work in an entry-level job, letting you apply your skills to cloud-optimized technology solutions. MCSA certification holders have an exciting and complete career path. With the help of this certification, you can work as a database administrator or an infrastructure professional.

Your career prospects will broaden. The moment you achieve the MCSA: SQL 2016 Database Administration certification, you'll have the opportunity to earn more advanced certifications awarded by Microsoft. The advanced certification you can pursue is MCSE: Data Management and Analytics. This gives you the chance to work in lucrative positions, such as a database analyst, a database designer, or a business intelligence analyst.

Conclusion

The best prep resources for the Microsoft 70-764 exam are only helpful if you use them as needed. Preparation isn't just about studying; how you study matters too. Revise with dedication and keep your emphasis on doing well in the exam. With the appropriate tools and approach, there's no way you're going to fail your certification exam.

Thursday, 20 February 2020

Azure Offline Backup with Azure Data Box now in preview

An ever-increasing number of enterprises, even as they adopt a hybrid IT strategy, continue to retain mission-critical data on-premises and look to the public cloud as an effective offsite location for their backups. Azure Backup, Azure's built-in data-protection solution, provides a simple, secure, and cost-effective mechanism to back up these data assets over the network to Azure, while eliminating on-premises backup infrastructure. After the initial full backup of data, Azure Backup transfers only incremental changes in the data, thereby delivering continued savings on both network and storage.

With the exponential growth in critical enterprise data, the initial full backups are reaching terabyte scale. Transferring these large full-backups over the network, especially in high-latency network environments or remote offices, may take weeks or even months. Our customers are looking for more efficient ways beyond fast networks to transfer these large initial backups to Azure. Microsoft Azure Data Box solves the problem of transferring large data sets to Azure by enabling the “offline” transfer of data using secure, portable, and easy-to-get Microsoft appliances.

Announcing the preview of Azure Offline Backup with Azure Data Box


Today, we are thrilled to add the power of Azure Data Box to Azure Backup, and announce the preview program for offline initial backup of large datasets using Azure Data Box! With this preview, customers will be able to use Azure Data Box with Azure Backup to seed large initial backups (up to 80 TB per server) offline to an Azure Recovery Services Vault. Subsequent backups will take place over the network.


This preview is currently available to customers of the Microsoft Azure Recovery Services agent and is a much-awaited addition to the existing support for offline backup using the Azure Import/Export service.

Key benefits


The Azure Data Box addition to Azure Backup delivers core benefits of the Azure Data Box service while offering key advantages over the Azure Import/Export based offline backup.

◉ Simple—No need to procure your own Azure-compatible disks or connectors as with Azure Import-based offline backup. Simply order and receive one or more Data Box appliances from your Azure subscription, plug them in, fill them with backup data, return them to Azure, and track all of it on the Azure portal.

◉ Built-in—The Azure Data Box-based offline backup experience is built into the Recovery Services agent, so you can easily discover and detect your received Azure Data Box appliances, transfer backup data, and track the completion of the initial backup directly from the agent.

◉ Secure—Azure Data Box is a tamper-resistant appliance that comes with ruggedized casing to handle bumps and bruises during transport and supports 256-bit AES encryption of your data.

◉ Efficient—Get freedom from provisioning temporary storage (staging locations) or using additional tools to prepare disks and copy data, as required with Azure Import-based offline backup. Azure Backup copies backup data directly to Azure Data Box, saving storage and time and eliminating additional copy tools.

Getting started


Seeding your large initial backups using Azure Backup and Azure Data Box involves the following high-level steps. 

1. Order and receive your Azure Data Box based on the amount of data you want to back up from a server. Order an Azure Data Box Disk if you want to back up less than 7.2 TB of data; order an Azure Data Box to back up up to 80 TB of data.

2. Install and register the latest Recovery Services agent to an Azure Recovery Services Vault.

3. Select the “Transfer using Microsoft Azure Data Box disks” option for offline backup as part of scheduling your backups with the Recovery Services agent.


4. Trigger Backup to Azure Data Box from the Recovery Services Agent.

5. Return Azure Data Box to Azure.

Azure Data Box and Azure Backup will automatically upload the data to the Azure Recovery Services Vault.

Tuesday, 18 February 2020

SQL Server runs best on Azure. Here’s why

SQL Server customers migrating their databases to the cloud have multiple choices for their cloud destination. To thoroughly assess which cloud is best for SQL Server workloads, two key factors to consider are:

1. Innovations that the cloud provider can uniquely provide.

2. Independent benchmark results.

What innovations can the cloud provider bring to your SQL Server workloads?


As you consider your options for running SQL Server in the cloud, it's important to understand what the cloud provider can offer both today and tomorrow. Can they provide you with the capabilities to maximize the performance of your modern applications? Can they automatically protect you against vulnerabilities and ensure availability for your mission-critical workloads?

SQL Server customers benefit from our continued expertise developed over the past 25 years, delivering performance, security, and innovation. This includes deploying SQL Server on Azure, where we provide customers with innovations that aren’t available anywhere else. One great example of this is Azure BlobCache, which provides fast, free reads for customers. This feature alone provides tremendous value to our customers that is simply unmatched in the market today.

Additionally, we offer preconfigured, built-in security and management capabilities that automate tasks like patching, high availability, and backups. Azure also offers advanced data security that enables both vulnerability assessments and advanced threat protection. Customers benefit from all of these capabilities both when using our Azure Marketplace images and when self-installing SQL Server on Azure virtual machines.

Only Azure offers these innovations.

What are their performance results on independent, industry-standard benchmarks?


Benchmarks can often be useful tools for assessing your cloud options. It's important, though, to ask if those benchmarks were conducted by independent third parties and whether they used today’s industry-standard methods.


In December, an independent study by GigaOm compared SQL Server on Azure Virtual Machines to AWS EC2 using a field test derived from the industry standard TPC-E benchmark. GigaOm found Azure was up to 3.4x faster and 87 percent cheaper than AWS. Today, we are pleased to announce that in GigaOm’s second benchmark analysis, using the latest virtual machine comparisons and disk striping, Azure was up to 3.6x faster and 84 percent cheaper than AWS.

Saturday, 15 February 2020

Announcing the preview of Azure Shared Disks for clustered applications

Today, we are announcing the limited preview of Azure Shared Disks, the industry’s first shared cloud block storage. Azure Shared Disks enables the next wave of block storage workloads migrating to the cloud including the most demanding enterprise applications, currently running on-premises on Storage Area Networks (SANs). These include clustered databases, parallel file systems, persistent containers, and machine learning applications. This unique capability enables customers to run latency-sensitive workloads, without compromising on well-known deployment patterns for fast failover and high availability. This includes applications built for Windows or Linux-based clustered filesystems like Global File System 2 (GFS2).

With Azure Shared Disks, customers now have the flexibility to migrate clustered environments running on Windows Server, including Windows Server 2008 (which has reached End-of-Support), to Azure. This capability is designed to support SQL Server Failover Cluster Instances (FCI), Scale-out File Servers (SoFS), Remote Desktop Servers (RDS), and SAP ASCS/SCS running on Windows Server.

We encourage you to get started and request access by filling out this form.

Leveraging Azure Shared Disks


Azure Shared Disks provides a consistent experience for applications running on clustered environments today. This means that any application that currently leverages SCSI Persistent Reservations (PR) can use this well-known set of commands to register nodes in the cluster to the disk. The application can then choose from a range of supported access modes for one or more nodes to read or write to the disk. These applications can deploy in highly available configurations while also leveraging Azure Disk durability guarantees.

The below diagram illustrates a sample two-node clustered database application orchestrating failover from one node to the other.


The flow is as follows (an OS-level sketch of the reservation commands follows the list):

1. The clustered application running on both Azure VM 1 and Azure VM 2 registers its intent to read or write to the disk.

2. The application instance on Azure VM 1 then takes an exclusive reservation to write to the disk.

3. This reservation is enforced on Azure Disk and the database can now exclusively write to the disk. Any writes from the application instance on Azure VM 2 will not succeed.

4. If the application instance on Azure VM 1 goes down, the instance on Azure VM 2 can now initiate a database failover and take over the disk.

5. This reservation is now enforced on the Azure Disk, and it will no longer accept writes from the application on Azure VM 1. It will now only accept writes from the application on Azure VM 2.

6. The clustered application can complete the database failover and serve requests from Azure VM 2.
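
For readers curious what these persistent reservation steps look like at the OS level, below is a hedged sketch driving the standard Linux sg_persist utility (from the sg3_utils package) from Python. The device path and reservation keys are placeholders, and PR type 1 (Write Exclusive) is just one of the supported access modes.

# Hypothetical sketch of the failover flow above using SCSI PR commands via the
# standard sg_persist tool (sg3_utils). Device path and keys are placeholders.
import subprocess

DISK = "/dev/sdc"  # placeholder: the shared data disk as seen by this VM

def sg_persist(*args):
    subprocess.run(["sg_persist", *args, DISK], check=True)

# Step 1: each node registers its own reservation key with the disk.
sg_persist("--out", "--register", "--param-sark=0x1")   # run on Azure VM 1
# (on Azure VM 2: sg_persist("--out", "--register", "--param-sark=0x2"))

# Step 2: VM 1 takes a Write Exclusive reservation (PR type 1).
sg_persist("--out", "--reserve", "--param-rk=0x1", "--prout-type=1")

# Steps 4-5: on failover, VM 2 preempts VM 1's reservation and takes over writes.
sg_persist("--out", "--preempt", "--param-rk=0x2", "--param-sark=0x1",
           "--prout-type=1")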

The diagram below illustrates another common workload, in which multiple nodes read data from the disk to run parallel jobs, for example, training machine learning models.


The flow is as follows:

1. The application registers all virtual machines to the disk.

2. The application instance on Azure VM 1 then takes an exclusive reservation to write to the disk while allowing reads from the other virtual machines.

3. This reservation is enforced on Azure Disk.

4. All nodes in the cluster can now read from the disk. Only one node writes results back to the disk on behalf of all the nodes in the cluster.

Disk types, sizes, and pricing


Azure Shared Disks is available on Premium SSDs and supports disk sizes P15 (256 GB) and greater. Support for Azure Ultra Disk will be available soon. Azure Shared Disks can be enabled as data disks only (not OS disks). Each additional mount of an Azure Shared Disk (Premium SSD) is charged based on disk size. A sketch of creating such a disk with the Python SDK follows.
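
As a hedged sketch of enabling this from the Azure side, a shared premium disk can be created with the Python management SDK by setting max_shares. The resource names and subscription ID below are placeholders, and this assumes a recent azure-mgmt-compute together with azure-identity.

# Hypothetical sketch: creating a premium shared disk (P15, two mounts) with
# azure-mgmt-compute. Resource group, disk name, and subscription are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")
poller = compute.disks.begin_create_or_update(
    "my-resource-group",
    "shared-data-disk",
    {
        "location": "eastus",
        "sku": {"name": "Premium_LRS"},              # Premium SSD
        "disk_size_gb": 256,                         # P15 or larger for sharing
        "creation_data": {"create_option": "Empty"},
        "max_shares": 2,                             # allow two simultaneous mounts
    },
)
disk = poller.result()
print(disk.id)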

Azure Shared Disks vs Azure Files


Azure Shared Disks provides shared access to block storage which can be leveraged by multiple virtual machines. You will need a cluster manager, such as Windows Server Failover Cluster (WSFC) on Windows or Pacemaker/Corosync on Linux, for node-to-node communication and to enable write locking. If you are looking for a fully managed file service on Azure that can be accessed using the Server Message Block (SMB) or Network File System (NFS) protocol, check out Azure Premium Files or Azure NetApp Files.

Thursday, 13 February 2020

Microsoft Connected Vehicle Platform: trends and investment areas


The past year has been eventful for a lot of reasons. At Microsoft, we’ve expanded our partnerships, including Volkswagen, LG Electronics, Faurecia, TomTom, and more, and taken the wraps off new thinking such as at CES, where we recently demonstrated our approach to in-vehicle compute and software architecture.

Looking ahead, areas that were once nominally related now come into sharper focus as the supporting technologies are deployed and the various industry verticals mature. The welcoming of a new year is a good time to pause and take in what is happening in our industry and in related ones, with an aim of developing a view on where it's all heading.

In this blog, we will talk about the trends that we see in connected vehicles and smart cities and describe how we see ourselves fitting in and contributing.

Trends


Mobility as a Service (MaaS)

MaaS (sometimes referred to as Transportation as a Service, or TaaS) is about people getting to goods and services and getting those goods and services to people. Ride-hailing and ride-sharing come to mind, but so do many other forms of MaaS offerings such as air taxis, autonomous drone fleets, and last-mile delivery services. We inherently believe that completing a single trip—of a person or goods—will soon require a combination of passenger-owned vehicles, ride-sharing, ride-hailing, autonomous taxis, bicycle-and scooter-sharing services transporting people on land, sea, and in the air (what we refer to as “multi-modal routing”). Service offerings that link these different modes of transportation will be key to making this natural for users.

With Ford, we are exploring how quantum algorithms can help improve urban traffic congestion and develop a more balanced routing system. We've also built strong partnerships with TomTom for traffic-based routing, as well as with AccuWeather for current and forecast weather reports to increase awareness of weather events along the route. In 2020, we will be integrating these routing methods together and making them available as part of the Azure Maps service and API (a sketch of calling the route API follows). Because mobility involves experiences throughout the day across various modes of transportation (finding pickup locations, planning trips from home and work, and doing errands along the way), Azure Maps ties the mobility journey together with cloud APIs and iOS and Android SDKs to deliver in-app mobility and mapping experiences. Coupled with the connected vehicle architecture's integration with federated user authentication, integration with the Microsoft Graph, and secure provisioning of vehicles, digital assistants can support mobility end-to-end. The same technologies can be used in moving goods and retail delivery systems.
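
As a small, hedged sketch of what consuming that routing service can look like today, the snippet below calls the Azure Maps Route Directions REST API with the requests library; the coordinates and subscription key are placeholders.

# Hypothetical sketch: requesting a traffic-aware car route from the Azure Maps
# Route Directions REST API. Coordinates and subscription key are placeholders.
import requests

resp = requests.get(
    "https://atlas.microsoft.com/route/directions/json",
    params={
        "api-version": "1.0",
        "query": "47.6062,-122.3321:47.6205,-122.3493",  # origin:destination
        "travelMode": "car",
        "traffic": "true",  # factor live traffic into the travel time
        "subscription-key": "<azure-maps-key>",
    },
    timeout=10,
)
resp.raise_for_status()
summary = resp.json()["routes"][0]["summary"]
print(summary["travelTimeInSeconds"], summary["lengthInMeters"])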

The pressure to become profitable will force changes and consolidation among the MaaS providers and will keep their focus on approaches to reducing costs such as through autonomous driving. Incumbent original equipment manufacturers (OEMs) are expanding their businesses to include elements of car-sharing to continue evolving their businesses as private car ownership is likely to decline over time.

Connecting vehicles to the cloud

We refer holistically to these various signals that can inform vehicle routing (traffic, weather, available modalities, municipal infrastructure, and more) as “navigation intelligence.” Taking advantage of this navigation intelligence will require connected vehicles to become more sophisticated than just logging telematics to the cloud.

The reporting of basic telematics (car-to-cloud) is barely table-stakes; over-the-air updates (OTA, or cloud-to-car) will become key to delivering a market-competitive vehicle, as will command-and-control (more cloud-to-car, via phone apps). Forward-thinking car manufacturers deserve a lot of credit here for showing what’s possible and for creating in consumers the expectation that the appearance of new features in the car after it is purchased isn’t just cool, but normal.

Future steps include the integration of in-vehicle infotainment (IVI) with voice assistants that blend the in- and out-of-vehicle experiences, updating AI models in in-market vehicles for automated driving levels one through five, and of course pre-processing telemetry at the edge to better enable reinforcement learning in the cloud and to improve services generally. A generic telemetry sketch follows.
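
As a generic illustration of the car-to-cloud leg (MCVP's actual vehicle stack is not shown here), a vehicle gateway could publish telematics to Azure IoT Hub with the azure-iot-device package; the connection string and payload fields below are placeholders.

# Hypothetical sketch: a vehicle gateway sending telematics to Azure IoT Hub
# using the azure-iot-device package. Connection string and payload fields are
# placeholders; MCVP's actual vehicle stack is not shown here.
import json
from azure.iot.device import IoTHubDeviceClient, Message

conn_str = "HostName=<hub>.azure-devices.net;DeviceId=<vin>;SharedAccessKey=<key>"
device = IoTHubDeviceClient.create_from_connection_string(conn_str)

telemetry = {"speed_kph": 72.5, "soc_percent": 64, "odometer_km": 18234}
msg = Message(json.dumps(telemetry))
msg.content_type = "application/json"
msg.content_encoding = "utf-8"
device.send_message(msg)  # car-to-cloud: lands in IoT Hub's telemetry pipeline
device.disconnect()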

Delivering value from the cloud to vehicles and phones

As vehicles become more richly connected and deliver experiences that overlap with what we’ve come to expect from our phones, an emerging question is, what is the right way to make these work together? Projecting to the IVI system of the vehicle is one approach, but most agree that vehicles should have a great experience without a phone present.

Separately, phones are a great proxy for “a vehicle” in some contexts, such as bicycle sharing, providing speed, location, and various other probe data, as well as providing connectivity (as well as subsidizing the associated costs) for low-powered electronics on the vehicle.

This is probably a good time to mention 5G. The opportunity 5G brings will have a ripple effect across industries. It will be a critical foundation for the continued rise of smart devices, machines, and things. They can speak, listen, see, feel, and act using sensitive sensor technology as well as data analytics and machine learning algorithms without requiring “always on” connectivity. This is what we call the intelligent edge. Our strategy is to enable 5G at the edge through cloud partnerships, with a focus on security and developer experience.

Optimizations through a system-of-systems approach

Connecting things to the cloud, getting data into the cloud, and then bringing the insights gained through cloud-enabled analytics back to the things is how optimizations in one area can be brought to bear in another area. This is the essence of digital transformation. Vehicles gathering high-resolution imagery for improving HD maps can also inform municipalities about maintenance issues. Accident information coupled with vehicle telemetry data can inform better PHYD (pay how you drive) insurance plans as well as the deployment of first responder infrastructure to reduce incident response time.

As the vehicle fleet electrifies, the demand for charging stations will grow. Today, in-car routing for an electric car is based only on knowledge of existing charging stations along the route, regardless of the current or predicted wait times at those stations. But what if that route could also be informed by historical usage patterns and live usage data of individual charging stations, in order to avoid arriving and having three cars ahead of you? Suddenly, your 20-minute charge time is actually a 60-minute stop, and an alternate route would have made more sense, even if, on paper, it involves more miles driven. The toy example below illustrates the idea.
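
A toy example of that reasoning (an illustration only, not a Microsoft service): choose the charging stop by total expected stop time, not by distance alone.

# Toy illustration: pick a charging stop by total expected stop time, using a
# predicted wait derived from historical and live utilization data.
charging_stops = [
    # (name, detour_minutes, charge_minutes, predicted_wait_minutes)
    ("Station A (on route)", 0, 20, 40),  # three cars ahead: 20-min charge, 60-min stop
    ("Station B (detour)", 12, 20, 0),    # idle right now
]

def total_stop_minutes(stop):
    _, detour, charge, wait = stop
    return detour + charge + wait

best = min(charging_stops, key=total_stop_minutes)
print(f"Best stop: {best[0]}, {total_stop_minutes(best)} minutes total")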

Realizing these kinds of scenarios means tying together knowledge about the electrical grid, traffic patterns, vehicle types, and incident data. The opportunities here for brokering the relationships among these systems are immense, as are the challenges to do so in a way that encourages the interconnection and sharing while maintaining privacy, compliance, and security.

Laws, policies, and ethics

The past several years of data breaches and elections are evidence of the continuously evolving nature of the security threats that we face. That kind of environment requires platforms that continuously invest in security as a fundamental cost of doing business.

Laws, regulatory compliance, and ethics must figure into the design and implementation of our technologies to as great a degree as goals like performance and scalability do. Smart city initiatives, where having visibility into the movement of people, goods, and vehicles is key to doing the kinds of optimizations that increase the quality of life in these cities, will confront these issues head-on.

Routing today is informed by traffic conditions but is still fairly "selfish": routing for "me" rather than for "we." Cities would like a hand in shaping traffic, especially if they can factor in deeper insights such as the types of vehicles on the road (sending freight one way versus passenger traffic another), whether there is an upcoming sporting event or road closure, weather, and so on.

Doing this in a way that is cognizant of local infrastructure and the environment is what smart cities initiatives are all about.

With the Microsoft Connected Vehicle Platform (MCVP) and an ecosystem of partners across the industry, Microsoft offers a consistent horizontal platform on top of which customer-facing solutions can be built. MCVP helps mobility companies accelerate the delivery of digital services across vehicle provisioning, two-way network connectivity, and continuous over-the-air updates of containerized functionality. MCVP provides support for command-and-control, hot/warm/cold path for telematics, and extension hooks for customer/third-party differentiation. Being built on Azure, MCVP then includes the hyperscale, global availability, and regulatory compliance that comes as part of Azure. OEMs and fleet operators leverage MCVP as a way to “move up the stack” and focus on their customers rather than spend resources on non-differentiating infrastructure.

Innovation in the automotive industry


At Microsoft, and within the Azure IoT organization specifically, we have a front-row seat on the transformative work that is being done in many different industries, using sensors to gather data and develop insights that inform better decision-making. We are excited to see these industries on paths that are trending to converging, mutually beneficial paths. Our colleague Sanjay Ravi shares his thoughts from an automotive industry perspective in this great article.

Turning our attention to our customer and partner ecosystem, the traction we’ve gotten across the industry has been overwhelming:

The Volkswagen Automotive Cloud will be one of the largest dedicated clouds of its kind in the automotive industry and will provide all future digital services and mobility offerings across its entire fleet. More than 5 million new Volkswagen-specific brand vehicles are to be fully connected on Microsoft’s Azure cloud and edge platform each year. The Automotive Cloud subsequently will be rolled out on all Group brands and models.

Cerence is working with us to integrate Cerence Drive products with MCVP. This new integration is part of Cerence’s ongoing commitment to delivering a superior user experience in the car through interoperability across voice-powered platforms and operating systems. Automakers developing their connected vehicle solutions on MCVP can now benefit from Cerence’s industry-leading conversational AI, in turn delivering a seamless, connected, voice-powered experience to their drivers.

Ericsson, whose Connected Vehicle Cloud connects more than 4 million vehicles across 180 countries, is integrating their Connected Vehicle Cloud with Microsoft’s Connected Vehicle Platform to accelerate the delivery of safe, comfortable, and personalized connected driving experiences with our cloud, AI, and IoT technologies.

LG Electronics is working with Microsoft to build its automotive infotainment systems, building management systems and other business-to-business collaborations. LG will leverage Microsoft Azure cloud and AI services to accelerate the digital transformation of LG’s B2B business growth engines, as well as Automotive Intelligent Edge, the in-vehicle runtime environment provided as part of MCVP.

Global technology company ZF Friedrichshafen is transforming into a provider of software-driven mobility solutions, leveraging Azure cloud services and developer tools to promote faster development and validation of connected vehicle functions on a global scale.

Faurecia is collaborating with Microsoft to develop services that improve comfort, wellness, and infotainment as well as bring digital continuity from home or the office to the car. At CES, Faurecia demonstrated how its cockpit integration will enable Microsoft Teams video conferencing. Using Microsoft Connected Vehicle Platform, Faurecia also showcased its vision of playing games on the go, using Microsoft’s new Project xCloud streaming game preview.

Bell has revealed AerOS, a digital mobility platform that will give operators a 360° view into their aircraft fleet. By leveraging technologies like artificial intelligence and IoT, AerOS provides powerful capabilities like fleet master scheduling and real-time aircraft monitoring, enhancing Bell’s Mobility-as-a-Service (MaaS) experience. Bell chose Microsoft Azure as the technology platform to manage fleet information, observe aircraft health, and manage the throughput of goods, products, predictive data, and maintenance.

Luxoft is expanding its collaboration with Microsoft to accelerate the delivery of connected vehicle solutions and mobility experiences. By leveraging MCVP, Luxoft will enable and accelerate the delivery of vehicle-centric solutions and services that will allow automakers to deliver unique features such as advanced vehicle diagnostics, remote access and repair, and preventive maintenance. Collecting real usage data will also support vehicle engineering to improve manufacturing quality.

We are incredibly excited to be a part of the connected vehicle space. With MCVP, our ecosystem partners and our partnerships with leading automotive players, both vehicle OEMs and automotive technology suppliers, we believe we have a uniquely capable offering enabling at global scale the next wave of innovation in the automotive industry as well as related verticals such as smart cities, smart infrastructure, insurance, transportation, and beyond.

Tuesday, 11 February 2020

Azure IoT improves pharmaceutical sample management and medication adherence

For the recent IoT Signals report, commissioned by our Azure IoT team and conducted by Hypothesis Group, more than 3,000 decision makers currently involved in IoT at enterprise companies across the US, UK, Germany, France, China, and Japan participated in a 20-minute online survey. Healthcare was one of the industries included in the research. Of the healthcare executives surveyed, 82 percent said they have at least one IoT project in the learning, proof of concept, purchase, or use phase, with many reporting one or more projects currently in use. The top use cases cited by the healthcare executives included:

◉ Tracking patients, staff, and inventory.

◉ Remote device monitoring and service.

◉ Remote health monitoring and assistance.

◉ Safety, security, and compliance.

◉ Facilities management.

Today we want to shed light on how two innovative companies are building upon this momentum and their own research to build IoT-enabled solutions with Azure IoT technologies that support medication management and adherence. These solutions address the safety, security, compliance, and inventory use cases highlighted in the report.

The Cost of Pharmaceutical Samples


According to a January 2019 article published by JAMA, Medical Marketing in the United States, 1997-2016, “Marketing to health care professionals by pharmaceutical companies accounted for [the] most promotional spending and increased from $15.6 billion to $20.3 billion, including $5.6 billion for prescriber detailing, $13.5 billion for free samples.”

Improving sample management


With billions of dollars on the line, one of our partners has developed an innovative way to ensure that pharmaceutical companies manage their samples cost-effectively. Using their own knowledge of the pharmaceutical industry and in-depth research, P360 (formerly Prescriber360) developed Swittons to bridge the gap between pharmaceutical companies and physicians. Designed as a "virtual pharmaceutical representative," this IoT-enabled device offers real-time, secure communications between the physician and the pharmaceutical company. With this single device, physicians can order a sample, request a visit from a medical science liaison (MSL) or sales rep, or connect with the pharmaceutical company's inside sales rep (as shown in the graphic below).

Designed to be branded with each pharmaceutical company’s product, the device is a physician engagement tool that enables pharmaceutical companies to customize and manage a sales channel that remains fully authentic to their brand experience. Furthermore, it provides an audit trail to manage samples more economically, enabling pharmaceutical companies to penetrate market whitespace and extend efficient sampling in areas that were previously unreachable.

Built on our Azure IoT platform, Swittons takes advantage of the latest in cloud, security, telecommunications, and analytics technology. “We strategically selected Azure IoT as the foundation for our Swittons ‘Virtual Rep.’ Microsoft’s vision, investments and the breadth of Azure cloud were the key criteria for selection. Having a reliable IoT platform along with world-class data and security infrastructure in Azure made the choice very easy,” commented Anupam Nandwana, CEO, P360, parent company of Swittons.

On the other end of the pharmaceutical supply chain is another scenario that dramatically affects the efficacy of pharmaceutical products—medication adherence.

Ensuring medication adherence


In the US today, 25 to 50 percent of all adults fail to take their prescribed medication on time, contributing to poor health outcomes, over-utilization of healthcare services and significant cost increases.

The causes of low levels of medication adherence are multi-faceted and include factors like carelessness, fear, supply, cost, and lack of understanding or information, with forgetfulness as the primary cause.

Furthermore, as cited in an editorial from BMJ Quality and Safety, "medication adherence thus constitutes one of the 'big hairy problems' or 'big hairy audacious goals' of healthcare. As well as affecting patients' long-term outcomes, non-adherence can increase healthcare costs through consumption of medicines below the threshold of adherence required for clinical benefit, as well as contributing to healthcare resource use such as hospital admissions."

In response, the global market for medication adherence (hardware-based automation and adherence systems and software-based applications) was worth nearly $1.7 billion in 2016. The market is expected to reach more than $3.9 billion by 2021, increasing at a CAGR of 18.0 percent from 2016 through 2021 (see the quick check below). This steep increase is fueled by burgeoning demand for advanced medication adherence systems and a growing number of people worldwide with chronic diseases.
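
A quick arithmetic check of those figures:

# Quick check: $1.7B growing at an 18% CAGR for five years (2016 -> 2021)
# lands close to the cited $3.9B projection.
market_2016_bn = 1.7
market_2021_bn = market_2016_bn * (1 + 0.18) ** 5
print(f"{market_2021_bn:.2f}")  # ~3.89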

Personal experience leads to action


Emanuele Musini knows all too well the implications of not taking medications properly. In fact, it was the pain of losing his father in 2005 from a chronic condition and a lack of adherence to the prescribed medication regimen that became the catalyst for Emanuele to start studying the issue in-depth, searching for a solution. In 2015, Emanuele, along with his multidisciplinary team of doctors, entrepreneurs, engineers, and user-experience professionals, created Pillo Health, a health platform centered around a robot and digital assistant designed to prevent other family members from enduring what Emanuele and his family experienced. Since their founding, they've partnered with leading manufacturers, such as Stanley Black & Decker, to bring in-home medication management solutions to market with solutions like Pria, a winner of the 2019 CES Innovation Awards.

The Pillo Health team built their medication adherence solution on Microsoft Azure cloud services, using Azure Cognitive Services for voice technology and facial recognition, and services from the Azure IoT platform, including IoT Hub. The result is a voice-first, personalized, cloud-enabled medication assistant that can help people maintain their medication regimen through social connectivity and delivery of important medical information at home. In a four-week study conducted with AARP in 2018 for diabetic patients who were prescribed Metformin, Pillo delivered an average medication adherence rate of more than 87 percent, a meaningful 20 to 30 percent improvement over conventionally reported standards.

Antonello Scalmato, Director of Cloud Services at Pillo Health noted, “We selected Microsoft Azure because it provided the best infrastructure for PaaS applications, allowed us to speed up the development of our complex product and avoided the overhead of machine and security management for traditional web API infrastructure. Moreover, IoT Hub provides a channel for secure communications and notifications to our users, and also enables simple device management that protects our product, from the factory into the users' homes.”
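
As a generic illustration of the kind of cloud-to-device path a reminder could take on Azure (Pillo's actual API is not shown here), the sketch below sends a cloud-to-device message through IoT Hub with the azure-iot-hub package; the connection string, device ID, and payload are placeholders.

# Hypothetical sketch: a backend sending a medication reminder as an IoT Hub
# cloud-to-device message. Connection string, device ID, and payload are
# placeholders, not Pillo Health's actual API.
from azure.iot.hub import IoTHubRegistryManager

conn_str = ("HostName=<hub>.azure-devices.net;"
            "SharedAccessKeyName=service;SharedAccessKey=<key>")
registry = IoTHubRegistryManager(conn_str)

registry.send_c2d_message(
    "pillo-device-001",  # placeholder device ID
    '{"reminder": "Take 500 mg Metformin with food"}',
    properties={"contentType": "application/json"},
)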


Monday, 10 February 2020

Exactly Those Tips That Will Help You Prepare for Microsoft 70-467 Certification Exam

About Microsoft 70-467 Certification

Preparing for the 70-467 exam, you will learn how to design, implement, and maintain a business intelligence (BI) platform with Microsoft SQL Server 2014. You will learn to create data warehouse and BI solutions, including ETL, BI delivery, Reporting Services, Excel reporting, and SharePoint Server BI, along with how to monitor and optimize a BI solution.

You will also learn big data concepts and analysis using Microsoft Azure HDInsight, plus how to design, implement, and process big data solutions.

Microsoft 70-467 Exam Details

  • Name: Designing Business Intelligence Solutions with Microsoft SQL Server
  • Code: 70-467
  • Duration: 150 minutes
  • Questions: 40-60
  • Languages: English, Chinese (Simplified), French, German, Japanese, Portuguese (Brazil)
  • Passing Score: 700/1000
  • Exam Price: 165 USD
  • Technology: Microsoft SQL Server
  • Credit Toward Certification: MCSE

Microsoft 70-467 Exam Syllabus Topics

  • Plan business intelligence (BI) infrastructure (15-20%)
  • Design BI infrastructure (15-20%)
  • Design a reporting solution (20-25%)
  • Design BI data models (30-35%)
  • Design an ETL solution (10-15%)

Who Should Take the 70-467 Exam?

The 70-467 exam is intended for business intelligence (BI) architects, who are responsible for the overall design of a BI infrastructure and how it relates to other data systems in use.

Who Can Go for Microsoft SQL Server Certifications?

Anyone who has fundamental knowledge of SQL and databases can go for these certifications, mainly:
  • Business Intelligence Professionals
  • Software Developers and IT Professionals
  • SQL and Database Administrators
  • Business Analysts and Managers
  • Project Managers
  • Big Data and Hadoop Professionals
  • Those aiming for a career in SQL development
The 70-467 exam uses Microsoft SQL Server 2014.

This 70-467 exam includes material from the Official Microsoft Learning Product 20467: Designing Business Intelligence Solutions with Microsoft SQL Server 2014. This program maps to the skills and knowledge measured by Microsoft exam 70-467 and, in combination with on-the-job experience, can help prepare you for the exam.

Prerequisites for Microsoft 70-467 Certification

  • Basic knowledge of dimensional modeling (star schema) for data warehouses.
  • The ability to create Integration Services packages that include control flows and data flows.
  • The ability to create a basic multidimensional cube with Analysis Services.
  • The ability to create a basic tabular model with PowerPivot and Analysis Services.
  • The ability to create Reporting Services reports with Report Designer.
  • The ability to implement authentication and authorization in the SQL Server database engine, Analysis Services, and Reporting Services.
  • Familiarity with SharePoint Server and Microsoft Office applications, particularly Excel.

Where to Go from Here?

After you pass the Designing Business Intelligence Solutions with Microsoft SQL Server (70-467) exam, you may want to take the following exams:
  • Querying Microsoft SQL Server 2012/2014 (70-461)
  • Administering Microsoft SQL Server 2012/2014 Databases (70-462)
  • Implementing a Data Warehouse with Microsoft SQL Server 2012/2014 (70-463)
  • Developing Microsoft SQL Server Databases (70-464)
  • Designing Database Solutions for Microsoft SQL Server (70-465)
  • Designing Business Intelligence Solutions with Microsoft SQL Server (70-466)

Tips Should You Follow to Preparation for Microsoft 70-467 Certification Exam

There are so many sites and so many hints out there on how to pass the Microsoft 70-467 exam. However, you must assess the reliability of the information and its source. One reliable website you can trust is AnalyticsExam.
Here are some helpful tips for passing the 70-467 certification test:
  • First of all, do not forget to register for the exam.
  • Make sure that you clearly understand the objectives and expectations of Microsoft 70-467.
  • Devise your study plan and the approach you are going to take to prepare for the exam.
  • Create a practical plan and stick to it during the preparation period to increase your chances of acing Microsoft 70-467 on the first attempt.
  • Use the practice tests from AnalyticsExam as your primary way of studying.
  • Join active study groups and online forums. This will help you surround yourself with people who are preparing for the same test as you.
  • Complete your preparation a few days before the real exam so that you have ample time to revise what you have studied and to rest before taking the certification test.
  • Get plenty of rest the night before the exam day. Go to bed early and sleep for at least 8 hours so you wake up fresh and energetic. You will find it easier to recall things when you are fresh.

Summary

After all this hard work, you will be getting practice exam scores of 80 percent and above. At that point, you are ready to book the exam! You can take the 70-467 exam at a test center, but Microsoft has recently set up a system where you can take the exam at home!

It is a bit of a bother setting everything up and meeting all of the requirements, but it can save you some time and money by not having to travel.
Read: Revealed: The Best Tips for Passing Microsoft Certification Exams

Good luck with your 70-467 exam, and remember success is earned, not given away. Put in the effort, and you will pass.