Wednesday, 31 January 2018

Using the MySQL sys schema to optimize and maintain a database

The MySQL sys schema, which is fully enabled in Azure Database for MySQL 5.7, provides a powerful collection of user-friendly views in a read-only database. Building on the MySQL Performance and Information Schemas, you can use the MySQL sys schema to troubleshoot performance issues and manage resources efficiently.

The MySQL Performance Schema, first available in MySQL 5.5, provides instrumentation for many vital server resources such as memory allocation, stored programs, metadata locking, etc. However, the Performance Schema contains more than 80 tables, and getting the necessary information often requires joining tables within the Performance Schema, as well as tables from the Information Schema. Let’s look more closely at how to use the MySQL sys schema.

There are 52 views in the sys schema, and each view is prefixed by one of the following:

◉ Host_summary or IO: I/O-related latencies.
◉ Innodb: InnoDB buffer status and locks.
◉ Memory: Memory usage by the host and users.
◉ Schema: Schema related information, such as auto increment, indexes, etc.
◉ Statement: Information on SQL statements; this can be statements that resulted in a full table scan or long query time.
◉ User: Resources consumed and grouped by users. Examples are file I/Os, connections, and memory.
◉ Wait: Wait events, grouped by host or user.
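If you want to browse these views yourself, a quick way (shown here as a sketch) is to list them from the Information Schema:

```sql
-- List every view exposed by the sys schema
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'sys'
  AND table_type = 'VIEW'
ORDER BY table_name;
```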

Performance tuning


◉ I/O is the most expensive operation in the database. We can find the average I/O latency by querying the sys.user_summary_by_file_io view. With the default 125GB of provisioned storage, my I/O latency is about 15 seconds.
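As a sketch, the per-user file I/O summary can be pulled with a query like the following (columns as defined by the MySQL 5.7 sys schema):

```sql
-- Total file I/Os and cumulative I/O latency, grouped by user
SELECT user, ios, io_latency
FROM sys.user_summary_by_file_io
ORDER BY io_latency DESC;
```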

Because Azure Database for MySQL scales IO with respect to storage, after I increase my provisioned storage to 1TB, my IO latency reduces to 571ms, representing a 26X performance increase!

◉ Despite careful planning, many queries still result in full table scans. Full table scans are resource intensive and degrade your database performance. The quickest way to find tables with full table scans is to query the sys.schema_tables_with_full_table_scans view.
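For example, a query along these lines surfaces the worst offenders first:

```sql
-- Tables read via full scans, ordered by rows scanned
SELECT object_schema, object_name, rows_full_scanned, latency
FROM sys.schema_tables_with_full_table_scans
ORDER BY rows_full_scanned DESC;
```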

◉ To troubleshoot database performance issues, it may be beneficial to identify the events happening inside your database, and the sys.user_summary_by_statement_type view may just do the trick.
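A sketch of such a query, ordered so the most time-consuming statement types appear first:

```sql
-- Statement activity per user: counts, total latency, and full scans
SELECT user, statement, total, total_latency, full_scans
FROM sys.user_summary_by_statement_type
ORDER BY total_latency DESC;
```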

In this example, Azure Database for MySQL spent 53 minutes flushing the slow query log 44,579 times. That’s a long time and a lot of I/Os. You can reduce this activity by either disabling your slow query log or decreasing its frequency in the Azure portal.

Database maintenance


◉ The InnoDB buffer pool resides in memory and is the main cache mechanism between the DBMS and storage. The size of the InnoDB buffer pool is tied to the performance tier and cannot be changed unless a different product SKU is chosen. As with memory in your operating system, old pages are swapped out to make room for fresher data. To find out which tables consume most of the InnoDB buffer pool memory, you can query the sys.innodb_buffer_stats_by_table view.
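For example (a sketch using the columns exposed by this view):

```sql
-- InnoDB buffer pool usage by table: allocated bytes and cached pages
SELECT object_schema, object_name, allocated, data, pages
FROM sys.innodb_buffer_stats_by_table
ORDER BY allocated DESC;
```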

In my case, other than system tables and views, each table in the mysqldatabase033 database, which hosts one of my WordPress sites, occupies 16KB, or one page, of data in memory.

◉ Indexes are great tools to improve read performance, but they incur additional costs for inserts and storage. The sys.schema_unused_indexes and sys.schema_redundant_indexes views provide insight into unused and duplicate indexes.
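A sketch of both queries; note that sys.schema_redundant_indexes even includes a ready-made DROP statement in its sql_drop_index column:

```sql
-- Indexes that have not been used since the server last started
SELECT * FROM sys.schema_unused_indexes;

-- Redundant indexes, the dominant index that covers them,
-- and a generated statement to drop the redundant one
SELECT table_schema, table_name, redundant_index_name,
       dominant_index_name, sql_drop_index
FROM sys.schema_redundant_indexes;
```

Before dropping an "unused" index, keep in mind that these statistics reset when the server restarts, so make sure the server has been up long enough to cover your full workload.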

In summary, the sys schema is a great tool for both performance tuning and database maintenance. Make sure to take advantage of this feature in your Azure Database for MySQL.

Saturday, 27 January 2018

How Azure Security Center uses machine learning to enable adaptive application control

While the threat landscape has changed dramatically over the last several years, malware detection continues to be one of the biggest issues. There is an endless race - attackers develop new malware, security vendors create new signatures to detect it, a new variant of the malware is created to avoid detection, and the cycle continues. Malware is not the only type of application that can expose a server to risk. Unauthorized software can introduce vulnerabilities that are exploited by attackers. Most organizations lack the necessary application tracking and controls, leaving them blind to these risks.

Application controls, such as whitelisting, can help limit exposure to malicious and vulnerable applications. Instead of trying to keep pace with rapidly evolving malware and new exploits, application whitelisting simply blocks all but known good applications. For purpose-built servers that typically run a fixed set of applications, whitelisting can offer significant added protection. You can use application controls to:

◈ Block new and unknown malware.
◈ Comply with your organization's security policy that dictates the use of only licensed software.
◈ Avoid old and unsupported applications.
◈ Prevent specific software tools that are not allowed in your organization.

While the concept of application whitelisting has existed for some time now, it was not widely used. This is due to the complexity of creating and applying accurate whitelisting policies per server or group of servers, and managing these policies at scale in large environments.

Azure Security Center recently released adaptive application controls, which use an innovative approach to application whitelisting, enabling you to realize the security benefits without the management overhead. Machine learning is used to analyze the behavior of your Azure VMs, create a baseline of applications, group the VMs, decide if they are good candidates for application whitelisting, and recommend and automatically apply the appropriate whitelisting rules. In addition, Security Center surfaces applications that can be exploited to bypass an application whitelisting solution, and provides full management and monitoring capabilities, through which you can change an existing whitelist (e.g., add or remove applications from the coverage) and be alerted on violations of the whitelists.

Adaptive application controls are currently available for Windows machines running in Azure (all versions, classic or Azure Resource Manager). To get started, open Security Center and select the application whitelisting tile as shown below.

Select a resource group to view and apply the recommended application whitelisting rules.

New rules are always set to audit mode, which alerts you to applications that violate the rules; you can edit the policy to block these applications by switching to enforce mode. You can also edit the details of the rules at any time.

Adaptive application controls are available to Azure Security Center Standard customers as a limited public preview. 

Friday, 26 January 2018

Azure Data Factory: Visual Tools enabled in public preview

Our goal with visual tools for ADF v2 is to increase productivity and efficiency for both new and advanced users with intuitive experiences. You can get started by clicking the Author & Monitor tile in your provisioned v2 data factory blade.

Get Started Quickly


1. Create your first ADF v2 pipeline

2. Quickly Copy Data from a bunch of data sources using the copy wizard

3. Configure SSIS IR to lift and shift SSIS packages to Azure

4. Set up a code repo (VSTS GIT) for source control, collaboration, versioning, etc.

Visual Authoring


Author Control Flow Pipelines

Create pipelines, drag and drop activities, connect them on-success, on-failure, on-completion.
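Behind the canvas, each pipeline is ADF v2 pipeline JSON; a minimal hand-written sketch of two activities chained on-failure looks roughly like this (the activity names, dataset references, and URL are hypothetical):

```json
{
  "name": "SamplePipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlob",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SinkDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      },
      {
        "name": "NotifyOnFailure",
        "type": "WebActivity",
        "dependsOn": [
          { "activity": "CopyFromBlob", "dependencyConditions": [ "Failed" ] }
        ],
        "typeProperties": { "method": "POST", "url": "https://example.com/alert" }
      }
    ]
  }
}
```

The on-success, on-failure, and on-completion connectors on the canvas map to the Succeeded, Failed, and Completed dependency conditions.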

Create Azure & Self Hosted Integration runtimes

Create a self hosted integration runtime for hybrid data movement or an Azure-SSIS IR for lifting and shifting SSIS packages to Azure. Create linked service connections to your data stores or compute.

Support for all control flow activities running on Azure computes

Control Flow Activities:

◉ HDInsight Hive, HDInsight Pig, HDInsight Map Reduce, HDI Streaming, HDI Spark, U-SQL, Stored Procedure, Web, For Each, Get Metadata, Look up, Execute Pipeline

Support for Azure Computes:

◉ HDI (on-demand, BYOC), ADLA, Azure Batch

Iterative development and debugging

Do Test Runs before attaching a trigger on the pipeline and running on-demand or on a schedule.

Parameterize pipelines and datasets

Parameterize using expressions, system variables.
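As a sketch, a dataset property can mix a pipeline parameter with system variables through the expression language (the parameter name here is hypothetical):

```json
{
  "folderPath": "@pipeline().parameters.outputFolder",
  "fileName": "@concat('run-', pipeline().RunId, '.csv')"
}
```

At runtime, @pipeline().parameters.outputFolder resolves to the value passed in for that run, and pipeline().RunId to the unique ID of the run.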

Rich Validation Support

You can now validate your pipelines to catch missing or incorrect property configurations. Simply click the Validate button in the pipeline canvas. This generates the validation output in a side drawer; you can then click each entry to go straight to the location of the issue.

Trigger pipelines

Trigger on-demand, run pipelines on schedule.

Use VSTS GIT

VSTS GIT for source control, collaboration, versioning, etc.

Copy Data

Data Stores (65)

Support for 65 data stores. 18 stores have first-class support, requiring users to provide just configuration values. The remaining 47 stores can be used with JSON.

18 stores with first class support:

◉ Azure Blob, Azure CosmosDB, Azure Database for MySQL, Azure Data Lake Store, Amazon Redshift, Amazon S3, Azure SQL DW, Azure SQL, Azure Table, File System, HDFS, MySQL, ODBC, Oracle, Salesforce, SAP HANA, SAP BW, SQL Server

47 Stores with JSON support:

◉ Search Index, Cassandra, HTTP file, Mongo DB, OData, Relational table, Dynamics 365, Dynamics CRM, Web table, AWS Marketplace, PostgreSQL, Concur, Couchbase, Drill, Oracle Eloqua, Google Big Query, Greenplum, HBase, Hive, HubSpot, Apache Impala, Jira, Magento, MariaDB, Marketo, PayPal, Phoenix, Presto, QuickBooks, ServiceNow, Shopify, Spark, Square, Xero, Zoho, DB2, FTP, GE Historian, Informix, Microsoft Access, MongoDB, SAP Cloud for customer

Use copy wizard to quickly copy data from a bunch of data sources

The familiar ADF v1 copy wizard is now available in ADF v2 for quick, one-time imports. The Copy Wizard generates pipelines with copy activities on the authoring canvas. These copy activities can then be extended with other activities (Spark, U-SQL, stored procedures, etc.) on-success, on-failure, and so on, to build out the entire control flow pipeline.

Guided tour

Click on the Information Icon in the lower left. You can then click Guided tour to get step by step instructions on how to visually monitor your pipeline and activity runs.

Feedback

Click on the Feedback icon to give us feedback on various features or any issues that you may be facing.

Select data factory

Hover over the Data Factory icon in the top left. Click the Arrow icon to see a list of Azure subscriptions and data factories that you can monitor.

Visual Monitoring


List View Monitoring

Monitor pipeline, activity & trigger runs with a simple list view interface. All the runs are displayed in local browser time zone. You can change the time zone and all the date time fields will snap to the selected time zone.

Monitor Pipeline Runs:

List view showcasing each pipeline run for your data factory v2 pipelines.

Monitor Activity Runs:

List view showcasing activity runs corresponding to each pipeline run. Click the Activity Runs icon under the Actions column to view the activity runs for each pipeline run.

Important note: You need to click the Refresh icon on top to refresh the list of pipeline and activity runs. Auto-refresh is currently not supported.

Monitor Trigger Runs:

Rich ordering and filtering

Order pipeline runs ascending or descending by Run Start, and filter pipeline runs by pipeline name, run start, and run status.

Add/Remove columns to list view

Right click the list view header and choose columns that you want to appear in the list view.

Adjust column widths in list view

Increase and decrease the column widths in list view by simply hovering over the column header.

Monitor Integration Runtimes

Monitor health of your Self Hosted, Azure, Azure-SSIS Integration runtimes.

Cancel/Re-run your pipeline runs

Cancel a pipeline run or re-run a pipeline run with already defined parameters.

This is the first public release of ADF v2 visual tools. We are continuously working to refresh the released bits with new features based on customer feedback. Get more information and detailed steps for using the ADF v2 visual tools.

Wednesday, 24 January 2018

Keeping your environment secure with Update Management

The Azure Update Management service is included as part of an Azure subscription. Update Management allows you to manage updates and patches for your machines. With Update Management, you can quickly assess the status of available updates, schedule installation of required updates, and review deployment results to verify that updates applied successfully. This is possible whether your machines are Azure VMs, hosted by other cloud providers, or on-premises.

To use Update Management, you will need to take care of a few prerequisites. If you already have an Azure VM, this process is simple:

Navigate to your VM and choose Update management from the left-hand menu.

Click the banner that reads "The Update management solution is not enabled on this virtual machine" to learn more and enable the solution.

On the next screen, click the Enable button. This creates a Log Analytics workspace and an Automation account using default values. If you have an existing workspace or Automation account, you can choose those as well.

Once this is completed, you will see the Update Management view. Although it will take some time for data to populate, this view gives you information about a single VM. There is also a multi-machine view, which you can access by clicking Manage multiple computers.

You can easily add more machines from this view by selecting either Add Azure VM or Add Non-Azure Computer.

Get visibility into your Update Compliance with Update Management


By enrolling machines in Update Management, you have access to dashboards reporting on the state of your machines. This is possible whether your machines are Azure VMs, AWS VMs, hosted by other cloud providers, or on-premises.

Deploy Security Updates

To deploy patches to machines, select Schedule update deployment from the multi-machine view.

This shows a new blade.

In this blade, you can select computers which should receive updates. If you wish, you can filter Update classifications to only apply security updates. The update run can be scheduled to run once or on a recurring basis. The maintenance window defines how long the update process can run on the machine.

Sunday, 21 January 2018

Creating your first data model in Azure Analysis Services

Azure Analysis Services is a new preview service in Microsoft Azure where you can host semantic data models. Users in your organization can then connect to your data models using tools like Excel, Power BI and many others to create reports and perform ad-hoc data analysis.

To understand the value of Azure Analysis Services, imagine a scenario where you have data stored in a large database. You want to make that data available to your business users or customers so they can do their own analysis and build their own reports. To do this, one option would be to give those users access to that database. Of course, this option has several drawbacks. The design of that database, including the names of tables and columns may not be easy for a user to understand. They would need to know which tables to query, how those tables should be joined, and other business logic that needs to be applied to get the correct results. They would also need to know a query language like SQL to even get started. Most often this will lead to multiple users reporting the same metrics but with different results.

With Azure Analysis Services, you can encapsulate all the information needed into a semantic model which can be more easily queried by those users in an easy drag-and-drop experience. And you can ensure that all users will see a single version of the truth. The metadata included in the semantic model covers relationships between tables, friendly table and column names, descriptions, display folders, calculations, and row-level security.

Once your data is properly modeled for your users to consume, Azure Analysis Services offers additional features to enhance their querying experience. The biggest is the option to put the data in an in-memory columnar cache, which can accelerate queries to sub-second performance. This not only improves the query experience, but also reduces the query load on your underlying database, because queries are answered from the cache.

Create an Analysis Services server in Azure


1. Go to http://portal.azure.com.

2. In the Menu blade, click New.

3. Expand Intelligence + Analytics, and then click Analysis Services.

4. In the Analysis Services blade, enter the following and then click Create:

◈ Server name: Type a unique name.
◈ Subscription: Select your subscription.
◈ Resource group: Select Create new, and then type a name for your new resource group.
◈ Location: This is the Azure datacenter location that hosts the server. Choose a location nearest you.
◈ Pricing tier: For our simple model, select D1. This is the smallest tier and great for getting started. The larger tiers are differentiated by how much cache and query processing units they have. Cache indicates how much data can be loaded into the cache after it has been compressed. Query processing units, or QPUs, are a sign of how many queries can be supported concurrently. Higher QPUs may mean better performance and allow for a higher concurrency of users.

Now that you’ve created a server, you can build your first model. In the next steps, you’ll use SQL Server Data Tools (SSDT) to create a data model and deploy it to your new server in Azure.

Create a sample data source

Before you can create a data model with SSDT, you’ll need a data source to connect to. Azure Analysis Services supports connecting to many different types of data sources both on-premises and in the cloud. For this post, we’ll use the Adventure Works sample database.

1. In Azure portal, in the Menu blade, click New.

2. Expand Databases, and then click SQL Database.

3. In the SQL Database blade, enter the following and then click Create:

◈ Database name: Type a unique name.
◈ Subscription: Select your subscription.
◈ Resource group: Select the same resource group you created for your Analysis Services server.
◈ Select source: Select Sample (Adventure Works LT).
◈ Server: Choose a location nearest you.
◈ Pricing tier: For your sample database, select B.
◈ Collation: Leave the default, SQL_Latin1_General_CP1_CI_AS.

Now that you’ve created a sample data source, you’ll have some data to connect to when you build your data model. In the next steps, you’ll use SQL Server Data Tools (SSDT) to connect to your new data source, create a data model, and deploy it to your new server in Azure.

Create a data model

To create Analysis Services data models, you’ll use Visual Studio and an extension called SQL Server Data Tools (SSDT).

1. In SSDT, create a new Analysis Services Tabular Project.

If asked to select a workspace type, select Integrated.

2. Click the Import From Data Source icon on the toolbar at the top of the screen.

3. Select Microsoft SQL Azure as your data source type and click Next.

4. Fill in the connection information for the sample SQL Azure database created earlier and click Next.

◈ Server Name: Name of SQL Azure server to connect to.
◈ User Name: Name of the user which will be used to login to the server.
◈ Password: Password for the account.
◈ Database Name: Name of the SQL database to connect to.

Note: If using SQL Azure, ensure that you have allowed your IP address access through the firewall. Also, ensure that “Allow access to Azure Services” is set to “on” for the firewall.

5. Select Service Account for the impersonation mode and click Next.

6. Select the tables you wish to import into cache and click Finish:

◈ At this step, you can optionally provide a friendly name for each table. For large tables, which may not fit into cache, you can also specify a filter expression to reduce the number of rows. When complete, click Next.

◈ Data will now be read from the database and pulled into a local cache within Visual Studio.

◈ Once loading is complete, you will have your first model created and will be able to see each table and the data within them. You can also switch to a diagram view by clicking the little diagram option at the bottom right of the screen:

The diagram view makes it really easy to see all of the tables and the relationships between them.

Improving the model


Now that your basic model is built, you could start querying it now or you could enhance it further by using more of the available modeling features. Some of these features include:

◈ Create or edit relationships. You can add, remove or change relationships between tables by going to the diagram view and dragging a line between two columns in different tables. Once tables are joined together, they can automatically be queried together when a user selects columns from both tables.

◈ Edit properties for a table or column. You can update multiple properties for tables and columns by clicking on them and updating the values in the properties pane.

◈ Add more business logic to the model by creating calculations and measures.
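For instance, a couple of simple measures written in DAX (assuming a hypothetical Sales table with a SalesAmount column) might look like:

```
Total Sales := SUM ( Sales[SalesAmount] )
Average Sale := AVERAGE ( Sales[SalesAmount] )
```

Measures like these are evaluated in the context of whatever filters the user applies in their report, so a single definition serves every slice of the data.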

Deploy

Once your model is complete, you can now deploy it to the Azure AS server which you created in the first step. This can be done with the following steps:

1. Copy your Azure Analysis Services server name from the Azure portal. This can be found at the top of the overview section of your server.

2. In the solution explorer in Visual Studio, right click on the project and click properties.

3. Change the deployment server to the name of your Azure AS server and click OK.

4. Right click the project name again, but this time click Deploy.

Connect

Now that your model has been created, you can connect to it through tools like Power BI Desktop or Excel.

Power BI Desktop

If you don’t already have the Power BI Desktop, you can download it for free.

1. Open the Power BI Desktop

2. Click Get Data.

3. Select Databases/SQL Server Analysis Services and then click Connect.

4. Enter your Azure AS server name and click OK.
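The server name you enter here is the Azure Analysis Services URI copied from the server's overview blade, which follows this format:

```
asazure://<region>.asazure.windows.net/<servername>
```

For example, a server named myserver in the West US region would be asazure://westus.asazure.windows.net/myserver.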

5. On the Navigator screen, select your model and click OK.

You will now see your model displayed in the field list on the side. You can drag and drop the different fields on to your page to build out interactive visuals.
