How to perform index maintenance on Azure SQL Database – Reviewed

A few months ago, I wrote a blog post on how to perform index maintenance on Azure SQL Database.
I thought my solution with Ola Hallengren's maintenance scripts worked fine; however, I got some comments on the post and on Twitter that there was an issue with it.

After analyzing and retesting it, it turns out that there was indeed an issue. During the setup, I tried to create the IndexOptimize stored procedure in the master database, but that is not allowed.
So, I created a management database and created the stored procedures in that DB. I've since noticed that when you do a select * from sys.databases in a user database, it only returns the database itself and master…
It’s also documented here https://msdn.microsoft.com/en-us/library/ms178534.aspx#Anchor_1

This was an issue because the IndexOptimize stored procedure also uses sys.databases to determine which databases to process. So, the runbook was working fine, except that it was only doing an index optimize in my management DB and nothing more…

I've contacted Ola about this and he told me that sys.databases is not the only issue. The problem is also that Azure SQL Database does not support three-part naming (DatabaseName.SchemaName.ObjectName). That means it is not possible to have a stored procedure in one database and have it do something in another database.
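You can see this limitation for yourself with a quick test. The following sketch (server, database and credential names are made up) connects to a user database and lists what sys.databases returns there:

```powershell
# Hypothetical server, database and credentials; replace with your own.
$Connection = New-Object System.Data.SqlClient.SqlConnection
$Connection.ConnectionString = "Server=myserver.database.windows.net; Database=MyUserDb; User ID=myuser; Password=MyPassword;"
$Connection.Open()

$Command = $Connection.CreateCommand()
$Command.CommandText = "SELECT name FROM sys.databases"

# Connected to a user database, this only lists 'master' and 'MyUserDb',
# so a procedure that relies on sys.databases cannot discover the other user databases from here.
$Reader = $Command.ExecuteReader()
while ($Reader.Read()) { Write-Output $Reader[0] }
$Connection.Close()
```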

Workaround

After investigating, I've found a possible workaround for this issue.
The first step is to create the IndexOptimize stored procedure in every user database.
Secondly, I've modified my runbook so that the selection of the user databases is done inside the runbook instead of by the stored procedure.

I’m first going to connect to the master database to retrieve a list of all the online user databases.
Once I’ve got it, I connect to each of the user databases separately in a loop, and execute the index maintenance procedure.
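Condensed, that loop looks something like this. This is only a sketch with placeholder server and credential values; the complete runbook is on the original post:

```powershell
# Hypothetical connection details; replace with your own.
$ServerName = "myserver.database.windows.net"
$UserId = "myuser"
$Password = "MyPassword"

# 1. Ask the master database for all online databases.
$MasterConnection = New-Object System.Data.SqlClient.SqlConnection
$MasterConnection.ConnectionString = "Server=$ServerName; Database=master; User ID=$UserId; Password=$Password;"
$MasterConnection.Open()
$Command = $MasterConnection.CreateCommand()
$Command.CommandText = "SELECT name FROM sys.databases WHERE state_desc = 'ONLINE'"
$Reader = $Command.ExecuteReader()
$Databases = @()
while ($Reader.Read()) { $Databases += $Reader[0] }
$MasterConnection.Close()

# 2. Connect to each user database separately and run its local IndexOptimize procedure.
foreach ($DbName in $Databases | Where-Object { $_ -ne "master" })
{
    $DbConnection = New-Object System.Data.SqlClient.SqlConnection
    $DbConnection.ConnectionString = "Server=$ServerName; Database=$DbName; User ID=$UserId; Password=$Password;"
    $DbConnection.Open()
    $DbCommand = $DbConnection.CreateCommand()
    # @Databases must be the current database name, since cross-database calls are not supported.
    $DbCommand.CommandText = "EXECUTE dbo.IndexOptimize @Databases = '$DbName'"
    $DbCommand.ExecuteNonQuery() | Out-Null
    $DbConnection.Close()
}
```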

This solution seems to be working so far. I know it is not (yet) as flexible as Ola’s scripts but it is certainly a good step in the right direction.
You can find the new script on the original post.

Good luck with it, and if you run into any issues, please let me know.

Pieter

How to view Azure SQL Database Audit Logs?

When you enable auditing on your Azure SQL Database, it will capture database events and write them to an audit log on your Azure Storage account.
In fact, the audit log is actually an Azure Table, which keeps the data according to the retention period that you have specified. More information about auditing on Azure SQL Database can be found here.
The only "issue" with auditing is that you cannot read the Azure Storage Table directly from the portal. You'll have to use different tools to analyze your audit logs.

The Azure Portal Dashboard

The Azure Portal Dashboard gives you an overview and summary of the events that have happened on your database.
You can see the number of schema changes, login failures, data changes, stored procedures, …
This dashboard is only available in the new portal.


Azure Storage Explorer

The Azure Storage Explorer is a free tool that you can download from Codeplex, which allows you to read the Azure audit log in detail. With the Storage Explorer, you can see and configure all your storage account components.
Once you have installed the tool, open it and click on Add Account. Specify the name and access key of the storage account where the audit table is stored.


When you're connected, you'll see a Tables folder, where you can find your audit log. By default, the name of the audit log always has the same structure: it begins with "SQLDBAuditLogs", followed by the name of your server, the name of your database, and the date the audit was activated.
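Given that naming convention, you can predict which table to look for. The server, database and date values below are hypothetical:

```powershell
# Hypothetical values; the audit table name is the concatenation described above.
$ServerName = "myserver"
$DatabaseName = "mydatabase"
$AuditStartDate = "20150713"   # date the audit was activated

$AuditTableName = "SQLDBAuditLogs" + $ServerName + $DatabaseName + $AuditStartDate
Write-Output $AuditTableName   # SQLDBAuditLogsmyservermydatabase20150713
```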

When you click on the table, the audit log will open and you can see all the tracked events that have been logged.
Be aware that the event time might differ from the time zone where you are located. You also have to take into account that it can take some time to open the audit log if many events are tracked.
You also have the possibility to download the logs locally. Next to the event time, you also see the server and database name, the application that executed the query, the type of the event and even the SQL statement or stored procedure that was executed.
You can really see who did what and when on your database, which can be useful for troubleshooting.

In the same audit log, you can also see the number of AffectedRows and ResponseRows, and how long it took the server to execute the query.

Excel and Power Query

A template can be downloaded from the Azure Portal to help you analyze the audit logs. The Excel file allows you to connect to your Storage Account and use Power Query to import the log information.

I won't explain how to use the Excel file, because Microsoft already provides a detailed explanation in the file itself.

Power BI

Microsoft also released the Power BI Azure Audit Logs content pack which can help you easily analyze and visualize the wealth of information contained in these logs.
You just need to connect to your storage account through the content pack and a nice dashboard will be created for you automatically.

More information can be found on this MSDN blog post.

If you have other ways to connect to the audit logs, please share them with me; I'm always interested.

Pieter

How to perform index maintenance on Azure SQL Database

It's been a long time since I wrote a blog post, and I finally found some time 😀 I've played a lot recently with the new Azure SQL Database V12 version. It's really cool because it has a lot of improvements compared to V11. With the V12 version, you have almost the same functionality as an on-premises database, and it also includes new features like row-level security, dynamic data masking and the Query Store.

Just like with an on-premises database, you also have to maintain your indexes and statistics, because indexes can get fragmented on an Azure SQL Database too. With an on-premises server or a VM in Azure, most DBAs schedule an index maintenance job under the SQL Server Agent. However, with Azure SQL Database, you have to think inside the database, which means you don't have any server functionality, including the SQL Server Agent. Luckily, the Azure platform provides Azure Automation, which can be used as a replacement for the SQL Server Agent. In this blog post, I'm going to explain how you can schedule your index maintenance job, or any other job that needs to be scheduled.

Create a new Azure Automation Account

First of all, you have to begin by creating a new Azure Automation account. Go to the portal and select the Automation tab. Click the Create button to create the new Automation account. Fill in the name of the account and the region; choose the region that is closest to you.

Create an Azure Automation credential asset

As a prerequisite, create an Azure Automation credential asset that contains the username and password for the target Azure SQL DB logical server. Click on the Automation account that you have just created and select Assets. In this section, click the "Add Setting" button at the bottom.

Select the option “Add Credential”


Select the Credential type “Windows PowerShell Credential” because we’re going to use this credential in the PowerShell Workflow. Give the Credential a name.

Specify the username and the password that you want to link to the credential. This is the user that will connect to your Azure SQL Database.

Click on OK and wait until the credential is created.

Install your maintenance scripts

Make sure that you have installed your maintenance scripts/procedures on all your Azure databases. In my example, I've been using the maintenance scripts of Ola Hallengren. For the index maintenance, I have to install the scripts IndexOptimize.sql and CommandExecute.sql. Make sure you download the latest version, because Ola fixed an issue with the index optimize on Azure SQL Database on July 19th 2015. There is a small issue with the scripts: Ola uses cross-database stored procedure calls, which are not supported in Azure DB at the moment. So, the @Databases parameter will not work correctly. Please also check the comments of this blog post; you have to implement a workaround in the runbook.
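Installing the two scripts on every user database can be scripted as well. Here is a possible sketch using Invoke-Sqlcmd; the server name, credentials, database list and file paths are all assumptions you would replace with your own:

```powershell
# Hypothetical server, credentials, database list and script locations.
$ServerName = "myserver.database.windows.net"
$Databases  = @("UserDb1", "UserDb2")

foreach ($DbName in $Databases)
{
    # CommandExecute.sql must be installed first, since IndexOptimize.sql depends on it.
    Invoke-Sqlcmd -ServerInstance $ServerName -Database $DbName -Username "myuser" -Password "MyPassword" -InputFile "C:\Scripts\CommandExecute.sql"
    Invoke-Sqlcmd -ServerInstance $ServerName -Database $DbName -Username "myuser" -Password "MyPassword" -InputFile "C:\Scripts\IndexOptimize.sql"
}
```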

Import the Maintenance Runbook

We have now set up all the prerequisites, so we can start creating a runbook in our Azure Automation account. A runbook is a PowerShell workflow that needs to be created or imported. You can compare it to configuring a job step in a SQL Server Agent job. The runbook contains the SQL scripts that need to be executed for the index maintenance and will be scheduled later on. Select your Automation account and go to the runbook tab. Click the "Import" button at the bottom to upload your PowerShell workflow. Select the PowerShell script that you have created and upload it.

Here is my script that I have used.

<#
.SYNOPSIS
    Perform index maintenance

.DESCRIPTION
    This runbook provides an example of how Azure Automation can be used to accomplish common SQL Agent tasks in the cloud.
    As a prerequisite, please create an Azure Automation credential asset that contains the username and password for the target Azure SQL DB logical server ($SqlServerName).
    Make sure that you have installed the scripts IndexOptimize.sql and CommandExecute.sql of Ola Hallengren (https://ola.hallengren.com/downloads.html).
    The check for the MAXDOP value in the IndexOptimize.sql script uses sys.dm_os_sys_info, which is currently not supported,
    so be sure to disable that check, otherwise it will return an error.

.EXAMPLE
    SQLServerIndexMaintenance

.NOTES
    AUTHOR: Pieter Vanhove
    LAST EDIT: October 20, 2015
#>

workflow SQLServerIndexMaintenance
{
    param
    (
        # Fully-qualified name of the Azure DB server
        [parameter(Mandatory=$true)]
        [string] $SqlServerName,

        # Credentials for $SqlServerName stored as an Azure Automation credential asset
        # When using the Azure Automation UI, please enter the name of the credential asset for the "Credential" parameter
        [parameter(Mandatory=$true)]
        [PSCredential] $Credential
    )

    inlinescript
    {
        # Set up credentials
        $ServerName = $Using:SqlServerName
        $UserId = $Using:Credential.UserName
        $Password = ($Using:Credential).GetNetworkCredential().Password

        # Create connection to the master DB
        $MasterDatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
        $MasterDatabaseConnection.ConnectionString = "Server = $ServerName; Database = Master; User ID = $UserId; Password = $Password;"
        $MasterDatabaseConnection.Open();

        # Create command to retrieve the names of all online databases on $ServerName
        $MasterDatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
        $MasterDatabaseCommand.Connection = $MasterDatabaseConnection
        $MasterDatabaseCommand.CommandText =
        "
        select name from sys.databases
        where state_desc='ONLINE'
        "

        # Execute reader to iterate over the database names
        $MasterDbResult = $MasterDatabaseCommand.ExecuteReader()

        # Proceed if there is at least one database
        if ($MasterDbResult.HasRows)
        {
            # Create connection and command objects that are reused for each individual database
            $DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
            $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand

            # Iterate through each database on $ServerName
            while($MasterDbResult.Read())
            {
                $DbName = $MasterDbResult[0]

                # Only process user databases (i.e., not the master DB)
                if($DbName -ne "Master")
                {
                    # Set up the connection string for $DbName
                    $DatabaseConnection.ConnectionString = "Server=$ServerName; Database=$DbName; User ID=$UserId; Password=$Password;"
                    $DatabaseConnection.Open();

                    # Create the command for the specific database $DbName
                    $DatabaseCommand.Connection = $DatabaseConnection

                    Write-Output "Perform index maintenance on $DbName"

                    # Execute IndexOptimize in the current database.
                    # Because cross-database calls are not supported, @Databases must contain the current DB name.
                    $DatabaseCommand.CommandText = "
                    EXECUTE dbo.IndexOptimize
                    @Databases = '" + $DbName + "',
                    @FragmentationLow = NULL,
                    @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
                    @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
                    @FragmentationLevel1 = 5,
                    @FragmentationLevel2 = 30,
                    @UpdateStatistics = 'ALL',
                    @OnlyModifiedStatistics = 'Y'
                    "
                    Write-Output $DatabaseCommand.CommandText
                    $NonQueryResult = $DatabaseCommand.ExecuteNonQuery()

                    # Close connection to $DbName
                    $DatabaseConnection.Close()
                }
            }
        }

        # Close connection to the master DB
        $MasterDatabaseConnection.Close()
    }
}

It's based on the script that I found on Codeplex. As you will notice, I have specified 2 parameters:

$SqlServerName: the server on which we want to perform the index maintenance.
$Credential: the user that will be used to connect to the SQL Server. This is linked to the credential that we created in step 2.

The workflow first connects to the master database to retrieve all the online databases. As mentioned in the install maintenance scripts part, I had to find a workaround because you cannot use cross-database stored procedures on Azure DB, which Ola's IndexOptimize script relies on. Once I have the list of all the databases, I connect to each of the DBs separately and execute the index optimize. Please note that the @Databases parameter should contain the current DB name. This solution is not (yet) as flexible as Ola's, but it's a good start.

Once the import is done, you will notice that the Authoring column still has a "New" status. The runbook still needs to be published (see the next chapter).

Test your Workflow and Publish

Before you can actually start using this workflow, you have to publish it; however, I recommend first testing that everything works fine. Once the runbook has been created, you can click on it and go to the Author section.


In the "Draft" section you can see the workflow that you have just imported. Click the "Test" button at the bottom to test your runbook. Before the runbook is actually executed, you have to provide the 2 parameters. In my example, the credential will be the AzureDBCredential that I created in step 2. The name of my SQL Server is pvhv12.database.secure.windows.net. This is the "SQL Server" that my databases are running on.


If all goes well, you should see an empty output pane with status COMPLETED.
If not, you will see the errors in the output pane. As soon as you have tested the job and everything works fine, you can publish the runbook and start using it.

Schedule the runbook

The last step is to schedule the published runbook. This is actually the same as in the SQL Server Agent: you have to link a schedule to the job, or in this case the runbook. In the runbook, click on the "Schedule" section and select "Link to a New Schedule".
Type in the name of your schedule; again, the same principle as the name of a job schedule in the SQL Server Agent.
Secondly, you need to configure the schedule. Currently you can choose between One time, Hourly or Daily. In my example, I want my index maintenance to run on a daily basis starting at 23:00.

In the final step, you have to specify the runbook parameters that will be used when the job is executed. This will be the same values that I have used in the test phase. Once the schedule is created, your index maintenance will be done on a daily basis.

Verify job history

Just like in the SQL Server Agent, you can also verify the job history of a runbook. Select "Jobs" in your runbook to verify the job history. When you click on one of the jobs, you can even see the error output, the job summary and the parameters that were used.

Summary

To summarize, I will make a comparison between Azure Automation and a SQL Server Agent job.

Azure Automation                   | SQL Server Agent Job
Create an Azure Automation Account | Create a SQL Server Agent job
Create a runbook                   | Create a job step
Test the runbook                   | Start job at step
Publish the runbook                | Save the job
Schedule the runbook               | Schedule the SQL Server Agent job
View jobs of the runbook           | View history


How easy is it to deploy your database to the cloud?

With the current hype in cloud computing, DBAs want an easy way to deploy their database to SQL Azure. In previous versions, migrating a database to the cloud was a significant undertaking. You could download the SQL Azure Migration Wizard, which actually created some scripts to move your database. Microsoft also provided the following solution: Migrating Databases to SQL Azure. Other possibilities are BCP, building SSIS packages, the "Generate script wizard", …

I was wondering why it has to be that "difficult" and "complex". People are still using these solutions, which are certainly not bad, believe me, but… Now, with SQL Server Management Studio 2012, you can use the Deploy Database to SQL Azure feature, which under the hood exports a bacpac and then imports it into SQL Azure. It makes the life of the DBA much easier. Let's see how this works.

Open the Object Explorer and go to the database that you want to deploy. Right-click on it and select Tasks – Deploy Database to SQL Azure.


You should now see an introduction screen of the deploy wizard.  Just click next…


Now you need to specify the deployment settings.  First of all, you need to connect to your SQL Azure Server. Click the connect button.


Enter your SQL Azure credentials and click the connect button. Please note that the connection to a SQL Azure server can fail in some companies, probably because of firewall settings… this company is not ready yet for the cloud.


Once you are connected, specify the correct database name.
Select the Edition (Web or Business) of SQL Azure and the maximum database size that you need.  For more information on SQL Azure and Pricing check out this link



When you click Next, you will get a summary of the specified settings.


Click Finish to start the deployment to SQL Azure. The wizard will first evaluate whether your database meets all the requirements of SQL Azure. If not, it will fail, as you can see below.


Click on the error link to verify what is wrong. In my case, I have a table with no clustered index. That's not allowed in SQL Azure. Check the following page if you want an overview of the Guidelines and Limitations (SQL Azure Database).


You need to restart the wizard all over again. Microsoft didn't provide a retry button here…
I corrected my database and retried the deployment. Successful! As you can see, the wizard first exports the database, both the schema and the data. Secondly, a database is created on SQL Azure, and the schema and data are imported.


Now let's verify that my database was indeed deployed correctly.
When I check the Windows Azure Platform, I indeed find my newly deployed database.


Conclusion: the other deployment tools were very useful in previous versions, but when you have SQL Server 2012 installed, I really recommend this new feature.