How to view Azure SQL Database Audit Logs?

When you enable auditing on your Azure SQL Database, it captures database events and writes them to an audit log in your Azure Storage account.
In fact, the audit is actually an Azure Table which will keep the data according to the retention period that you have specified. More information about auditing on Azure SQL Database can be found here.
The only “issue” with auditing is that you cannot read the Azure Storage Table directly from the portal. You’ll have to use different tools to analyze your audit logs.

The Azure Portal Dashboard

The Azure Portal Dashboard gives you an overview and summary of the events that have happened on your database.
You can see the number of schema changes, login failures, data changes, stored procedure executions, and so on.
This dashboard is only available in the new portal.

 

Azure Storage Explorer

The Azure Storage Explorer is a free tool that you can download from Codeplex which allows you to read the Azure audit log file in detail. With the Storage Explorer, you can see and configure all your storage account components.
Once you have installed the tool, open it and click on add account. Specify the storage account and the access key where the audit table is stored.


When you’re connected, you’ll see a folder Tables, where you can find your audit log. By default, the name of the audit log always has the same structure. It begins with “SQLDBAuditLogs” + “the name of your server” + “the name of your database” + “the date the audit was activated”.

When you click on the table, the audit log will open and you can see all the tracked events that have been logged.
Be aware that the event time might differ from your local time zone. Also take into account that opening the audit log can take some time if many events have been tracked.
You can also download the logs locally. Next to the event time, you see the server and database name, the application that executed the query, the type of the event and even the SQL statement or stored procedure that was executed.
You can really see who did what and when on your database, which is useful for troubleshooting.

In the same audit log, you can also see the number of AffectedRows, ResponseRows and how long it took the server to execute the query.
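If you prefer scripting over a GUI tool, the same audit table can also be queried straight from PowerShell. Below is a minimal sketch that assumes the classic Azure Storage cmdlets; the storage account name, access key and table name are placeholders that you have to replace with your own values.

# Connect to the storage account that holds the audit table
$context = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<your storage access key>"

# The table name follows the SQLDBAuditLogs<server><database><date> pattern described above
$auditTable = Get-AzureStorageTable -Name "SQLDBAuditLogsmyservermydatabase20150101" -Context $context

# Run an (unfiltered) query against the underlying CloudTable and show a few properties per event
$query = New-Object Microsoft.WindowsAzure.Storage.Table.TableQuery
$auditTable.CloudTable.ExecuteQuery($query) |
    Select-Object PartitionKey, RowKey, Timestamp |
    Format-Table -AutoSize

The individual audit columns (event time, statement, application name and so on) are available in the Properties collection of each returned entity.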

Excel and Power Query

A template can be downloaded from the Azure Portal to help you analyze the audit logs. The Excel file allows you to connect to your Storage Account and use Power Query to import the log information.

I won’t explain how to use the Excel file here, because Microsoft already includes a detailed explanation in the file itself.

Power BI

Microsoft also released the Power BI Azure Audit Logs content pack which can help you easily analyze and visualize the wealth of information contained in these logs.
You just need to connect to your storage account through the content pack and a nice dashboard will be created for you automatically.

More information can be found on this MSDN blog post.

If you have other ways to connect to the audit logs, please do share them with me. I’m always interested.

Pieter

How to perform index maintenance on Azure SQL Database

It’s been a long time since I wrote a blog post and I finally found some time 😀 I’ve been playing a lot recently with the new Azure SQL Database V12 version. It’s really cool, because it has a lot of improvements compared to V11. With V12, you have almost the same functionality as an on-premises database, and it also includes new features like row-level security, dynamic data masking and the query store. Just like with an on-premises database, you have to maintain your indexes and statistics, because indexes can also get fragmented on an Azure SQL Database. On an on-premises server or a VM in Azure, most DBAs schedule an index maintenance job under the SQL Server Agent. However, with Azure SQL Database, you have to think inside the database, which means you don’t have any server-level functionality, including the SQL Server Agent. Luckily, the Azure platform provides Azure Automation, which can be used as a replacement for the SQL Server Agent. In this blog post, I’m going to explain how you can schedule your index maintenance job, or any other job that needs to be scheduled.

Create a new Azure Automation Account

First of all, you have to create a new Azure Automation Account. Go to the portal and select the Automation tab. Click the Create button to create the new Automation Account. Fill in the name of the account and the region. Choose the region that is closest to you.

Create an Azure Automation credential asset

As a prerequisite, create an Azure Automation credential asset that contains the username and password for the target Azure SQL DB logical server. Click on the Automation Account that you have just created and select Assets. In this section, click the “Add Setting” button at the bottom.

Select the option “Add Credential”


Select the Credential type “Windows PowerShell Credential” because we’re going to use this credential in the PowerShell Workflow. Give the Credential a name.

Specify the username and the password that you will link to the credential. This will be the user that will connect to your Azure SQL Database.

Click on OK and wait until the credential is created.
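If you prefer scripting, the same credential asset can also be created with the classic Azure Automation cmdlets. This is only a sketch; the automation account name (MyAutomationAccount) and the asset name (AzureDBCredential) are placeholders.

# Prompt for the SQL login and password that the runbook will use
$cred = Get-Credential

# Store it as a credential asset in the Automation account
New-AzureAutomationCredential -AutomationAccountName "MyAutomationAccount" `
                              -Name "AzureDBCredential" `
                              -Value $cred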

Install your maintenance scripts

Make sure that you have installed your maintenance scripts/procedures on all your Azure databases. In my example, I’ve been using the maintenance scripts of Ola Hallengren. For the index maintenance, I have to install the scripts IndexOptimize.sql and CommandExecute.sql. Make sure you download the latest version, because Ola fixed an issue with the index optimize on Azure SQL Database on July 19th 2015. There is one small issue with the scripts: Ola uses cross-database stored procedures, which are not supported in Azure SQL Database at the moment, so the @Databases parameter will not work correctly. Please also check the comments of this blog post. You have to implement a workaround in the runbook.
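If you have a lot of databases, deploying the scripts by hand quickly becomes tedious. The sketch below shows one way to push Ola’s scripts to every online user database with Invoke-Sqlcmd; the server name, login, password and file paths are placeholders, and it assumes the SQL Server PowerShell module is installed on the machine you run it from.

# Placeholder connection details
$server   = "myserver.database.windows.net"
$login    = "maintenanceuser"
$password = "<password>"

# CommandExecute.sql has to be installed before IndexOptimize.sql
$scripts  = "C:\Scripts\CommandExecute.sql", "C:\Scripts\IndexOptimize.sql"

# Get all online databases except master
$databases = Invoke-Sqlcmd -ServerInstance $server -Database master -Username $login -Password $password `
    -Query "SELECT name FROM sys.databases WHERE state_desc = 'ONLINE' AND name <> 'master';"

foreach ($db in $databases) {
    foreach ($script in $scripts) {
        # Run each maintenance script in the current database
        Invoke-Sqlcmd -ServerInstance $server -Database $db.name -Username $login -Password $password `
            -InputFile $script -QueryTimeout 300
    }
}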

Import the Maintenance Runbook

We have now set up all the prerequisites, so we can start creating a runbook in our Azure Automation account. A runbook is a PowerShell Workflow that needs to be created or imported. You can compare it to configuring a job step in a SQL Server Agent job. The runbook contains the SQL scripts that need to be executed for the index maintenance and will be scheduled later on. Select your Automation Account and go to the runbook tab. Click the “Import” button at the bottom to upload your PowerShell Workflow. Select the PowerShell script that you have created and upload it.

Here is my script that I have used.

<#
.SYNOPSIS
 Perform index maintenance

.DESCRIPTION
 This runbook provides an example of how Azure Automation can be used to accomplish common SQL Agent tasks in the cloud. 
 As prerequisite, please create an Azure Automation credential asset that contains the username and password for the target Azure SQL DB logical server ($SqlServerName).
 Make sure that you have installed the scripts IndexOptimize.sql and CommandExecute.sql of Ola Hallengren (https://ola.hallengren.com/downloads.html)
 The check for the MAXDOP value in the IndexOptimize.sql script uses sys.dm_os_sys_info, which is currently not supported on Azure SQL Database,
 so be sure to disable that check; otherwise it will return an error.
 
.EXAMPLE
 SQLServerIndexMaintenance
 
.NOTES
 AUTHOR: Pieter Vanhove
 LAST EDIT: October 20, 2015 
#>

workflow SQLServerIndexMaintenance
{
    param
    (
        # Fully-qualified name of the Azure SQL DB logical server
        [parameter(Mandatory=$true)]
        [string] $SqlServerName,

        # Credentials for $SqlServerName stored as an Azure Automation credential asset
        # When using the Azure Automation UI, enter the name of the credential asset for the "Credential" parameter
        [parameter(Mandatory=$true)]
        [PSCredential] $Credential
    )

    inlinescript
    {
        # Setup credentials
        $ServerName = $Using:SqlServerName
        $UserId = $Using:Credential.UserName
        $Password = ($Using:Credential).GetNetworkCredential().Password

        # Create connection to the master database
        $MasterDatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
        $MasterDatabaseConnection.ConnectionString = "Server = $ServerName; Database = Master; User ID = $UserId; Password = $Password;"
        $MasterDatabaseConnection.Open();

        # Create command to retrieve the names of all online databases on $ServerName
        $MasterDatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
        $MasterDatabaseCommand.Connection = $MasterDatabaseConnection
        $MasterDatabaseCommand.CommandText = "
            select name from sys.databases
            where state_desc='ONLINE'
            "

        # Execute reader and return the list of database names
        $MasterDbResult = $MasterDatabaseCommand.ExecuteReader()

        # Proceed if there is at least one database
        if ($MasterDbResult.HasRows)
        {
            # Create connection and command objects that are reused for each individual database
            $DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
            $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand

            # Iterate through each database under $ServerName
            while($MasterDbResult.Read())
            {
                $DbName = $MasterDbResult[0]

                # Only process user databases (i.e., not the master database)
                if($DbName -ne "Master")
                {
                    # Setup connection string for $DbName
                    $DatabaseConnection.ConnectionString = "Server=$ServerName; Database=$DbName; User ID=$UserId; Password=$Password;"
                    $DatabaseConnection.Open();

                    # Create command for the specific database $DbName
                    $DatabaseCommand.Connection = $DatabaseConnection

                    Write-Output "Perform index maintenance on $DbName"

                    # Execute Ola Hallengren's IndexOptimize in the current database.
                    # @Databases is set to the current database name as a workaround for the
                    # cross-database limitation in Azure SQL Database.
                    $DatabaseCommand.CommandText = "
                        EXECUTE dbo.IndexOptimize
                        @Databases = '" + $DbName + "',
                        @FragmentationLow = NULL,
                        @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
                        @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
                        @FragmentationLevel1 = 5,
                        @FragmentationLevel2 = 30,
                        @UpdateStatistics = 'ALL',
                        @OnlyModifiedStatistics = 'Y'
                        "
                    Write-Output $DatabaseCommand.CommandText
                    $NonQueryResult = $DatabaseCommand.ExecuteNonQuery()

                    # Close connection to $DbName
                    $DatabaseConnection.Close()
                }
            }

            # Close connection to the master database
            $MasterDatabaseConnection.Close()
        }
    }
}

It’s based on the script that I found on Codeplex. As you will notice, I have specified 2 parameters:

$SqlServerName: the server on which we want to perform the index maintenance
$Credential: the user that will be used to connect to the SQL Server. This is linked to the credential that we created in step 2.

The workflow first connects to the master database to retrieve all the online databases. As mentioned in the install maintenance scripts part, I had to find a workaround, because you cannot use cross-database stored procedures on Azure SQL Database, and those are used by Ola’s IndexOptimize script. Once I have the list of all the databases, I connect to each database separately and execute the index optimize. Please note that the @Databases parameter should contain the current database name. This solution is not (yet) as flexible as Ola’s solution, but it’s a good start.

Once the import is done, you will notice that the Authoring column still has a “New” status. The runbook still needs to be published (see the next chapter).

Test your Workflow and Publish

Before you can actually start using this Workflow, you have to publish it. However, I recommend testing it first to verify that everything works fine. Once the runbook has been created, you can click on it and go to the Author section.


In the “Draft” section, you can see the Workflow that you have just imported. Click the “Test” button at the bottom to test your runbook. Before the runbook is actually executed, you have to provide the 2 parameters. In my example, the Credential is the AzureDBCredential that I created in step 2. The name of my SQL Server is pvhv12.database.secure.windows.net. This is the “SQL Server” that my databases are running on.


If all goes well, you should see an empty output pane with the status COMPLETED.
If something goes wrong, you will see the errors in the output pane. As soon as you have tested the job and everything works fine, you can publish the runbook and start using it.
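You can also start the published runbook on demand from PowerShell instead of the portal. Here is a sketch with the classic Azure Automation cmdlets; the account name is a placeholder, and for a PSCredential runbook parameter you pass the name of the credential asset, just like in the test pane.

Start-AzureAutomationRunbook -AutomationAccountName "MyAutomationAccount" `
                             -Name "SQLServerIndexMaintenance" `
                             -Parameters @{
                                 SqlServerName = "pvhv12.database.secure.windows.net"
                                 Credential    = "AzureDBCredential"
                             }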

Schedule the runbook

The last step is to schedule the published runbook. This is actually the same as with the SQL Server Agent: you have to link a schedule to the job, or in this case the runbook. In the runbook, click on the “Schedule” section and select “Link to a New Schedule”.
Type in the name of your schedule. Again, this follows the same principle as naming a job schedule in the SQL Server Agent.
Secondly, you need to configure the schedule. Currently you can choose between One time, Hourly or Daily. In my example, I want my index maintenance to run on a daily basis, starting at 23:00.

In the final step, you have to specify the runbook parameters that will be used when the job is executed. These are the same values that I used in the test phase. Once the schedule is created, your index maintenance will run on a daily basis.
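The same schedule can be created and linked from PowerShell as well. A sketch with the classic Azure Automation cmdlets follows; the account and schedule names are placeholders, and newer versions of the module use slightly different parameter names (for example -RunbookName instead of -Name).

# Create a daily schedule that starts at 23:00 (the start time must lie in the future)
New-AzureAutomationSchedule -AutomationAccountName "MyAutomationAccount" `
                            -Name "DailyIndexMaintenance" `
                            -StartTime (Get-Date "23:00") `
                            -DayInterval 1

# Link the schedule to the runbook and supply the runbook parameters
Register-AzureAutomationScheduledRunbook -AutomationAccountName "MyAutomationAccount" `
                                         -Name "SQLServerIndexMaintenance" `
                                         -ScheduleName "DailyIndexMaintenance" `
                                         -Parameters @{
                                             SqlServerName = "pvhv12.database.secure.windows.net"
                                             Credential    = "AzureDBCredential"
                                         }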

Verify job history

Just like in the SQL Server Agent, you can also verify the job history of a runbook. Select “Jobs” in your runbook to verify the job history. When you click on one of the jobs, you can even see the error output, the job summary and the parameters that were used.
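The job history is also reachable from PowerShell, which is handy if you want to check the runs without opening the portal. A sketch with the classic Azure Automation cmdlets (account and runbook names are placeholders):

# List the jobs of the runbook, newest first
$jobs = Get-AzureAutomationJob -AutomationAccountName "MyAutomationAccount" -RunbookName "SQLServerIndexMaintenance" |
    Sort-Object StartTime -Descending
$jobs | Select-Object Id, Status, StartTime, EndTime | Format-Table -AutoSize

# Show all output streams (including errors) of the most recent job
Get-AzureAutomationJobOutput -AutomationAccountName "MyAutomationAccount" -Id $jobs[0].Id -Stream Any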

Summary

To summarize, here is a comparison between Azure Automation and a SQL Server Agent job.

Azure Automation                      SQL Server Agent Job
Create an Azure Automation Account    Create a SQL Server Agent Job
Create a runbook                      Create a Job Step
Test the runbook                      Start Job at Step
Publish the runbook                   Save the job
Schedule the runbook                  Schedule the SQL Server Agent Job
View jobs of the runbook              View History

 

Remove Backup files from Azure Storage Part 2

In April this year, I posted a script to delete your backup files from Windows Azure Storage.
This week, I got a question from Mark about how to use this script in a SQL Agent job on an Azure VM. I mentioned in my previous blog post that you should download the Azure PowerShell cmdlets before you can use the script, but Mark kept getting the following error when he tried to download the installation file.


In this blog post I will explain in more detail how you can configure the script correctly in a SQL Server Agent job and how you should install the Azure PowerShell cmdlets.

Start up your VM on Azure and connect to the instance with SSMS.
Before we install anything extra on the VM, let’s configure a job with the removal script that you can find in my previous post. Don’t forget to change the storage account, the access key and the container in the script to match your configuration.
Your job step should look like this. Choose Type = PowerShell, not T-SQL.


When you execute the job manually, you will probably get the following error message: The term ‘New-AzureStorageContext’ is not recognized as the name of a cmdlet…


This is because we haven’t installed the Azure PowerShell cmdlets yet. You can download the cmdlets here. Now, when you try to download this on the Azure VM, you will get the following error:


This is because your browser settings don’t allow you to download files. To fix this, open your Internet Explorer settings and select Internet Options


Select the security tab and click the button Custom level

Scroll down to Downloads and select the option Enable under “File download”


Click OK, close the settings and restart your browser. Now you should be able to download the file to install the Azure PowerShell Cmdlets


Save it and Run the file

First of all you will get this nice welcome screen…

Wait until it is finished and you see the next screen.
Just click Install

You will see that it will install the Microsoft Azure Powershell. Click “I accept” in the next screen

…and wait till the installation is finished

Click finish at the end

In the final screen you should actually see that the Azure Powershell Cmdlets are installed

Ok… that’s it!!!

Go back to the SQL Agent Job and execute it again. Normally the job should execute successfully!

To be 100% sure that it’s working, go to the Windows Azure management portal and check the container to see whether the files have been removed.
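You can also do that last check from PowerShell instead of the portal. A small sketch that lists what is still left in the container, using the same storage account, key and container name as in the cleanup script:

$context = New-AzureStorageContext -StorageAccountName koherastorage -StorageAccountKey "<paste your storage access key here>"
Get-AzureStorageBlob -Container "sqlbackup" -Context $context |
    Sort-Object LastModified |
    Select-Object Name, LastModified, Length |
    Format-Table -AutoSize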

Have fun with it!

Cloud Witness in Windows Server v.Next

Last week, I attended the session “Did you vote today? A DBA’s guide to cluster quorum” at PASS Summit, by Allan Hirt.
Interesting stuff, and I learned that my quorum configuration is already done correctly according to what Allan explained.

Another interesting part was that Allan announced that there is a new type of quorum in Windows Server v.Next.

Cloud Witness!

Instead of using a File Share Witness or Disk Witness, you can now also choose Cloud Witness.
Basically it’s going to create a file on your Azure Storage which counts as a quorum vote.
This cloud witness can be really helpful in case of multi-site clusters.
With multi-site clusters, there is always the question in which datacenter you are going to put the file share witness.
In fact, to configure it correctly, you should put your file share witness in a third datacenter. But that’s really too expensive for most companies just for a file share witness. The cloud witness provides a good alternative for this problem.

The only “issue” that I see with this cloud witness is that your cluster nodes must have a connection to the internet. And honestly, I haven’t seen many SQL Servers or cluster nodes that are able to connect to the internet.
But with the appropriate firewall settings, you should be OK.

I’ve already installed a 2 node Windows vNext Failover Cluster and tried it out.
It’s actually really easy to configure.

Start with opening the Failover Cluster Manager and connect to your cluster.
Right click on the cluster name → More Actions → Configure Cluster Quorum Settings…

In the “Select Quorum Configuration Option” window, select “Select the quorum witness”

In the “Select Quorum Witness” window, select “Configure a cloud witness”

To configure the cloud witness, you need to specify your storage account and your Azure storage account key.

This information can be found on the Azure Portal. Just go to the storage option. On the bottom of your screen you will see a button called Manage Access Keys.

Click on that button, copy one of the 2 keys and paste it in the Azure storage field of the cloud witness configuration

Your configuration should look similar to the screenshot below

Finally, complete the wizard and if all went well you have now configured your cloud witness.
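For those who prefer PowerShell over the wizard: the same cloud witness can be configured with a single cmdlet from the FailoverClusters module. The storage account name and key below are placeholders.

# Configure a cloud witness for the current cluster (run on one of the cluster nodes)
Set-ClusterQuorum -CloudWitness -AccountName "mystorageaccount" -AccessKey "<your Azure storage account key>"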

When you look at your storage account in the Azure Portal, you will notice that a new container, “msft-cloud-witness”, is created with 1 blob file inside.

Pretty cool if you ask me 😀 !

Pieter

 

Remove Azure Subscriptions from SSMS

Last week I was preparing my pre-con for SQL Server Days.
I was playing around a lot with the new Azure features because that will be a big part of my pre-con.
When I was trying to deploy a database to my Azure VM with the new “Deploy database to Azure VM” wizard,
I noticed that there were a lot of duplicate subscriptions to choose from. This was probably caused by the testing that I have done lately 😉
Adding a subscription is easy, but I couldn’t find any way to remove a subscription from the drop-down box.

Double Subscriptions

As an MVP, you get the privilege of asking questions to other SQL MVPs and the Microsoft product team.
So, I posted this question and quickly got a response from a Microsoft SQL Server Program Manager.
The solution is public information, so I’m happy to share it with you.

The references to the subscriptions are stored under the following directory

%USERPROFILE%\Documents\SQL Server Management Studio\Deploy DB to Windows Azure VM

Files

I removed them and went back to my deploy wizard but I still got one subscription available.
Default file

Apparently, that is my active subscription.
To get rid of this active subscription, you need to update the deploymentsettings.xml file in the folder

%USERPROFILE%\AppData\Local\SQL Server\Deploy to SQL in WA VM

When you open the XML file, you will indeed see the subscription.

After removing the deploymentsettings.xml file, I got my “clean” dropdownbox back.
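If you have to do this on several machines, the same cleanup is easy to script. A small sketch that simply deletes the cached files mentioned above:

# Remove the cached subscription files that the wizard reads
Remove-Item "$env:USERPROFILE\Documents\SQL Server Management Studio\Deploy DB to Windows Azure VM\*" -Force
# Remove the active subscription settings
Remove-Item "$env:USERPROFILE\AppData\Local\SQL Server\Deploy to SQL in WA VM\deploymentsettings.xml" -Force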

This is something you won’t need often, but it’s always good to know!

Pieter

Remove SQL Backup Files from your Azure Storage

Microsoft introduced a lot of new Azure features in SQL Server 2014, which was released this week. It has become really easy to store your backup files on Windows Azure (or should I say Microsoft Azure? 🙂).
You can check my previous blog post on how to take a backup to Windows Azure. This all works really great, and yesterday I also introduced this new feature to one of my customers and they wanted to go ahead with it! While I was driving home, I was thinking about how I could implement this in a good way. My customer didn’t want to use Smart Backup, because they wanted to be in control of the backup process. Scheduling the backups with the SQL Server Agent looks like the appropriate solution here: just like taking backups to disk, but you choose backup to URL instead.

When you take backups to disk, you also have to schedule a cleanup job for the old backup files. The same goes for Azure storage, unless you want to pay a lot of money for your storage… 🙂
But how can I clean up the backups from my Azure Storage container? I decided to use PowerShell to clean up the backup files.

First of all you have to install the Azure PowerShell cmdlets. More information can be found here. Once the cmdlets are installed, don’t forget to connect to your subscription as mentioned there. Execute the Get-AzureSubscription command in the Windows PowerShell ISE.
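For reference, connecting the cmdlets to your subscription looks roughly like this (the subscription name is a placeholder for your own):

# Sign in with your Azure account and pick the subscription that holds the storage account
Add-AzureAccount
Get-AzureSubscription
Select-AzureSubscription -SubscriptionName "My Subscription"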

Login to your Azure portal and copy the access key of your storage account because you will need it in the script where it says “Paste your storage access key here“.

In my script, I’m using Get-AzureStorageBlob to get the blobs out of my backup container. I check the LastModified date of each blob with extension *.bak or *.log. If it is older than my cleanup time (3 days for full backups and 1 day for transaction log backups), I remove the backup file with the Remove-AzureStorageBlob cmdlet.

$CleanupTime = [DateTime]::UtcNow.AddHours(-72)
$context = New-AzureStorageContext -StorageAccountName koherastorage -StorageAccountKey {Paste your storage access key here}
Get-AzureStorageBlob -Container "sqlbackup" -Context $context | 
Where-Object { $_.LastModified.UtcDateTime -lt $CleanupTime -and $_.BlobType -eq "PageBlob" -and $_.Name -like "*.bak"} |
Remove-AzureStorageBlob
$CleanupTime = [DateTime]::UtcNow.AddHours(-24)
Get-AzureStorageBlob -Container "sqlbackup" -Context $context | 
Where-Object { $_.LastModified.UtcDateTime -lt $CleanupTime -and $_.BlobType -eq "PageBlob" -and $_.Name -like "*.log"} |
Remove-AzureStorageBlob

You can easily schedule this with a SQL Server Agent job. Just choose PowerShell as the job step type.

I know this script can be fine-tuned, and any suggestions are appreciated.

 Pieter

Faster migration of your SQL 2005 databases

Many DBAs are trying to get rid of the old SQL Server 2005, which is a good thing.
I’ve been working on some consolidation and migration projects and I’ve found a new and faster way to migrate SQL 2005 databases.

The easiest way to migrate a database is to back up the database, copy it to the new server and restore it. But how can you make this process faster? Check out this blog post by Jes Schultz Borland (blog | Twitter). Compressing the backup file can make the migration go faster: the backup is faster because SQL Server has to write less to disk, the copy is faster because the backup files are much smaller, and the restore is faster because SQL Server has to read less from disk. However, there is one little problem… SQL Server 2005 doesn’t support backup compression. It is only available from SQL Server 2008 Enterprise Edition, or from SQL Server 2008 R2 Standard onwards.

But now there is a way to compress your SQL 2005 backup files without using any third-party tool. A few months ago, Microsoft introduced a tool called “SQL Server Backup to Azure Tool”. They created it to offer a consistent backup solution across versions. In SQL Server 2014, backup to Azure, encryption and compression are built into the engine, but these features are not available in older versions of SQL Server (except for compression, of course). The Backup to Azure Tool makes the same functionality available for older versions of SQL Server, from SQL 2005 onwards. The good thing about the tool is that you don’t need to use any Azure functionality to use encryption or compression. This means that we can compress our on-premises SQL 2005 backups and make our migration faster!

The tool can be downloaded from http://www.microsoft.com/en-us/download/details.aspx?id=40740 and is pretty straightforward to install. It is supported on SQL Server 2005 or later, and on Windows Server 2008 or later for servers and Windows 7 or later for client operating systems.

Once you start up the tool you have to configure some rules.

Click the Add button and start by specifying the path where you are going to save your backup files and the file name pattern:

I don’t want to use the Azure functionality because I’m using the tool for migration purposes to another on premise server.
Select Use local Storage

Select the option “Enable compression”.

Now you have set up the rule that is needed to compress local backups.
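Once the rule is in place, the tool transparently compresses any backup that is written to the configured path, so a plain backup statement is all you need. Here is a sketch via Invoke-Sqlcmd; the instance name, database name and path are placeholders, and the path has to match the rule you just created.

Invoke-Sqlcmd -ServerInstance "MYSERVER\SQL2005" -QueryTimeout 3600 -Query @"
BACKUP DATABASE [MyBigDatabase]
TO DISK = N'C:\Backup\MyBigDatabase.bak'
WITH INIT, STATS = 10;
"@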

I did a few tests to see how well the Backup to Azure Tool compresses my backups. I used a 130 GB database and ran the tests on my laptop (16 GB RAM and 8 CPUs).
I used SQL Server 2012 for my tests, because I wanted to compare the difference between native backup compression and Backup to Azure Tool compression.

Test (duration, throughput, backup file size, CPU usage):
Full backup without compression and without Backup to Azure Tool compression: 2696 sec, 34 MB/sec, 89.3 GB, 4% CPU
Full backup with compression and without Backup to Azure Tool compression: 1502 sec, 61 MB/sec, 16.9 GB, 10% CPU
Full backup without compression and with Backup to Azure Tool compression: 2344 sec, 39 MB/sec, 13.5 GB, 20% CPU
Full backup with compression and with Backup to Azure Tool compression: 1403 sec, 65 MB/sec, 16.3 GB, 15% CPU

I noticed that the backup file is smaller when I use the Backup to Azure Tool than with native backup compression.
On the other hand, the duration is almost the same as a backup without compression, and the CPU usage is 10% higher than with native backup compression.
But even with these results, it still makes sense to use the tool for faster migrations. You still have to copy less data, and the restore will also be much faster, because there is less data to read from disk.

Happy migration!

Pieter

Speaking at SQL Saturday Slovenia #274

On the 8th of November, I delivered a session (together with Wesley Backelant) at SQL Server Days about the new AlwaysOn functionality in SQL 2014. I love to talk about SQL Server, but unfortunately I have only been able to speak at local Belgian events like SQL Server Days, Community Day and Microsoft Discovery Day… I have already tried a few times to speak at other international events like SQLBits or the PASS Summit, but I never got selected… I didn’t give up, and I wanted to see what my chances were of presenting at a SQL Saturday event. So I submitted a few sessions for SQL Saturday in Slovenia and guess what, I got selected!! It will be my first international event and I’m really looking forward to meeting the Slovenian SQL people and the other speakers. Special thanks go to my current employer Kohera for supporting me!

Harder, Better, Stronger and More Secure
In the first part of my session, I’m going to talk about the new AlwaysOn functionality in SQL Server 2014, including a live demo of adding a replica to Windows Azure. The second part will be about the new backup features in SQL 2014, like backup to URL, managed backup and encryption. I will not focus on the basic AlwaysOn and backup capabilities, but on the SQL Server 2014 investments in the hybrid cloud area.

More information about the event can be found on http://www.sqlsaturday.com/274/eventhome.aspx

Hope to see you all there!

Restore a database from Windows Azure with SQL Server 2014 CTP2

In my previous blog post I explained how you could take a backup of a database to Windows Azure Storage with SQL Server 2014 CTP2.
In this blog post I’m going to explain how you can restore your database from Windows Azure.

If your database is still available on your server, right click on the database and select Tasks → Restore → Database

 

Apparently, the management studio knows that a backup has been taken to Windows Azure and it opens the “Connect to Windows Azure Storage” window.

First of all you need to fill in the Storage Account. That will be the one you created when taking the backup to Windows Azure.
To find the correct storage account, go to your Windows Azure web site, select Storage and choose the correct storage account.
In my example, this is “koherastorage”.

 

You should also fill in the Account Key that is linked to your storage account. To access stored content, applications (and also SSMS) use one of two 512-bit storage access keys.
Click on the button in the main storage window. A new window with the manage keys will open.

Copy the primary access key to your clipboard and paste it in the Account Key field of the “Connect to Windows Azure Storage” window in SSMS.
You will notice that the key is masked with *stars*

Select the correct credential to connect to Windows Azure. You can use the credential that you have created while creating the backup.

Click the Connect button and the restore window will appear as we all know it, like in previous SQL Server editions.

 

Select the appropriate backup sets that you want to restore and click OK to start the restore.
On top, you see that a tail-log backup will be taken. And yes indeed, it’s also stored in my Windows Azure Storage container!
You should get a result similar to this 🙂

 

As you can see, restoring a backup from Windows Azure is as simple as restoring a database from local disk.
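For reference, what the wizard does under the hood boils down to a RESTORE ... FROM URL statement that uses the SQL credential. A minimal sketch via Invoke-Sqlcmd; the instance, database, container URL, file name and credential name are placeholders.

Invoke-Sqlcmd -ServerInstance "MYSERVER\SQL2014" -QueryTimeout 3600 -Query @"
RESTORE DATABASE [MyDatabase]
FROM URL = N'https://koherastorage.blob.core.windows.net/backupcontainer/MyDatabase.bak'
WITH CREDENTIAL = N'AzureBackupCredential', REPLACE, STATS = 10;
"@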
However, there is still one question that keeps me busy: how will I back up or restore my database if I don’t have any internet connection available…?

Have fun!

Pieter

 

 

Backup to Windows Azure with SQL Server 2014 CTP2

Since SQL 2014, you can take a backup of your database to Windows Azure Storage.
CTP2 has recently been released, and it is now possible for you as a DBA to create a manual backup to Azure in SQL Server Management Studio.
In this blog post, I’m going to show you how easy it is to do this.

Before you can start with creating a backup, you have to create a storage account to store the backup.
Go to your manage page of Windows Azure.

Click on Storage and select New in the left bottom corner

             

 

Now select Data Services → Storage → Quick Create.
Type in the URL name that you want to use and select the location where you want to create the storage.

 

When all the data is filled in, click the create button.

As a result your storage account is created

 

The next step is creating a container to store the blob files. Containers provide a logical grouping for blobs stored in the Windows Azure Blob service.
When you upload a backup to the Blob service, you must specify a container.

Click on the storage you have just created and select containers.
Click on the link “Create a Container”

In the new container window, type in the container name and select the appropriate Access type.
I’m choosing Private because I want to be the only one with access to my backup files.

As a result, the container is created. You will see that a new URL for the container is created. We will need that URL later on in Management Studio when taking the backup.
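If you prefer scripting, the storage account and the container can also be created with the Azure PowerShell cmdlets instead of the portal. A sketch; the account name, location and container name are placeholders.

# Create the storage account and fetch its primary access key
New-AzureStorageAccount -StorageAccountName "koherastorage" -Location "West Europe"
$key     = (Get-AzureStorageKey -StorageAccountName "koherastorage").Primary
$context = New-AzureStorageContext -StorageAccountName "koherastorage" -StorageAccountKey $key

# Create a private container for the backup files (-Permission Off = no public access)
New-AzureStorageContainer -Name "backupcontainer" -Permission Off -Context $context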

Now that the Azure side is configured, let’s open Management Studio and take a backup of our database.
Right click on the database → Tasks → Backup

 

As you can see, the destination options have changed a lot. You can choose to take a backup to disk or to URL.
Choose URL to take a backup to Windows Azure.

By default, Management Studio suggests a file name, but of course this can be changed.

We also have to create a SQL credential to store the Windows Azure storage authentication information in SQL Server.
Once it is created, you can reuse it.

Click on the button “Create” to create the credential
Specify the credential name.
You have to specify the management certificate, so that you can connect to Windows Azure.
If you don’t have the certificate, you can also connect to Azure by using the publishing profile file.
Go to https://manage.windowsazure.com/publishsettings/Index?client=vs&SchemaVersion=1.0 to download your configuration file and import it into your management studio by clicking the browse button.

Select the storage account to use and click the create button.

 

You should get a similar result to this one. Notice that the URL prefix is now the same as the URL of the container that you have just created.
If it is not the same, paste the container URL into the URL prefix field.

 

Click OK and your backup will start and finish successfully.
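For reference, the T-SQL that the wizard generates is essentially a SQL credential plus a BACKUP ... TO URL statement. A minimal sketch via Invoke-Sqlcmd; the instance, database, storage account, key, container URL and credential name are placeholders.

# Create the SQL credential that holds the storage account name and access key (one-time step)
Invoke-Sqlcmd -ServerInstance "MYSERVER\SQL2014" -Query @"
CREATE CREDENTIAL [AzureBackupCredential]
WITH IDENTITY = N'koherastorage',
SECRET = N'<your storage access key>';
"@

# Take the backup straight to the blob container
Invoke-Sqlcmd -ServerInstance "MYSERVER\SQL2014" -QueryTimeout 3600 -Query @"
BACKUP DATABASE [MyDatabase]
TO URL = N'https://koherastorage.blob.core.windows.net/backupcontainer/MyDatabase.bak'
WITH CREDENTIAL = N'AzureBackupCredential', FORMAT, STATS = 10;
"@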

Let’s go back to my Azure management page. When you check the container, you will see that a .bak file has been added.

Well, that wasn’t that difficult if you ask me 🙂

The next blog post will be about restoring the database from Windows Azure.