Remove Azure Subscriptions from SSMS

Last week I was preparing my pre-con for SQL Server Days.
I was playing around a lot with the new Azure features because that will be a big part of my pre-con.
When I was trying to deploy a database to my Azure VM with the new “Deploy database to Azure VM” wizard,
I noticed a lot of duplicate subscriptions that I could use. This was probably caused by all the testing I have done lately 😉
Adding a subscription is easy, but I couldn’t find any way to remove one from the drop-down box.

Double Subscriptions

As an MVP, you have the privilege of asking questions to other SQL MVPs and the Microsoft product team.
So I posted my question and quickly got a response from a Microsoft SQL Server program manager.
The solution is public information, so I’m happy to share it with you.

The references to the subscriptions are stored under the following directory:

%USERPROFILE%\Documents\SQL Server Management Studio\Deploy DB to Windows Azure VM


I removed them and went back to my deploy wizard, but I still had one subscription available.
Default file

Apparently, that is my active subscription.
To get rid of this active subscription, you need to update the deploymentsettings.xml file in the following folder:

%USERPROFILE%\AppData\Local\SQL Server\Deploy to SQL in WA VM

When you open the XML file, you will indeed see the subscription.

After removing the deploymentsettings.xml file, I got my “clean” drop-down box back.
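For repeatability, both cleanup steps can be combined in a small PowerShell sketch. The paths are the ones mentioned above; -ErrorAction SilentlyContinue is only there in case the files are already gone, and you will probably want to close SSMS first.

```powershell
# Remove the cached subscription references used by the deploy wizard
Remove-Item "$env:USERPROFILE\Documents\SQL Server Management Studio\Deploy DB to Windows Azure VM\*" `
    -Recurse -ErrorAction SilentlyContinue

# Remove the active subscription stored in deploymentsettings.xml
Remove-Item "$env:USERPROFILE\AppData\Local\SQL Server\Deploy to SQL in WA VM\deploymentsettings.xml" `
    -ErrorAction SilentlyContinue
```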

This is something you won’t need often, but it’s always good to know!


It’s hard to convince clients to use SQL 2014

SQL Server 2014 was released on the 1st of April, and it contains some really cool features that might help you and your organization and can definitely justify an upgrade. In my job as a SQL consultant, I have a lot of different customers, and none of them is currently working on SQL Server 2014 or planning to upgrade to this new version. I find it really hard to convince my clients to upgrade, or even to choose it for a brand new installation.

Why is it so hard to convince people to start using SQL Server 2014? After questioning my customers, I came up with these five reasons.

I don’t trust it

This is probably the most common argument from my customers. They don’t trust the new version because it has just been released and probably has a lot of bugs. Let’s wait till the first service pack has been released, then we know it is stable… Most of my clients tell me that there are just too many risks in installing an RTM version of SQL Server. “What if something happens and we’re not prepared for it?” I don’t think Microsoft would release a version of SQL Server that is not stable or not fully tested. It’s true that more bugs are found in the beginning of a new release, but I wouldn’t call the release unstable. Microsoft is still releasing service packs and cumulative updates for older versions of SQL Server. Does this mean that this software is unstable? Well… no… because you’re probably using it right now. Bug fixes are always part of software, whether the version is new or old.

Oh, is there a new release?

I’ve noticed that many people don’t know that a new version has been released. They just install the older version that they know. Or they have heard about the new release but are not aware of all the cool new functionality that could benefit their company. And that is where we, the SQL Family, come in: we try to make people aware of all these new features through SQL events, webinars, workshops, blogs,…

My vendor doesn’t support it yet…

Another reason I hear a lot is that the vendor of a software package (which uses a SQL Server database) does not support the new version yet. I even had to install a brand new SQL Server 2005 last month because the vendor didn’t support any other version… I told my customer to get rid of that software and find a good alternative. The vendor’s software will probably work on SQL Server 2014, but if you use it anyway, you lose full support on the software… Try the Upgrade Advisor and see if there is some exotic configuration that would no longer work. I doubt it… I usually send the report to my customer and the vendor. And I’ve noticed lately that vendors respond positively and promise to support SQL Server 2014 in their next major release.

It’s too expensive

Upgrading to a new version of SQL Server is a project that requires some investment: not only the license cost, but also new hardware, re-coding your application, testing… Many customers complain that Enterprise Edition has become really expensive, but do they really need Enterprise Edition? From SQL Server 2008 onwards, you can use the DMV sys.dm_db_persisted_sku_features to view all Enterprise-Edition-specific features that are enabled in the current database. If they do need Enterprise Edition, it might be worth starting a consolidation project instead of an upgrade project to save license costs.
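As a quick sketch (the database name is just an example), run the DMV query in the context of each user database; if it returns no rows, no Enterprise-only features are persisted in that database:

```sql
USE [YourDatabase];  -- example database name
SELECT feature_name, feature_id
FROM sys.dm_db_persisted_sku_features;
-- No rows returned = no Enterprise-only features in use in this database
```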

The upgrade can also save you money: new features like page and row compression or Resource Governor can save resources, and the new T-SQL improvements can save your developers a lot of time.
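For example, page compression can be enabled with a single rebuild; the table name below is hypothetical:

```sql
-- Rebuild an example table with page compression (an Enterprise Edition feature)
ALTER TABLE dbo.SalesOrderDetail  -- hypothetical table name
REBUILD WITH (DATA_COMPRESSION = PAGE);
```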

No time, too much work

I hear this a lot… “Upgrade…? Sorry, but we don’t have time for that… Why should we upgrade? The database is working just fine and we have more important things on our agenda.” I don’t think it really has anything to do with “no time”. Some people are just not motivated to do an upgrade. It’s hard to convince people who are not willing to learn and not open to new ideas. And maybe that’s because of the points mentioned above: they don’t trust the software, and it will cost the company a lot of money.

I don’t know if I’m the only one experiencing this :) But I hope that I can change the perception of my clients… it’s a tough job… believe me…

Remove SQL Backup Files from your Azure Storage

Microsoft introduced a lot of new Azure features in SQL Server 2014, which was released this week. It has become really easy to store your backup files on Windows Azure (or should I say Microsoft Azure 😉?).
You can check my previous blog post on how to take a backup to Windows Azure. This all works really great, and yesterday I introduced this new feature to one of my customers, who wanted to go ahead with it! While I was driving home, I was thinking about how to implement this in a good way. My customer didn’t want to use Smart Backup, because they wanted to stay in control of the backup process. Scheduling the backups with the SQL Server Agent looks like the appropriate solution here: just like taking backups to disk, but with URL as the destination.

When you take backups to disk, you also have to schedule a cleanup job for the old backup files. The same goes for Azure storage, unless you want to pay a lot of money for it… 😉
But how can I clean up the backups in my Azure storage container? I decided to use PowerShell to clean up the backup files.

First of all, you have to install the Azure PowerShell cmdlets. More information can be found here. Once the cmdlets are installed, don’t forget to connect to your subscription. Execute the Get-AzureSubscription cmdlet in the Windows PowerShell ISE to verify the connection.

Log in to your Azure portal and copy the access key of your storage account, because you will need it in the script where it says “Paste your storage access key here“.

In my script, I’m using Get-AzureStorageBlob to get the blobs out of my backup container. I check the LastModified date of each blob with extension *.bak or *.log. If it’s older than my retention period (3 days for full backups and 1 day for transaction log backups), I remove the backup file with the Remove-AzureStorageBlob cmdlet.

$CleanupTime = [DateTime]::UtcNow.AddHours(-72)
$context = New-AzureStorageContext -StorageAccountName koherastorage -StorageAccountKey {Paste your storage access key here}
Get-AzureStorageBlob -Container "sqlbackup" -Context $context |
Where-Object { $_.LastModified.UtcDateTime -lt $CleanupTime -and $_.BlobType -eq "PageBlob" -and $_.Name -like "*.bak" } |
Remove-AzureStorageBlob
$CleanupTime = [DateTime]::UtcNow.AddHours(-24)
Get-AzureStorageBlob -Container "sqlbackup" -Context $context |
Where-Object { $_.LastModified.UtcDateTime -lt $CleanupTime -and $_.BlobType -eq "PageBlob" -and $_.Name -like "*.log" } |
Remove-AzureStorageBlob

You can easily schedule this with a SQL Server Agent job; just choose PowerShell as the job step type.
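As a sketch, the Agent job can also be created with T-SQL; the job name and script path below are hypothetical:

```sql
USE msdb;
GO
-- Create a job with a single PowerShell job step that runs the cleanup script
EXEC dbo.sp_add_job @job_name = N'Cleanup Azure backup files';
EXEC dbo.sp_add_jobstep
     @job_name  = N'Cleanup Azure backup files',
     @step_name = N'Run PowerShell cleanup',
     @subsystem = N'PowerShell',
     @command   = N'& "C:\Scripts\Remove-AzureBackups.ps1"';  -- hypothetical script path
-- Register the job on the local server (add a schedule as needed)
EXEC dbo.sp_add_jobserver @job_name = N'Cleanup Azure backup files';
```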

I know this script can be fine-tuned, and any suggestions are appreciated.


Careful with autogrowth!

A few months ago I was invited by a customer to do an audit of their SQL Servers.
One of the issues I found was the “implementation” of the autogrowth settings, like growing by 1 MB or growing by a percentage.
This week, they invited me back to perform the corrective actions for the issues that I had found.
I was manually modifying the autogrowth setting of one of the databases and I made a mistake… Instead of changing the MB value, I accidentally changed the % value to 1024. To my surprise, SSMS accepted this value.
I expected that the percentage value couldn’t go above 100%. Wrong guess… 😉
My mistake triggered me to do some testing on the autogrowth setting.

Let’s see if it actually grows by 1024%.
I created a simple database with an initial data file size of 10 MB and changed the autogrowth to 1024%. (Don’t mind the autogrowth setting of the log file.)
I’ve done my tests on a SQL 2012 and on a SQL 2014 instance.
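The test database can be reproduced with a script along these lines (the file path is just an example):

```sql
-- Create a 10 MB database with the percentage autogrowth under test
CREATE DATABASE [testgrowth]
ON PRIMARY
( NAME = N'testgrowth',
  FILENAME = N'C:\Data\testgrowth.mdf',  -- example path
  SIZE = 10MB,
  FILEGROWTH = 1024% );
```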

After loading some data into my database, my file had actually grown by 1024%!

(10 MB x 1024%) + 10 MB (initial size) = 112.4 MB!

Let’s see if we can push it a little bit further. What can be the maximum value of the percent growth?
Well, it turns out that if you enter a large number like 9999999999 in the GUI, it will change it to 2147483647, which is the maximum value of a (signed 32-bit) integer.
I tried the same with a T-SQL statement:

ALTER DATABASE [testgrowth] MODIFY FILE ( NAME = N'testgrowth', FILEGROWTH = 9999999999%)

and I got the error:
Incorrect syntax near ‘9999999999’.

When I changed the 9999999999 into 2147483647%, my query completed successfully! As soon as you enter a number higher than 2147483647, regardless of whether you put KB, MB, GB, %,… after it, SSMS sees it as an “Incorrect syntax”.

So now we know the limits for %, but are they the same when you choose autogrowth in MB?
Let’s give it a try. I entered 9999999999 into the MB autogrowth field, and SSMS automatically changed it into 1048576 MB. It looks like autogrowth in MB is limited to 1 TB, and not to 2147483647 like the percentage growth. It’s a completely different value.
If I try a T-SQL statement:

ALTER DATABASE [testgrowth] MODIFY FILE (NAME = N'testgrowth', FILEGROWTH = 2147483647MB)

(a larger number gives me the incorrect syntax error again), I got the following error message:

Msg 1842, Level 16, State 1, Line 1

The file size, max size cannot be greater than 2147483647 in units of a page size. The file growth cannot be greater than 2147483647 in units of both page size and percentage.

Let’s see what happens with a T-SQL statement that uses a value larger than 1048576 MB (the GUI limit) and smaller than 2147483647 (the T-SQL limit).

ALTER DATABASE [testgrowth] MODIFY FILE (NAME = N'testgrowth', FILEGROWTH = 2097152 MB)

completes successfully.
Hey! Wasn’t that limited to 1TB by the GUI?

Let’s open the GUI again to see how SSMS deals with it. Well… not quite right… 😉

The conclusion here is that the GUI and the T-SQL statement use different limits, which should be corrected.

I’ve opened a Connect item for this issue.

Be careful when you change the autogrowth settings, because when you make mistakes, the results can be quite strange.

Speaking at SQL Saturday Slovenia #274

On the 8th of November, I delivered a session (together with Wesley Backelant) at SQL Server Days about the new AlwaysOn functionality in SQL Server 2014. I love to talk about SQL Server, but unfortunately I have only been able to speak at local Belgian events like SQL Server Days, Community Day, Microsoft Discovery Day… I had already tried a few times to speak at other international events like SQLBits or the PASS Summit, but I never got selected… I didn’t give up, and I wanted to see what my chances were of presenting at a SQL Saturday event. So I submitted a few sessions for SQL Saturday in Slovenia and guess what: I got selected!! It will be my first international event, and I’m really looking forward to meeting the Slovenian SQL people and the other speakers. Special thanks go to my current employer Kohera for supporting me!

Harder, Better, Stronger and More Secure
In the first part of my session, I’m going to talk about the new AlwaysOn functionality in SQL Server 2014, including a live demo of adding a replica in Windows Azure. The second part will be about the new backup features in SQL Server 2014, like backup to URL, managed backup, and encryption. I will not focus on the basic AlwaysOn and backup capabilities, but on the SQL Server 2014 investments in the hybrid cloud area.

More information about the event can be found on the SQL Saturday Slovenia website.

Hope to see you all there!

Restore a database from Windows Azure with SQL Server 2014 CTP2

In my previous blog post I explained how you could take a backup of a database to Windows Azure Storage with SQL Server 2014 CTP2.
In this blog post I’m going to explain how you can restore your database from Windows Azure.

If your database is still available on your server, right-click the database and select Tasks → Restore → Database.


Apparently, Management Studio knows that a backup has been taken to Windows Azure, and it opens the “Connect to Windows Azure Storage” window.

First of all, you need to fill in the Storage Account. That will be the one you created when taking the backup to Windows Azure.
To find the correct storage account, go to your Windows Azure web site, select Storage, and choose the correct storage account.
In my example here, this will be “koherastorage”.


You also have to fill in the Account Key that is linked to your storage account. To access stored content, applications (and also SSMS) use one of two 512-bit storage access keys.
Click on the button in the main storage window, and a new window with the access keys will open.

Copy the primary access key to your clipboard and paste it into the Account Key field of the “Connect to Windows Azure Storage” window in SSMS.
You will notice that the key is masked with *stars*.

Select the correct credential to connect to Windows Azure. You can use the credential that you have created while creating the backup.

Click the Connect button, and the restore window will appear as we know it from previous SQL Server versions.


Select the backup sets that you want to restore and click OK to start the restore.
At the top, you can see that a tail-log backup will be taken. And yes indeed, it’s also stored in my Windows Azure storage container!
You should get a result similar to this 😉


As you can see, restoring a backup from Windows Azure is as simple as restoring a database from local disk.
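For reference, the same restore can also be scripted; the database name, container URL and credential name below are examples, so replace them with your own:

```sql
RESTORE DATABASE [Demo]  -- example database name
FROM URL = N'https://koherastorage.blob.core.windows.net/sqlbackup/Demo.bak'
WITH CREDENTIAL = N'AzureCredential',  -- the credential created for the backup
     RECOVERY;
```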
However, there is still one question that keeps me busy: how will I back up or restore my database if I don’t have an internet connection available…?

Have fun!




Backup to Windows Azure with SQL Server 2014 CTP2

In SQL Server 2014, you can take a backup of your database to Windows Azure storage.
CTP2 has recently been released, and it is now possible for you as a DBA to create a manual backup to Azure in SQL Server Management Studio.
In this blog post, I’m going to show you how easy it is to do this.

Before you can start creating a backup, you have to create a storage account to store it in.
Go to your Windows Azure management page.

Click on Storage and select New in the bottom-left corner.



Now select Data Services → Storage → Quick Create.
Type in the URL name that you want to use and select the location where you want to create the storage.


When all the data is filled in, click the create button.

As a result, your storage account is created.


The next step is creating a container to store the blob files. Containers provide a logical grouping for blobs stored in the Windows Azure Blob service.
When you upload a backup to the Blob service, you must specify a container.

Click on the storage account you have just created and select Containers.
Click on the link “Create a Container”.

In the new container window, type in the container name and select the appropriate access type.
I’m choosing Private, because then I’m the only one who has access to my backup files.

As a result, the container is created. You will see that a new URL for the container has been created; we will need it later on in Management Studio when taking the backup.

Now that the Azure side is configured, let’s open Management Studio and take a backup of our database.
Right-click the database and select Tasks → Backup.


As you can see, the destination options have changed a lot. You can choose to take a backup to disk or to URL.
Choose URL to take a backup to Windows Azure.

By default, Management Studio suggests a file name, but of course this can be changed.

We also have to create a SQL credential to store the Windows Azure storage authentication information in SQL Server.
Once it is created, you can reuse it.
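Under the covers this is a regular SQL Server credential, so you can also create it with T-SQL (the names are examples; the secret is one of your storage access keys):

```sql
CREATE CREDENTIAL [AzureCredential]   -- example credential name
WITH IDENTITY = 'koherastorage',      -- your storage account name
     SECRET   = '<paste your storage access key here>';
```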

Click on the “Create” button to create the credential.
Specify the credential name.
You have to specify the management certificate, so that you can connect to Windows Azure.
If you don’t have the certificate, you can also connect to Azure by using the publishing profile file.
Download your publishing profile file and import it into Management Studio by clicking the browse button.

Select the storage account you want to use and click the create button.


You should get a result similar to this one. Notice that the URL prefix is now the same as the URL of the container you just created.
If it is not the same, paste the container URL into the URL prefix field.


Click OK and your backup will start and finish successfully.
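The wizard generates a plain BACKUP statement behind the scenes; a scripted equivalent looks roughly like this (database name, URL and credential name are examples):

```sql
BACKUP DATABASE [Demo]  -- example database name
TO URL = N'https://koherastorage.blob.core.windows.net/sqlbackup/Demo.bak'
WITH CREDENTIAL = N'AzureCredential',
     COMPRESSION, STATS = 10;
```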

Let’s go back to my Azure management page. When you check the container, you will see that a .bak file has been added.

Well, that wasn’t that difficult, if you ask me 😉.

Next blog post will be the restore of the database from Windows Azure.