Remove Backup files from Azure Storage Part 2

In April this year, I posted a script to delete your backup files from Windows Azure Storage.
This week, I got a question from Mark about how to use this script in a SQL Server Agent job on an Azure VM. I mentioned in my previous blog post that you need to download the Azure PowerShell cmdlets before you can use the script. But Mark kept getting the following error when he tried to download the installation file.


In this blog post I will explain in more detail how you can configure the script correctly in a SQL Server Agent job and how to install the Azure PowerShell cmdlets.

Start up your VM on Azure and connect to the instance with SSMS.
Before we install anything extra on the VM, let’s configure a job with the removal script that you can find in my previous post. Don’t forget to change the storage account, the access key and the container in the script to match your configuration.
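For reference, the removal logic from that earlier post boils down to something like the sketch below. The storage account name, access key, container name, file extension and the 7-day retention period are all placeholders that you must change to match your configuration:

```powershell
# Sketch of the backup cleanup script (account, key, container and retention are placeholders)
$context = New-AzureStorageContext -StorageAccountName "yourstorageaccount" `
                                   -StorageAccountKey "yourstorageaccountkey"
$cutoff = (Get-Date).AddDays(-7)   # keep 7 days of backups

# Delete every backup blob in the container that is older than the cutoff date
Get-AzureStorageBlob -Container "yourcontainer" -Context $context |
    Where-Object { $_.Name -like "*.bak" -and $_.LastModified.DateTime -lt $cutoff } |
    ForEach-Object { Remove-AzureStorageBlob -Blob $_.Name -Container "yourcontainer" -Context $context }
```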
Your job step should look like this. Choose Type = PowerShell, not T-SQL.


When you execute the job manually, you will probably get the following error message: The term ‘New-AzureStorageContext’ is not recognized as the name of a cmdlet…


This is because we haven’t installed the Azure PowerShell cmdlets yet. You can download the cmdlets here. Now, when you try to download them on the Azure VM, you will get the following error:


This is because your browser settings don’t allow you to download files. To fix this, open your Internet Explorer settings and select Internet Options.


Select the Security tab and click the Custom level button.

Scroll down to Downloads and select the option Enable under “File download”


Click OK, close the settings and restart your browser. Now you should be able to download the file to install the Azure PowerShell cmdlets.


Save and run the file.

First of all you will get this nice welcome screen…

Wait until it is finished and you see the next screen.
Just click Install

You will see that it installs Microsoft Azure PowerShell. Click “I accept” on the next screen

…and wait till the installation is finished

Click finish at the end

In the final screen you should actually see that the Azure Powershell Cmdlets are installed

Ok… that’s it!!!

Go back to the SQL Agent job and execute it again. The job should now execute successfully!

To be 100% sure that it’s working, go to the Management Portal of Windows Azure and check the container to verify that the files have been removed.
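If you prefer to verify from PowerShell instead of the portal, a quick check is to list what is left in the container with the same cmdlets the job uses (account, key and container below are placeholders):

```powershell
# List the remaining blobs in the container (placeholder account, key and container)
$context = New-AzureStorageContext -StorageAccountName "yourstorageaccount" `
                                   -StorageAccountKey "yourstorageaccountkey"
Get-AzureStorageBlob -Container "yourcontainer" -Context $context |
    Select-Object Name, LastModified
```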

Have fun with it!

Cloud Witness in Windows Server v.Next

Last week, I attended the session “Did you vote today? A DBA’s guide to cluster quorum” at PASS Summit, by Allan Hirt.
Interesting stuff, and I learned that my quorum configuration is already done correctly according to what Allan explained.

Another interesting part was that Allan announced that there is a new type of quorum in Windows Server v.Next.

Cloud Witness!

Instead of using a File Share Witness or Disk Witness, you can now also choose Cloud Witness.
Basically it’s going to create a file on your Azure Storage which counts as a quorum vote.
This cloud witness can be really helpful in case of multi-site clusters.
With multi-site clusters, there is always the question in which datacenter you are going to put the file share witness.
In fact, to configure it correctly, you should put your file share witness in a third datacenter. But that’s really too expensive for most companies just for a file share witness. The cloud witness provides a good alternative for this problem.

The only “issue” that I see with this cloud witness is that your cluster nodes must have a connection to the internet. And honestly, I haven’t seen many SQL Servers or cluster nodes that are able to connect to the internet.
But with the appropriate firewall settings, you should be OK.

I’ve already installed a 2-node Windows Server vNext failover cluster and tried it out.
It’s actually really easy to configure.

Start with opening the Failover Cluster Manager and connect to your cluster.
Right-click the cluster name → More Actions → Configure Cluster Quorum Settings…

In the “Select Quorum Configuration Option” window, select “Select the quorum witness”

In the “Select Quorum Witness” window, select “Configure a cloud witness”

To configure the cloud witness, you need to specify your storage account and your Azure storage account key.

This information can be found in the Azure Portal. Just go to the storage option. At the bottom of your screen you will see a button called Manage Access Keys.

Click that button, copy one of the two keys and paste it into the Azure storage account key field of the cloud witness configuration.

Your configuration should look similar to the screenshot below.

Finally, complete the wizard and if all went well you have now configured your cloud witness.
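For completeness: the same configuration can also be scripted. A sketch using the FailoverClusters PowerShell module, assuming the cmdlet surface stays like this in the final release (storage account name and key are placeholders):

```powershell
# Configure a cloud witness from PowerShell (account name and key are placeholders)
Set-ClusterQuorum -CloudWitness -AccountName "yourstorageaccount" -AccessKey "yourstorageaccountkey"
```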

When you look at your storage account in the Azure Portal, you will notice that a new container, “msft-cloud-witness”, is created with 1 blob file inside.

Pretty cool if you ask me :D !

Pieter

 

PASS Summit: My First Time

Only 2 weeks left until PASS Summit and believe it or not, it will be my first time at PASS Summit.
I was awarded MVP for the first time last year, and the MVP Summit is just prior to the PASS Summit event. This is an ideal combination and makes it affordable for me to go to Seattle and combine the two events!

I’m really looking forward to PASS Summit. I don’t really know what to expect, so I’ve already read some good blog posts that explain a lot to first-timers.

The Summit website also has a really good explanation for first-timers, and Denny Cherry recorded a webinar where he explains how to get to Seattle, what you can do as a tourist, the Convention Center,… PASS is also organizing a “First Timer’s Orientation Meeting & Speed Networking” session that will bring the first-timers up to speed. I will attend that meeting for sure ;)

What I definitely know is that I will again meet a lot of people from the SQL Server community whom I met before while speaking at SQL Saturdays and at SQL Server Days in Belgium. It will be fun, believe me!

I’m a member of the SQLUG in Belgium, and each year we organize SQL Server Days, the biggest SQL event in the BeNeLux. This year was already our 7th edition, and it was a great success.
Although we got really good feedback from our speakers and attendees, you always want to push your event to a higher level. So I will keep my eyes and ears wide open during these 3 days, and I’m sure I will pick up some great new ideas that we can use at SQL Server Days next year.

This year I also started submitting sessions for SQL Saturdays in Europe. I got selected to speak in Slovenia, Exeter, Oslo and Holland, and I even did my first pre-con at SQL Server Days.
I also submitted my sessions for PASS Summit. It would have been great to speak at “the Rock Werchter” of all SQL events, but unfortunately, none of my sessions got selected…

No worries, I got really good feedback from PASS, so maybe I’ll be selected next year…

But… then… Mr. Denny announced his Speaker Idol competition. Let me quote Denny on what this contest is all about:

“The goal behind the Speaker Idol competition is to give speakers who are known in their region a chance to be seen at the international level so that they can improve their chances of being selected to speak in future years, to get some real time feedback from other speakers about their presentation style, and so that they can improve as speakers.”

THAT’S ME! I didn’t hesitate and registered for the competition. And guess what… I’m one of the 12 contestants! I really appreciate this opportunity. I’m speaking (for 5 minutes) on Wednesday, so let’s hope I can convince the “jury” and win the contest. Wish me luck :-)

I’ve heard that there are also a lot of other activities during the evenings and nights, like a SQL Long Run, SQL Karaoke, Kilt Thursday… More info can be found at http://www.sqlpass.org/summit/2014/Connect/AfterHours.aspx

I’m definitely going to the SQL karaoke! Not that I’m a really good singer, but it sounds like fun :-)

Well… let my US adventure begin and see you all within 2 weeks in Seattle at PASS SUMMIT 2014!

Pieter

Enable a SQL Job Unattended

Today a customer asked me to disable a daily load on their test database. I disabled the job, but my problem was that I couldn’t enable it again afterwards, because I’m only at that customer once a week…
I found this little trick, which I want to share with you, to re-enable the daily job unattended.

First of all, I scripted the enabling of the “daily job”. Just open the job properties → check Enabled and hit the Script button.

USE [msdb]
GO

EXEC msdb.dbo.sp_update_job
@job_id=N'bfd08f62-4074-4f83-bebf-c9d630037595',
        @enabled=1

Secondly, I scheduled a new job “Enable daily job” that will activate the daily job the day after it has been disabled.

The job step will be the T-SQL statement that you have scripted.
You need to schedule this job after the time the daily job would normally have run (while it is disabled), but before its next run. Make sure that the schedule type is set to “One Time”, as we only want to activate the job once ;)

Select Notifications and activate the checkbox “Automatically delete job: When the job succeeds”. We don’t need the “Enable daily job” anymore once the daily job has been re-enabled.
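If you prefer to script the whole thing instead of clicking through the GUI, the one-time, self-deleting job can also be created in T-SQL. A sketch, where the job name, run date and run time are placeholders you must adapt:

```sql
-- Sketch: create the one-time, self-deleting "Enable daily job" in T-SQL
-- (run date and time below are placeholders)
USE [msdb]
GO
EXEC dbo.sp_add_job
    @job_name = N'Enable daily job',
    @delete_level = 1;    -- automatically delete this job when it succeeds

EXEC dbo.sp_add_jobstep
    @job_name = N'Enable daily job',
    @step_name = N'Enable the daily job',
    @subsystem = N'TSQL',
    @command = N'EXEC msdb.dbo.sp_update_job @job_id = N''bfd08f62-4074-4f83-bebf-c9d630037595'', @enabled = 1';

EXEC dbo.sp_add_jobschedule
    @job_name = N'Enable daily job',
    @name = N'Run once',
    @freq_type = 1,                    -- "One Time" schedule
    @active_start_date = 20141015,     -- placeholder run date (yyyymmdd)
    @active_start_time = 60000;        -- placeholder run time (hhmmss, 06:00:00)

EXEC dbo.sp_add_jobserver @job_name = N'Enable daily job';
GO
```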

Easy to configure, the daily job is re-enabled unattended, and the solution cleans itself up automatically. Works like a charm, and it’s one less reminder in my agenda!

 

How I completely remove a SQL Server database – the right way!

I was shocked today!
I found this blog post on how you should remove a SQL Server database. I didn’t know whether I should start crying or laughing…
This blog post might seem ridiculous to some of you, but apparently it can be useful after all.

There are two ways to delete a database:

GUI

In SQL Server Management Studio, you can right-click the database and select “Delete”.

In the Delete Object window, select the option “Delete backup and restore history information for databases” if you want to remove this information.
If you want to kick out open connections to your database, select “Close existing connections”. It will be impossible to remove the database if you don’t select this last option while there are still open connections to your database: you will receive an error that the database is still in use and cannot be deleted.

Once you hit the OK button, the database will be removed from the SQL instance and the database files at the OS level will be removed as well. It’s definitely not necessary to shut down the whole instance to remove a database…

T-SQL

The script below does exactly the same as the actions taken in the GUI, with the same result: the database will be removed!

--This will remove the backup history
EXEC msdb.dbo.sp_delete_database_backuphistory @database_name = N'DBToDelete'
GO
USE [master]
GO
--This will drop all the open connections
ALTER DATABASE [DBToDelete] SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
USE [master]
GO
--This will remove my database
DROP DATABASE [DBToDelete]
GO

Now, after the removal, you still have some extra cleanup to do that people often forget…

Delete the jobs

Delete the jobs that were related to the database. If you don’t remove them, the jobs will fail and you will receive unnecessary alerts.

Delete the backup files

If you don’t need the backup files anymore, just remove them. However, I would recommend keeping the last full backup of the database and archiving it for at least a year or two. You never know whether somebody will need the data in the future…

Delete the logins without a DB user

Your database probably had some database users configured that were linked to a server login.
If that server login isn’t used for any other database user and isn’t a member of any server role besides public, I would recommend removing that login. Not only for security reasons, but also to keep your server clean.
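A starting point for finding such candidate logins is the sketch below: it collects the user SIDs of every database and then lists the logins that have no matching database user and no server role membership besides public. Treat the result as a list to review, not to drop blindly (sp_MSforeachdb is an undocumented but widely used procedure):

```sql
-- Sketch: logins without a database user in any database and without
-- membership in any server role besides public. Review before removing!
CREATE TABLE #dbusers (sid varbinary(85));
EXEC sp_MSforeachdb
    N'INSERT #dbusers SELECT sid FROM [?].sys.database_principals WHERE sid IS NOT NULL;';

SELECT sp.name
FROM sys.server_principals sp
WHERE sp.type IN ('S', 'U', 'G')                      -- SQL login, Windows login, Windows group
  AND sp.sid NOT IN (SELECT sid FROM #dbusers)        -- no database user anywhere
  AND NOT EXISTS (SELECT 1 FROM sys.server_role_members rm
                  WHERE rm.member_principal_id = sp.principal_id);  -- no server role membership

DROP TABLE #dbusers;
```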

Well, I hope this can help some junior DBAs, and that they find my blog post instead of the other one ;)

Pieter

Remove Azure Subscriptions from SSMS

Last week I was preparing my pre-con for SQL Server Days.
I was playing around a lot with the new Azure features because that will be a big part of my pre-con.
When I was trying to deploy a database to my Azure VM with the new “Deploy database to Azure VM” wizard,
I noticed that there were a lot of duplicate subscriptions that I could use. This was probably caused by the testing I have done lately ;)
Adding a subscription is easy, but I couldn’t find any way to remove a subscription from the drop-down box.

Double Subscriptions

As an MVP, you get the privilege of asking questions to other SQL MVPs and the Microsoft product team.
So I posted this question and quickly got a response from a Microsoft SQL Server program manager.
The solution is public information, so I’m happy to share it with you.

The references to the subscriptions are stored in the following directory:

%USERPROFILE%\Documents\SQL Server Management Studio\Deploy DB to Windows Azure VM

Files

I removed the files and went back to my deploy wizard, but there was still one subscription available.
Default file

Apparently, that is my active subscription.
To get rid of this active subscription, you need to update the deploymentsettings.xml file in the following folder:

%USERPROFILE%\AppData\Local\SQL Server\Deploy to SQL in WA VM

When you open the XML file, you will indeed see the subscription.

After removing the deploymentsettings.xml file, I got my “clean” drop-down box back.
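If you want to script the cleanup instead of deleting the files by hand, something like this should do it (the paths are the ones from this post; test on your own machine first):

```powershell
# Remove the cached subscription files and the active deployment settings
Remove-Item "$env:USERPROFILE\Documents\SQL Server Management Studio\Deploy DB to Windows Azure VM\*" -ErrorAction SilentlyContinue
Remove-Item "$env:USERPROFILE\AppData\Local\SQL Server\Deploy to SQL in WA VM\deploymentsettings.xml" -ErrorAction SilentlyContinue
```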

This is something that you won’t need often, but it’s always good to know!

Pieter

Speaking at SQL Server Days 2014

On the 30th of September and the 1st of October, the Belgian SQL Server User Group is organizing SQL Server Days for the 7th time.
Over the last few years, SQL Server Days has become better and better known in the SQL Server community, and that’s clear if you look at the impressive conference schedule!

This year SQL Server Days 2014 will again be a two-day conference.
On the first day, there are deep dive trainings given by international and national speakers.
The second day will be the main technical conference for database administrators, developers and business intelligence professionals!

4 years ago I went to SQL Server Days as an attendee; the 3 years after that, I was selected to give a regular session.
But this year, I’m really honored to present my first pre-con: “SQL Server 2014: What’s in it for you?”
I recently wrote a blog post about why it’s so hard to convince people to upgrade to SQL Server 2014.
Well, in this pre-conference, I will dig into the new features of SQL Server 2014.
I will put a strong focus on the new Windows Azure and hybrid cloud functionalities and of course other features
like In-memory OLTP engine, Buffer Pool Extension, Enhanced Windows Server 2012 Integration, and many more…
The session will be bulk loaded with demos and hopefully I can convince people to upgrade ;)

Again, I’m really honored to be part of the event and really looking forward to meeting some other fine SQL community speakers like
Grant Fritchey, Bob Duffy, Denny Cherry, Joey d’Antoni and many more!
SQL Server Days 2014 is hosted at San Marco Village, which is an excellent venue close to Antwerp.
You really don’t want to miss this event and the opportunity for networking with the Microsoft Data Platform professionals!

You can still register here.

Well, I hope to see you all at SQL Server Days! Really looking forward to it!

Last known good DBCC CHECKDB not always correct

On the internet you can find a lot of methods to find out when DBCC CHECKDB last ran on your databases. The 2 most common queries are based on DBCC DBINFO() and DBCC PAGE.
Below you can find 2 examples (replace “YourDatabase” with the correct database name):

USE YourDatabase
GO
CREATE TABLE #tmp
	(ParentObject varchar(1000) NULL,Object varchar(1000) NULL,Field varchar(1000) NULL,Value varchar(1000) NULL)
insert into #tmp
EXEC ('DBCC DBINFO() WITH TABLERESULTS')
select cast(value as datetime) from #tmp where field='dbi_dbcclastknowngood'
drop table #tmp

OR

DBCC TRACEON (3604); 
GO
DBCC PAGE (YourDatabase, 1, 9, 3);
GO

In my morning DBA checklist, I’m using the first query to check whether there has been a database integrity check in the last 7 days. This query works well; however, last week I noticed an issue. As soon as a database is in read-only or emergency mode, the DBCC CHECKDB command will succeed, but the dbi_dbcclastknowngood field will (or can) not be updated. For that reason, my check indicates a failed status for my read-only databases…

Let’s take an example. First of all, I’m going to perform a DBCC CHECKDB on my AdventureWorks database.

DBCC CHECKDB(AdventureWorks2012)

Once the check is finished, let’s verify the last dbcc check date with the DBCC DBINFO() query.

DBCC

Now, let’s set the database to read-only and do the test again.

DBCC read only

As you might notice, the last DBCC CHECKDB time is the same in both cases.

I found a good tip (in an old blog post) from Jonathan Kehayias to find the correct information: just scan the SQL error log for the last 7 days. If I check the logs from my previous tests, you can see that there was a DBCC CHECKDB at 14:57:22, which was on my read-only database.

Error Log

I’ve modified Jonathan’s script: I’m using xp_readerrorlog and I’m no longer checking for the SQL 2000 version… I’ve also replaced the DBCC check in my DBA checklist with the query below.

CREATE TABLE #ErrorLog
	(
	  Logdate datetime ,
	  ProcessInfo VARCHAR(12) ,
	  MsgText VARCHAR(3000)
	)
CREATE TABLE #NumberOfErrorLogFiles
	(
		ArchiveID tinyint,
		FileDate datetime,
		FileSize int
	)
insert into #NumberOfErrorLogFiles
EXEC master..xp_enumerrorlogs

DECLARE @SinceDate datetime  
SET @SinceDate = DATEADD(dd, -7, GETDATE())
DECLARE @ErrorFile tinyint
DECLARE @Severity tinyint
DECLARE @SeverityString nvarchar(3)
DECLARE @LastLogFile tinyint
select @LastLogFile=min(ArchiveID) from #NumberOfErrorLogFiles
where FileDate<@SinceDate
DECLARE @DBCCStatement nvarchar(1024)
set @DBCCStatement=N'DBCC CHECKDB (' + DB_NAME() + ')'

set @ErrorFile=0
while @ErrorFile<=ISNULL(@LastLogFile,0)
begin
    insert into #Errorlog
	EXEC master.dbo.xp_readerrorlog @ErrorFile, 1, @DBCCStatement, NULL, @SinceDate, NULL, N'ASC'
	set @ErrorFile=@ErrorFile+1
end

select * from #ErrorLog where Logdate>@SinceDate

drop table #ErrorLog
drop table #NumberOfErrorLogFiles

Have fun!

Pieter

Speaking at SQL Saturday Oslo #317

To end the holiday period in Belgium in style, I’m going on a trip to Oslo.
Not on vacation, but to speak at the first edition of SQL Saturday Oslo!
Well, in some ways, this might be considered a small vacation ;)

I’m very honored and excited that my session “How to manage many SQL Servers the easy way” has been selected.
It was a really impressive list of submitted sessions, but mine made it!
It’s the second time that I’ll be presenting this session at a SQL Saturday.
The first time was at SQL Saturday Exeter a few months ago, where I got really good feedback!
The purpose of this session is to point out which SQL Server tools you can use for multi-server management.
It is bulk loaded with demos and will give you a good idea of which features can be helpful in your environment.

I’m really looking forward to meeting Cathrine Willemsen (twitter | blog) and Tone Hansson (twitter | blog), the organizers of the event, who are doing an excellent job!
A lot of other great SQL names are on the schedule. I’m looking forward to meeting some new speakers like Tim Ford, Bob Duffy, Niko Neugebauer, Régis Baccaro, Kalen Delaney and many others.
And of course the other international speakers I have met before, like Boris Hristov, Matija Lah, André Kamman, Marco Russo, …

Hope to see you all there!

Tracking login and user changes in SQL Server

Did you ever want to track SQL Server login and database user changes on all your SQL Servers? At one of my customers, the IT guys are sysadmins on the SQL Servers. In fact, the IT department manages the SQL Servers when I’m not around. One day, “somebody” removed a user and, of course, nobody did it ;)

The first thing that came to mind was implementing SQL Server Audit. A good solution, but not all versions of SQL Server support SQL Server Audit: SQL Server 2008 and 2008 R2 support it only in Enterprise Edition, and there is no audit at all in SQL Server 2005. So I had to come up with a solution that works on these versions as well. My solution is based on this blog post by Aaron Bertrand (twitter | blog), but I’ve extended it.

Server Audit

I’ve implemented SQL Server Audit on all SQL Server 2008 and R2 Enterprise servers and on the SQL 2012 servers. Unfortunately no SQL 2014 yet, but you should implement SQL Audit on this version as well. The audit dumps all the audit information into an audit log file on the server.

USE [master]
GO

CREATE SERVER AUDIT [Audit Login Changes]
TO FILE 
(	FILEPATH = N'ENTERYOURFILEPATHHERE'
	,MAXSIZE = 1024 MB
	,MAX_FILES = 10
	,RESERVE_DISK_SPACE = OFF
)
WITH
(	QUEUE_DELAY = 1000
	,ON_FAILURE = CONTINUE
	,AUDIT_GUID = '33e055a9-1924-47c3-9798-e8875a5bcd42'
)
ALTER SERVER AUDIT [Audit Login Changes] WITH (STATE = ON)
GO

 

I defined the following audit action types because I wanted to know about all changes made to logins, roles and users.

DATABASE_ROLE_MEMBER_CHANGE_GROUP: Raised whenever a login is added to or removed from a database role
SERVER_ROLE_MEMBER_CHANGE_GROUP: Raised whenever a login is added to or removed from a fixed server role
DATABASE_PERMISSION_CHANGE_GROUP: Raised for any database permission change
SERVER_OBJECT_PERMISSION_CHANGE_GROUP: Raised whenever a GRANT, REVOKE, or DENY is issued for a server object permission
SERVER_PERMISSION_CHANGE_GROUP: Raised when a GRANT, REVOKE, or DENY is issued for permissions in the server scope, such as creating a login
DATABASE_PRINCIPAL_CHANGE_GROUP: Raised when users are created, altered, or dropped from a database
SERVER_PRINCIPAL_CHANGE_GROUP: Raised when server principals are created, altered, or dropped

More information on the Audit Action Types can be found here.

USE [master]
GO

CREATE SERVER AUDIT SPECIFICATION [Audit Login]
FOR SERVER AUDIT [Audit Login Changes]
ADD (DATABASE_ROLE_MEMBER_CHANGE_GROUP),
ADD (SERVER_ROLE_MEMBER_CHANGE_GROUP),
ADD (DATABASE_PERMISSION_CHANGE_GROUP),
ADD (SERVER_OBJECT_PERMISSION_CHANGE_GROUP),
ADD (SERVER_PERMISSION_CHANGE_GROUP),
ADD (DATABASE_PRINCIPAL_CHANGE_GROUP),
ADD (SERVER_PRINCIPAL_CHANGE_GROUP)
WITH (STATE = ON)
GO

Event Notifications

As mentioned before, we cannot use SQL Audit on SQL 2008 and R2 Standard Edition or on SQL 2005. As an alternative, you can set up event notifications, which capture messages via Service Broker. The scripts below are based on Aaron’s scripts, but I’ve added more events, as I wanted to trace more than just “change password”.

Create the following table in the msdb database

USE [msdb];
GO

CREATE TABLE dbo.SecurityChangeLog
(
    ChangeLogID			 int IDENTITY(1,1),
	LoginName            SYSNAME,
    UserName             SYSNAME,
    DatabaseName         SYSNAME,
    SchemaName           SYSNAME,
    ObjectName           SYSNAME,
    ObjectType           VARCHAR(50),
    DDLCommand           VARCHAR(MAX),
	EventTime			 DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
	CONSTRAINT PK_ChangeLogID PRIMARY KEY (ChangeLogID)
);

 

Setup the queue

CREATE QUEUE SecurityChangeQueue;
GO

CREATE SERVICE SecurityChangeService ON QUEUE SecurityChangeQueue
  ([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);
GO

 

Set up the event notification. If you check the FOR clause, you will notice that these are the same actions as defined in the SQL Audit specification.

CREATE EVENT NOTIFICATION CreateLoginNotification
    ON SERVER WITH FAN_IN
    FOR CREATE_LOGIN,ALTER_LOGIN,DROP_LOGIN,CREATE_USER,ALTER_USER,DROP_USER,ADD_SERVER_ROLE_MEMBER,DROP_SERVER_ROLE_MEMBER,ADD_ROLE_MEMBER,DROP_ROLE_MEMBER
    TO SERVICE 'SecurityChangeService', 'current database';
GO

 

Install the following stored procedure to log all event notifications into the table we just created. You might notice in the loop that I’m checking the SQL Server version for some events; this is because the event notification content differs between SQL 2008 (R2) and SQL 2005.

USE [msdb];
GO
CREATE PROCEDURE [dbo].[LogSecurityChange]
WITH EXECUTE AS OWNER
AS
BEGIN
    SET NOCOUNT ON;
	DECLARE @version int
    DECLARE @message_body XML;
	set @version = (SELECT convert (int,REPLACE (LEFT (CONVERT (varchar, SERVERPROPERTY ('ProductVersion')),2), '.', '')))

    WHILE (1 = 1)
    BEGIN
       WAITFOR 
       ( 
         RECEIVE TOP(1) @message_body = message_body
         FROM dbo.SecurityChangeQueue
       ), TIMEOUT 1000;

       IF (@@ROWCOUNT = 1)
       BEGIN
		if CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/EventType)')) in ('DROP_USER','CREATE_USER','ALTER_USER') or @version>9
		BEGIN
			INSERT dbo.SecurityChangeLog(LoginName,UserName,DatabaseName,SchemaName,ObjectName,ObjectType,DDLCommand) 
			SELECT CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/LoginName)')), 
				CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/UserName)')),
				CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/DatabaseName)')),
				CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/DefaultSchema)')),
			    CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/ObjectName)')),
		        CONVERT(VARCHAR(50), @message_body.query('data(/EVENT_INSTANCE/ObjectType)')),
		        CONVERT(VARCHAR(MAX), @message_body.query('data(/EVENT_INSTANCE/TSQLCommand/CommandText)'))
		END
		ELSE
		BEGIN
			INSERT dbo.SecurityChangeLog(LoginName,UserName,DatabaseName,SchemaName,ObjectName,ObjectType,DDLCommand) 
			SELECT CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/LoginName)')), 
				CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/UserName)')),
				CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/DatabaseName)')),
				CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/SchemaName)')),
			    CONVERT(SYSNAME, @message_body.query('data(/EVENT_INSTANCE/ObjectName)')),
		        CONVERT(VARCHAR(50), @message_body.query('data(/EVENT_INSTANCE/ObjectType)')),
		        CONVERT(VARCHAR(MAX), @message_body.query('data(/EVENT_INSTANCE/EventType)')) + ' ' + 
				CONVERT(VARCHAR(MAX), @message_body.query('data(/EVENT_INSTANCE/RoleName)')) + ' FOR ' +
				CONVERT(VARCHAR(MAX), @message_body.query('data(/EVENT_INSTANCE/LoginType)')) + ' ' +
				CONVERT(VARCHAR(MAX), @message_body.query('data(/EVENT_INSTANCE/ObjectName)'))
		END
       END
    END
END

 

The last step is modifying the queue so that it uses the stored procedure and starts tracking the login and user changes.

ALTER QUEUE SecurityChangeQueue
WITH ACTIVATION
(
   STATUS = ON,
   PROCEDURE_NAME = dbo.LogSecurityChange,
   MAX_QUEUE_READERS = 1,
   EXECUTE AS OWNER
);
GO

Bring it all together

All the information is now logged on each individual server, but I want all the audit log information in one central database so that the customer can run reports on it. First of all, I created a database on one of the SQL Servers and a table that will store the audit logs of all the servers.

CREATE TABLE [dbo].[SecurityChangeLog](
	[ChangeLogID] [int] IDENTITY(1,1) NOT NULL,
	[ServerName] [sysname] NOT NULL,
	[LoginName] [sysname] NOT NULL,
	[DatabaseName] [sysname] NOT NULL,
	[ObjectName] [nvarchar](128) NULL,
	[ObjectType] [varchar](50) NULL,
	[DDLCommand] [nvarchar](max) NULL,
	[EventTime] [datetime2](7) NULL,
 CONSTRAINT [PK_ChangeLogID] PRIMARY KEY CLUSTERED 
(
	[ChangeLogID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

 

The next step is retrieving the log information from each server. I was thinking of using an SSIS package for this, but it seemed easier to use PowerShell instead. The script below is based on the central management server (CMS) that I’m using. First, I retrieve the list of SQL Servers from which I have to collect the audit data. This list is stored in the table msdb.dbo.sysmanagement_shared_registered_servers_internal on the central management server. Make sure that you change YOURGROUPID to the appropriate group id of your CMS and YOURCMS to the instance name of your CMS.

Once we have the list, I execute the script GetAuditLog.sql against each instance to retrieve the audit information.

The final step writes the retrieved audit information into the database. Make sure you change YOURDATABASENAME and YOURSQLSERVERINSTANCE in the script to your database name and SQL instance.

Schedule a SQL Agent job to execute the PowerShell script on a daily basis.

 

#Set-ExecutionPolicy -ExecutionPolicy Unrestricted
Import-Module "sqlps" -DisableNameChecking
cls
$InputFile = "YOURFILEPATH\GetAuditLog.sql"

#Get the list of servers from the Central Management Server
$InstanceNameList = Invoke-Sqlcmd -Query "select server_name from msdb.dbo.sysmanagement_shared_registered_servers_internal 
                                          where server_group_id=YOURGROUPID" -ServerInstance "YOURCMS"
                                         
#Start with an empty result set
$results = @()

#Get Audit information
foreach($instanceName in $instanceNameList) 
{ 
    write-host "Executing query against server: " $instanceName.server_name
    $results += Invoke-Sqlcmd -InputFile $InputFile -ServerInstance $instanceName.server_name
}
#Save the Audit Log into the database
foreach($record in $results)
{
#write-host "ServerName: " $record.ServerName " " $record.DDLCommand
Invoke-Sqlcmd -Query "insert into [YOURDATABASENAME].[dbo].[SecurityChangeLog] ([ServerName],[LoginName],[DatabaseName],[ObjectName],[ObjectType],[DDLCommand],[EventTime])  
                      Values('$($record.ServerName)','$($record.LoginName)','$($record.DatabaseName)','$($record.ObjectName)','$($record.ObjectType)','$($record.DDLCommand)','$($record.EventTime)')" -ServerInstance "YOURSQLSERVERINSTANCE" 
}

 

Below you can find the SQL script that the PowerShell script uses to retrieve the audit information. Depending on the version, we read either the audit file or the event notification table. I’m only selecting the logs of the last day, because the PowerShell script is scheduled to run on a daily basis.

USE MSDB
GO
declare @AuditLog  table (
LoginName sysname,
DatabaseName sysname,
ObjectName nvarchar(128),
ObjectType varchar(50),
DDLCommand nvarchar(4000),
EventTime datetime)
declare @LogFilePath varchar(256)
declare @version int
set @version = (SELECT convert (int,REPLACE (LEFT (CONVERT (varchar, SERVERPROPERTY ('ProductVersion')),2), '.', '')))
IF (@version=10 and (select convert(varchar,SERVERPROPERTY('Edition'))) not like '%Standard%') OR (@version=11)
BEGIN
	select @LogFilePath=log_file_path from sys.server_file_audits
	where name='Audit Login Changes'
	IF @LogFilePath IS NOT NULL
	BEGIN
		set @LogFilePath=@LogFilePath +'*.sqlaudit'
		insert into @AuditLog
		SELECT session_server_principal_name,database_name,object_name,m.class_type_desc,statement,event_time FROM sys.fn_get_audit_file (@LogFilePath,default,default) f
		INNER JOIN sys.dm_audit_class_type_map m on m.class_type=f.class_type;
		select serverproperty('Servername') as ServerName,LoginName,DatabaseName,ObjectName,ObjectType,replace(DDLCommand,'''','''''') as DDLCommand,EventTime from @AuditLog
		where EventTime>dateadd(dd,-1,getdate())
	END
END
ELSE
BEGIN
	IF exists (select name from msdb.sys.tables where name='SecurityChangeLog')
		select serverproperty('Servername') as ServerName,LoginName,DatabaseName,ObjectName,ObjectType,replace(DDLCommand,'''','''''') as DDLCommand,EventTime from msdb.dbo.SecurityChangeLog
	    where EventTime>dateadd(dd,-1,getdate()) and DDLCommand<>''
END

Report

To make it easy for the customer to retrieve the data, I’ve created this report.

Security Changes

I know that this solution is not completely watertight, because people can simply stop the audit, disable the queue or delete the records from the tables, but I think it’s a good start for tracking login and user modifications.
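To give you an idea, the report is basically built on a query like this one against the central table (the 7-day window is just an example):

```sql
-- Sketch: security changes of the last 7 days, per server, newest first
SELECT ServerName, LoginName, DatabaseName, ObjectName, ObjectType, DDLCommand, EventTime
FROM dbo.SecurityChangeLog
WHERE EventTime > DATEADD(dd, -7, GETDATE())
ORDER BY ServerName, EventTime DESC;
```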

Have fun!