It’s been a while…

So I know I have not blogged in a while and that is bad on my part. I have been so busy with the new job that blogging just doesn't make the list. It has also been a struggle figuring out what I can share and what I should share. Well, I finally found something I can share that has been plaguing me for a while and that I think will be beneficial to others.

I run a Mac to do my daily work, using VMs or RDP sessions to Windows boxes so I can do SQL work. I have been trying to be more efficient and not use Windows if I do not have to. Plus, if I can stay on the Mac side, it will aid my current goal of learning Linux, containers, and other database platforms. To do more SQL Server-related work from my Mac, I need to be able to connect to various SQL Servers to run queries. In my environment, I connect to a development domain for most of my work and a production domain for things that are customer facing. I also want to be able to use Windows Authentication to connect to these SQL Servers. This has proven difficult, as my Mac is not joined to a domain.

I have heard of other members of the community doing this, but I kept struggling. I had read this blog post several times, but nothing worked.

Then it dawned on me and I realized my error. To use Windows Authentication, you must have Kerberos working. This was the missing link. To have Kerberos working, you have to have a Kerberos ticket, and the server you want to connect to must have a Service Principal Name (SPN) created for it.

First, I made sure to run the Microsoft Kerberos Configuration Manager for SQL Server. You can download that here. This should be run from a Windows box that is joined to the domain. When you run this against one of your SQL Servers, it will scan your Active Directory to see if an SPN exists for the SQL Server. If there is an issue, you will see the following:


Under the Status column, you can see that it says Missing. That means an SPN has not been created for this SQL Server. The Kerberos Configuration Manager will provide you with a script to fix this. You will want to give this script to your Domain Admin so they can run it and correct the issue. Once it has been run and an SPN has been created, you can use Kerberos for Windows Authentication. Without an SPN, Windows falls back to NTLM, but on the Mac side there is no such fallback, and thus no Windows Authentication.
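The fix script that the Kerberos Configuration Manager generates essentially registers an SPN with setspn. As a rough sketch of what it builds (the host name, port, and service account here are hypothetical, not the tool's actual output):

```shell
# SQL Server SPNs take the form MSSQLSvc/<fully-qualified host>:<port or instance>
host="sqlprod01.rangers.lab"        # hypothetical server FQDN
port=1433
account='RANGERS\svc-sqlserver'     # hypothetical SQL Server service account
spn="MSSQLSvc/${host}:${port}"
# the Domain Admin would run the generated script, which boils down to:
echo "setspn -S ${spn} ${account}"
```

The -S switch makes setspn check for duplicates before adding, which is why the generated scripts favor it.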


Second, after the SPN is in place, open up a terminal and run

kinit username@DOMAIN.COM

username is the account you currently use to log in to your Active Directory domain, while DOMAIN.COM is the domain name; it must be in ALL CAPS.
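Since the realm has to be upper case, a small shell helper can normalize a principal before it is passed to kinit (a sketch; the principal here is just an example):

```shell
# kinit is picky: the realm (everything after the @) must be upper case
principal="chris.lumnah@rangers.lab"    # example principal typed in lower case
user="${principal%@*}"
realm="$(printf '%s' "${principal#*@}" | tr '[:lower:]' '[:upper:]')"
normalized="${user}@${realm}"
echo "$normalized"    # prints chris.lumnah@RANGERS.LAB
```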

You will be prompted for your password. If everything is correct, you can then run

klist

You should get output similar to the below:
Credentials cache: API:BA74140D-7B6E-49B5-93E1-EA069AC87B20
Principal: chris.lumnah@RANGERS.LAB

Issued Expires Principal
Mar 20 20:05:58 2019 Mar 21 06:05:52 2019 krbtgt/RANGERS.LAB@RANGERS.LAB

Now you can use Azure Data Studio to connect to your SQL Server on Windows using Windows Authentication.
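If you also have the mssql-tools package installed on the Mac, you can sanity-check the connection from the terminal first. This is a sketch (the server name is hypothetical); with a valid ticket, sqlcmd uses Kerberos automatically when you omit -U and -P, and sys.dm_exec_connections will confirm the auth scheme:

```shell
server="sqlprod01.rangers.lab"   # hypothetical FQDN; must match the registered SPN
query="SELECT auth_scheme FROM sys.dm_exec_connections WHERE session_id = @@SPID;"
# run this once kinit has succeeded; auth_scheme should come back as KERBEROS
echo "sqlcmd -S ${server} -Q \"${query}\""
```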

New job…new thinking?

So I recently started a new position as a SQL Server Solutions Architect at Rubrik, Inc. We concentrate on data protection for your data center, and I will be concentrating on SQL Server-specific protection. Hit up the website or search YouTube for some videos if you want to know more.

Backup and recovery has always been something I am interested in. It was the first skill I learned when becoming a DBA. Before Powershell was a thing, I wrote my own database backup process using SSIS that was highly dynamic, configurable, and, most importantly, simple to support. With the introduction of the Data Domain, I learned how to optimize the performance of a backup and restore. Over my career, I have learned that backup and recovery, and data protection in general, is probably the single biggest job a DBA has.

In my new role, I am expected to keep current and, more importantly, learn new things. SQL Server 2016 introduced a feature called Stretch Database. Essentially, this feature lets you take the data in a table, or tables, and keep part of it local while the rest goes to Azure. Conceptually, this works similarly to table partitioning, but mechanically it is different. Table partitioning separates your table into partitions based on a partition function. All of your data is still in the table; it is just reorganized into those partitions so queries can run more efficiently against smaller subsets of data. Stretch is similar in that you have a filter predicate that says what data stays local and what goes to Azure. The big difference is that the data going to Azure actually leaves the local database.

Now, why would you do this? A good example may be a large sales order system. For the most part, your queries run against the most recent data, and by keeping that data local, queries return fast. Queries that need to read the old data will have to reach into Azure to get it. Stretching a database allows you to place that old data on cheaper storage. Yes, it will probably take longer to query, but that is by design: place the old, less-used data on cheap storage to free up the local, faster storage for more-often-used data.
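Enabling Stretch on a table looks something like the following sketch. The table, column, and cutoff date are hypothetical; the inline table-valued function is the filter predicate that decides which rows are eligible to migrate:

```shell
# write the T-SQL to a file; it would then be run with sqlcmd against the database
cat <<'SQL' > stretch_sales.sql
-- rows older than 2017 are eligible to migrate to Azure; everything else stays local
CREATE FUNCTION dbo.fn_stretchpredicate (@OrderDate datetime2)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS is_eligible
       WHERE @OrderDate < CONVERT(datetime2, '2017-01-01');
GO
ALTER TABLE dbo.SalesOrders
    SET (REMOTE_DATA_ARCHIVE = ON (
            FILTER_PREDICATE = dbo.fn_stretchpredicate(OrderDate),
            MIGRATION_STATE  = OUTBOUND));
SQL
# sqlcmd -S sql2016 -d SalesDB -i stretch_sales.sql
```

MIGRATION_STATE = OUTBOUND tells SQL Server to start moving eligible rows; the predicate function must be schema-bound.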

I started to think about the implications for backup and recovery, and how this affects RPO and RTO.


Thankfully, existing native backup scripts do not need to change to back up a database that has been stretched to Azure. This sounds great: I can offload a bunch of data to the cloud and nothing has to change from a backup and recovery perspective.

Well, not so fast.

When you do a native SQL backup, SQL Server performs what is called a shallow backup. This means the backup contains only the local data and the data eligible for migration at the time the backup runs. Eligible data is data that has not yet been migrated, but will be migrated to Azure based on the filter predicate.

So what happens to the data that has already been moved to Azure? Azure backs that data up for you automatically, with storage snapshots taken every 8 hours.

See this link for more details.


So I have a backup, or really backups, of my data; how do I do a restore? Well, it depends… where is the failure?

If the failure you are recovering from is local:

  1. Restore your local database as you normally would from a native SQL backup.
  2. Run the stored procedure sys.sp_rda_reauthorize_db to reestablish the connection between the local database and the Azure stretch database.

If the failure is in Azure:

  1. Restore the Azure database in the Azure portal.
  2. Run the stored procedure sys.sp_rda_reauthorize_db to reestablish the connection between the local database and the Azure stretch database.
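In both cases, step 2 is the same T-SQL call. A sketch of what it looks like (the database and credential names are hypothetical; @with_copy = 1 is the recommended option, which points the local database at a copy of the remote data):

```shell
cat <<'SQL' > reauthorize.sql
USE StretchedDB;   -- hypothetical database name
GO
-- re-establish the link between the restored local database and its Azure remote
EXEC sys.sp_rda_reauthorize_db
     @credential = N'AzureStretchCred',   -- hypothetical database-scoped credential
     @with_copy  = 1;
SQL
# sqlcmd -S sql2016 -i reauthorize.sql
```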

While the above looks simple, it can be a bit more involved as you walk through the details here.

Ultimately, Stretch Database is a cool piece of technology that makes you think differently. You have to test out how to restore this type of database to understand the steps involved. You want a runbook that shows what you need to do to recover a database involved with Stretch, and you want to plan your backups to most effectively give you a successful recovery. Essentially, you need a recovery strategy, not a backup strategy.


Hmmm, sounds like something Paul Randal has been saying for a while. So basically the same thinking, just reinforced.

DSC Install of SQL Server

It has been a while since I have made a post and figured one was long overdue. For my first post in a while, I figured it would be about something I have been working on lately: the automation of installing and configuring SQL Server.

So the installation of SQL Server is now fairly straightforward. The wizard does a nice job of guiding you along the way; 2016 even includes best-practice suggestions for tempdb and instant file initialization. Along the way, Microsoft has given us ways to automate the installation of SQL Server. You can sysprep an instance, but this does not really automate the installation; it just helps create a template of an instance. At the end of the day, you still need to do things manually. You can also use a configuration file to assist here. This is a great step forward, but it does not allow for all of the things you need to do to configure a SQL Server.

Powershell does. Desired State Configuration (DSC) is functionality built into Powershell that allows for the installation and configuration of a SQL Server.

First, to get started, you need a base understanding of DSC. Without it, the rest of this post will be hard to follow, as DSC can be difficult to follow. The link below will take you to the videos on the Microsoft Virtual Academy site that Jeffrey Snover put together explaining how DSC works.

Getting Started with Powershell Desired State Configuration (DSC)

Also, the modules you will need to make DSC work in your environment can all be downloaded from the Powershell Gallery. I have also found that searching for a module on GitHub returns better documentation on how it works. This has been great in my learning of how to get DSC going.

For the example I have put together, we will use Push mode for our DSC script.

The snippet below uses a Powershell construct called a Configuration. It is similar to a Powershell function in construction and in how it works.

Configuration SQLServerInstall
{
    # $ComputerName, $Configuration, $MAXDOP, and $MAXMemory are expected to be
    # defined in the calling scope before this configuration is compiled
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName xSQLServer
    Import-DscResource -ModuleName SecurityPolicyDsc
    Import-DscResource -ModuleName xPendingReboot

    Node $ComputerName
    {
        WindowsFeature 'NET-Framework-Core'
        {
            Name                 = "NET-Framework-Core"
            Ensure               = "Present"
            IncludeAllSubFeature = $true
        }

        xSQLServerSetup InstallSQL
        {
            DependsOn           = '[WindowsFeature]NET-Framework-Core'
            Features            = $Configuration.InstallSQL.Features
            InstanceName        = $Configuration.InstallSQL.InstanceName
            SQLCollation        = $Configuration.InstallSQL.SQLCollation
            SQLSysAdminAccounts = $Configuration.InstallSQL.SQLSysAdminAccounts
            InstallSQLDataDir   = $Configuration.InstallSQL.InstallSQLDataDir
            SQLUserDBDir        = $Configuration.InstallSQL.SQLUserDBDir
            SQLUserDBLogDir     = $Configuration.InstallSQL.SQLUserDBLogDir
            SQLTempDBDir        = $Configuration.InstallSQL.SQLTempDBDir
            SQLTempDBLogDir     = $Configuration.InstallSQL.SQLTempDBLogDir
            SQLBackupDir        = $Configuration.InstallSQL.SQLBackupDir
            SourcePath          = $Configuration.InstallSQL.SourcePath
            SetupCredential     = $Node.InstallerServiceAccount
        }

        xSQLServerNetwork ConfigureSQLNetwork
        {
            DependsOn      = "[xSQLServerSetup]InstallSQL"
            InstanceName   = $Configuration.InstallSQL.InstanceName
            ProtocolName   = "tcp"
            IsEnabled      = $true
            TCPPort        = 1433
            RestartService = $true
        }

        xSQLServerConfiguration DisableRemoteAccess
        {
            SQLServer       = $ComputerName
            SQLInstanceName = $Configuration.InstallSQL.InstanceName
            DependsOn       = "[xSQLServerSetup]InstallSQL"
            OptionName      = "remote access"
            OptionValue     = 0
        }

        UserRightsAssignment PerformVolumeMaintenanceTasks
        {
            Policy   = "Perform_volume_maintenance_tasks"
            Identity = "Builtin\Administrators"
        }

        UserRightsAssignment LockPagesInMemory
        {
            Policy   = "Lock_pages_in_memory"
            Identity = "Builtin\Administrators"
        }

        xPendingReboot PendingReboot
        {
            Name = $ComputerName
        }

        xSQLServerAlwaysOnService EnableAlwaysOn
        {
            SQLServer       = $ComputerName
            SQLInstanceName = $Configuration.InstallSQL.InstanceName
            DependsOn       = "[xSQLServerSetup]InstallSQL"
            Ensure          = "Present"
        }

        xSQLServerMaxDop SetMAXDOP
        {
            SQLInstanceName = $Configuration.InstallSQL.InstanceName
            DependsOn       = "[xSQLServerSetup]InstallSQL"
            MaxDop          = $MAXDOP
        }

        xSQLServerMemory SetMaxMemory
        {
            SQLInstanceName = $Configuration.InstallSQL.InstanceName
            DependsOn       = "[xSQLServerSetup]InstallSQL"
            MaxMemory       = $MAXMemory
            DynamicAlloc    = $False
        }
    }
}
To break down the script and the areas in the Configuration, the names are thankfully fairly self-explanatory.
WindowsFeature – Ensures that a Windows feature is present, initiating the installation if needed.
xSQLServerSetup – This section installs SQL Server. If you look at the properties it asks for, they mirror most of the properties found in a configuration file.
xSQLServerNetwork – This section is used to enable protocols, like TCP, and configure what port SQL Server should use.
xSQLServerConfiguration – Used to set values that you would normally set via sp_configure.
UserRightsAssignment – Sets values in the Local Security Policy. In this case, I am setting who can Lock Pages in Memory and Perform Volume Maintenance Tasks.
xPendingReboot – Checks whether a reboot is required and reboots if needed.
xSQLServerAlwaysOnService – Used to enable the Always On feature.
xSQLServerMaxDop – Used to set MAXDOP.
xSQLServerMemory – Used to set the MIN and MAX memory values.

Any place you see $Configuration.InstallSQL.*, the value comes from a JSON file. I use this as a parameter file that feeds the script, which allows me to easily reuse it as many times as I need.
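For reference, the JSON parameter file behind $Configuration.InstallSQL.* might look like the sketch below. Every value is hypothetical; in the Powershell script it would be loaded with something like Get-Content ... | ConvertFrom-Json:

```shell
cat <<'JSON' > InstallSQL.parameters.json
{
  "InstallSQL": {
    "Features": "SQLENGINE,FULLTEXT",
    "InstanceName": "MSSQLSERVER",
    "SQLCollation": "SQL_Latin1_General_CP1_CI_AS",
    "SQLSysAdminAccounts": "RANGERS\\SQL Admins",
    "InstallSQLDataDir": "E:\\SQLData",
    "SQLUserDBDir": "E:\\SQLData",
    "SQLUserDBLogDir": "E:\\SQLLog",
    "SQLTempDBDir": "T:\\SQLTempdb",
    "SQLTempDBLogDir": "T:\\SQLTempdb",
    "SQLBackupDir": "E:\\Backup",
    "SourcePath": "\\\\fileserver\\installs\\SQL2016"
  }
}
JSON
```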

When the Powershell script runs, DSC outputs a text file with an extension of .mof. DSC then pushes this configuration file to the server and proceeds to configure the server as instructed. From there, I use scripts that I have written or found over the years to finish the configuration. Ideally, using Powershell, I can configure the SQL Server any way I would like.
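Compiling and pushing the configuration is only a couple of lines. A sketch of the driver script (the file names and paths here are hypothetical):

```shell
cat <<'PS1' > Deploy-SqlDsc.ps1
# dot-source the file that defines Configuration SQLServerInstall, then:
. .\SQLServerInstall.ps1
SQLServerInstall -OutputPath .\MOF                       # compiles <node>.mof into .\MOF
Start-DscConfiguration -Path .\MOF -Wait -Verbose -Force # pushes the MOF and applies it
PS1
```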


I have posted my entire set of scripts for installing and configuring SQL Server on GitHub. You can find the scripts here, where you can see the full example. Enjoy.

T-SQL Tuesday #85

For the last T-SQL Tuesday of the year, this month is being hosted by Kenneth Fisher (b|t) and is about backup and recovery.

Any DBA knows that backups are crucial to their job. Without good backups it is very difficult to ensure the environment is protected from the dangers that lurk all around. Any DBA knows the BACKUP and RESTORE commands fairly well and has scripts handy that can back up or restore a database at the slightest whisper of a problem. However, there are newer commands available via Powershell to assist a DBA in getting a backup or a restore completed. These can possibly aid in doing those tasks faster, as you do not need to wait for SSMS to load. Let's take a look at these commands at their basic level. I am going to use WideWorldImporters and my SQL 2016 instance.

Restore Database


USE [master]
RESTORE DATABASE [WideWorldImporters]
FROM DISK = N'E:\Backup\WideWorldImporters-Full.bak'
WITH MOVE N'WWI_Primary' TO N'E:\SQLData\WideWorldImporters.mdf',
MOVE N'WWI_UserData' TO N'E:\SQLData\WideWorldImporters_UserData.ndf',
MOVE N'WWI_Log' TO N'E:\SQLLog\WideWorldImporters.ldf',
MOVE N'WWI_InMemory_Data_1' TO N'E:\SQLData\WideWorldImporters_InMemory_Data_1'


$RelocateData1 = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_Primary', 'E:\SQLData\WideWorldImporters.mdf')
$RelocateData2 = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_UserData', 'E:\SQLData\WideWorldImporters_UserData.ndf')
$RelocateLog = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_Log', 'E:\SQLLog\WideWorldImporters.ldf')
$RelocateData3 = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_InMemory_Data_1', 'E:\SQLData\WideWorldImporters_InMemory_Data_1')
Restore-SqlDatabase -ServerInstance 'sql2016' -Database 'WideWorldImporters' -BackupFile 'E:\Backup\WideWorldImporters-Full.bak' -RelocateFile @($RelocateData1,$RelocateData2,$RelocateData3,$RelocateLog)

Backup Database


BACKUP DATABASE [WideWorldImporters]
TO DISK = N'E:\Backup\WideWorldImporters-Full.bak'
WITH NAME = N'WideWorldImporters-Full Database Backup',
STATS = 10


Backup-SqlDatabase -ServerInstance 'sql2016' -Database 'WideWorldImporters' -BackupFile  'E:\Backup\WideWorldImporters-Full.bak'

As you can see, these basic commands are very similar. With some additional review and time, automating a backup or restore of a database, or a group of databases, becomes quite easy. Better yet, these are native T-SQL and Powershell statements.

What if you want to go deeper? What if you want to answer the following questions:

  1. When was my last database backup?
  2. What is the list of the last backups that were done?
  3. When was the database restored?

DBAs have scripts that answer these questions relatively quickly using T-SQL. If they do not, they will spend time writing the code themselves or searching online for a query someone else has already published. What if you want to do this via Powershell? At this time, there is no native command provided by Microsoft to answer the above questions. However, the SQL Server community has taken it upon itself to create Powershell commands that answer them and do much more.

Chrissy LeMaire (b|t), a Data Platform MVP out of Belgium, and a few other DBAs and MVPs have teamed up to create a suite of Powershell commands aimed at DBAs. I encourage you to check out their site and download their module. The suite covers a bunch of common DBA tasks as well as database migration tasks. Here I will highlight three commands that answer the above questions.

When was my last database backup?

This is a simple question that we as DBAs get asked, and ask, a lot. With the dbatools Powershell module, it can be answered in one line.

Get-DbaLastBackup -SqlServer sql2016 | Out-GridView

The above simple command returns the data below


What is the list of the last backups that were done?

This is another question we all need to answer from time to time. Generally, we need to answer it when we need to do a restore and need to know what files were created recently and where they exist.

Get-DbaBackupHistory -SqlServer sql2016 -Databases WideWorldImporters | Out-GridView

This one-liner returns the output below. It tells me that my latest backups were a transaction log and a full backup. From here I can go and perform the restore if I so desired.


What is great about a command like this is that I can take its output, create subsequent restore commands in Powershell, and automate the restore.
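As a sketch of that idea (untested; the -LastFull switch and the Path property are assumptions about the history object, so verify them against the module's documentation):

```shell
cat <<'PS1' > restore-latest.ps1
# grab the most recent full backup from msdb history, then restore from that file
$history = Get-DbaBackupHistory -SqlServer sql2016 -Databases WideWorldImporters -LastFull
Restore-SqlDatabase -ServerInstance sql2016 -Database WideWorldImporters `
    -BackupFile $history.Path -ReplaceDatabase
PS1
```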

When was the database restored?

Lastly, at times we need to know when a database was restored and from what file. Again, a Powershell one-liner from dbatools comes to our rescue to quickly and efficiently answer the question.

Get-DbaRestoreHistory -SqlServer sql2016 -Databases WideWorldImporters -Detailed | Out-GridView

Here is the output:


Powershell is a great tool for the DBA to become more efficient in normal day-to-day work. The commands I have highlighted above just scrape the surface. To learn more about them, please read Chrissy's blog, which explains this functionality in much more depth and with videos.

The team putting together the dbatools suite is doing a great job coming up with commands that are useful to all DBAs. I hope you explore what they have to offer and incorporate it into your daily work.

Backing up on-premises SQL Server databases to Azure

So SQL 2016 was officially released in June and there are a boatload of new features. One of the first new features to review is the ability to back up your databases to Azure. Now, technically, this feature has been around since SQL 2012 SP1 CU2, but the main idea behind SQL Server 2016 is that it's bigger, better, and faster than previous versions. Going with that theme, 2016 has upgraded the functionality and performance of this feature.

T-SQL Tuesday #84

This month’s #tsql2sday is being hosted by Andy Yun (b|t) and is about helping new speakers.

I am taking a shot at my very first T-SQL Tuesday. This month it is all about helping new speakers. This is an interesting topic, as I am a new speaker myself. Just about a year ago I was voluntold to give a lightning talk at the NESQL User Group. I was sitting in the back of the room, working away on something, while Mike Hillwig was going through the normal announcements at the start of the meeting. I was not paying close attention as I was lost in thought. Then I noticed that the room had gone silent and I felt eyes on me. I looked up and saw Mike staring right at me with a big grin on his face, saying, “…you’re up next month…”


My First…

This is a post of firsts, and hopefully the beginning of more.

I have been a DBA for the past 10 years. Since there is really no way to get formal training in this IT discipline, I, like others, am self-taught. When I took the position as a DBA, it was after turning down the position initially. My impression of a DBA was that all they did was write reports. This just did not seem appealing or challenging in any sense of the word. Once I chose to take the position at the urging of my CIO, I started to look at what a DBA did. My first task was to look at job postings, reading the descriptions of what a DBA did at various companies. I started to see a common theme to the tasks and skill sets. I started to read documentation on how to do basic DBA tasks: backups, job management, security, database design 101. Once I felt comfortable with each of those skills, I was hungry for more and kept taking in as much as I could over the years.

I later stumbled across PASS. This led me to local user groups, and I attended the first of many NESQL user group meetings. These are generally one-hour sessions once a month that cover one or two topics per meeting. The sessions, while informative and providing free pizza, taught me something more important: there were other DBAs out there who suffered from the same problems I did. I got to hear the stories of other DBAs in the same industry, or other industries, with the same struggles, and I got to hear other DBAs tell them the solution. There was this free exchange of knowledge that was very foreign to me. Most people do not like to share knowledge, as they feel it gives them an advantage over their colleagues. This was just an awesome concept. I would absorb the knowledge I was hearing, take it back to the office, test it out, and implement it.

Fast forward through a few years and 3 jobs.

I am at a senior level now in my DBA career. However, I am still just as hungry to learn and get better as I was when I started. I still attend the NESQL group, and now the RISSUG as well. (This one meets in RI at New England Tech; come on down. Here is the link.) I still love hearing the stories of other DBAs and their issues, trying to figure out how they apply to me, and the solutions that come of them. Now, I try to assist and give a solution and not just sit on the sidelines. I have met some great people in the user groups. These people have become my friends and mentors. Two of them, Mike Hillwig and Andy Mallon, have “volun-told” me to start speaking and blogging. This was never something high on my agenda. I am quite introverted, so talking to people and presenting in front of others is not something I like to do. I did my first lightning talk a month later and it went well. I enjoyed the experience, but was glad it was over and that I would not have to do it again. After that presentation, I was pulled aside by Mike and Andy and told, “Now make that into an hour presentation and submit for SQL Saturday Boston.” The thought of this was a bit daunting. I had attended many of these events; if you have not and you are a DBA, you should. It is a free day of training from local speakers, Microsoft reps, and MVPs.

Come the next SQL Saturday, I am one of the first speakers of the day. I get to the room and get set up. The room holds maybe 12-15 people and it was full. This shocked me, because these people chose to see my presentation over someone else's, and it amped up the pressure I was already feeling. A few minutes later, two MVPs I look up to, Grant Fritchey and Jes Borland, entered the room and sat in the front row. This, of course, increased the pressure. I took a couple of breaths and settled in. An hour later, it was over and I felt even better than the last time. I got some great feedback from both Grant and Jes, which I later used to refine the topic for later presentations.

As I write this, I am thinking about another first: my first time at the PASS Summit in Seattle. The PASS Summit is an expanded version of a SQL Saturday event. It lasts 3 days, 5 if you do the pre-cons, and covers all the SQL Server topics you can think of. I had grand intentions to attend every session I could. When I built my schedule, I double- and triple-booked myself in any session that looked remotely interesting. I was intending to take in as much knowledge as I could. I was going to be a sponge.

Then I got there. I ended up missing the first half of the first day to take care of some responsibilities in the office. This shift in schedule opened up my morning, and I was able to take advantage of that time to sit with people from Microsoft and get 1-on-1 time to ask questions about how Azure and SQL Database worked. I must have spent about an hour asking any question I could, and they just kept giving me answers. It filled in so many gaps I had and in turn gave me so many ideas. The more I asked, the more they answered. It was great. Later, I got great 1-on-1 time with the SQL CAT team to ask how to fix a problem I was having. What was challenging to me was easy to them, and they explained it in a way that let me easily resolve my issue. I took in some sessions the rest of the week, but I stopped worrying about attending every one. I decided to be more practical and take the sessions that truly interested me, figuring I could watch the recordings of anything I missed later.

The other great thing about my first time at the Summit was the professional development sessions I took during the week. The only one I had planned to take was Brent Ozar's session on career internals. It was an amazing session on advancing your career. Right after that, I took a session by Edwin Sarmiento on what DBAs get out of blogging. This was also fantastic. The bigger take-away from these sessions, though, was the personal 1-on-1 time I got with both speakers. Brent stood there for at least 30 minutes answering questions from me and others. Edwin did the same with the group that stayed after, and then spent over an hour with me just chatting. This 1-on-1 time was very valuable and insightful.

In closing, the significant firsts in my DBA career have been incredible and I am happy I took those first steps. Taking the first step of becoming a DBA was the most crucial, as it provided so many more opportunities down the road. I have learned and benefited so much from these firsts, and from the connections I have made. Now, I look to share more by blogging more and speaking at more SQL Saturdays and user group sessions.