It’s been a while…

So I know I haven't blogged in a while, and that is bad on my part. I have been so busy with the new job that blogging just doesn't make the list. It has also been a struggle figuring out what I can share and what I should share. Well, I finally found something I can share that has been plaguing me for a while, and I think it will be beneficial to others.

I run a Mac to do my daily work. I will run VMs or RDP sessions to Windows boxes so I can do SQL work. I have been struggling to be more efficient and to avoid using Windows when I do not have to. Plus, if I can stay on the Mac side, it will aid me in my current goal to learn Linux, containers, and other database platforms. To do more SQL Server related work from my Mac, I need to be able to connect to various SQL Servers to run queries. In my environment, I connect to a development domain for most of my work and a production domain for the things I do that are customer facing. I also want to be able to use Windows Authentication to connect to these various SQL Servers. This has proven to be difficult on my Mac, as my Mac is not joined to a domain.

I have heard of other members of the community doing this, but I kept struggling. I had read this blog post several times, but nothing worked.

Then it dawned on me and I realized my error. To use Windows Authentication, you must have Kerberos working. This was the missing link. For Kerberos to work, you have to have a Kerberos ticket, and the server you want to connect to must have a Service Principal Name (SPN) created for it.

First, I made sure to run the Microsoft Kerberos Configuration Manager for SQL Server. You can download that here. This should be run from a Windows box that is joined to the domain. When you run it against one of your SQL Servers, it will scan your Active Directory to see if an SPN exists for the SQL Server. If there is an issue, you will see output like the below.


Under the status column, you can see that it says Missing. That means an SPN has not been created for this SQL Server. The Kerberos Configuration Manager will provide you with a script to fix this. You will want to give this script to your Domain Admin so they can run it and correct the issue. Once it has been run and an SPN has been created, you can use Kerberos for Windows Authentication. If you don't have an SPN, Windows will fall back to NTLM, but in the Mac world you cannot fall back to NTLM, and thus cannot use Windows Authentication.
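For reference, the fix script the Kerberos Configuration Manager generates boils down to a setspn command like the sketch below. The host name, port, and the RANGERS\sqlsvc service account are placeholders for whatever your environment uses.

```
REM Register an SPN for the SQL Server service account (placeholders).
setspn -S MSSQLSvc/sql2016.rangers.lab:1433 RANGERS\sqlsvc

REM Verify the SPNs registered for that account.
setspn -L RANGERS\sqlsvc
```

The -S switch checks for duplicate SPNs before adding, which is why it is generally preferred over -A.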


Second, after the SPN is in place, open a terminal and run:

kinit username@DOMAIN.COM

username is what you currently use to log in to your Active Directory domain, while DOMAIN.COM is the domain name; it must be in ALL CAPS.

You will be prompted for your password. If everything is correct, you can run the next command:

klist username@DOMAIN.COM

You should get output similar to the below:
Credentials cache: API:BA74140D-7B6E-49B5-93E1-EA069AC87B20
Principal: chris.lumnah@RANGERS.LAB

Issued Expires Principal
Mar 20 20:05:58 2019 Mar 21 06:05:52 2019 krbtgt/RANGERS.LAB@RANGERS.LAB

Now you can use Azure Data Studio to connect to your SQL Server on Windows using Windows Authentication.
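With a ticket in hand, command-line tools on the Mac can use integrated authentication as well. Assuming you have sqlcmd installed from the mssql-tools package, something like the following sketch should work; the server name is a placeholder for your environment.

```
# Acquire a Kerberos ticket, then connect with -E (integrated authentication).
kinit chris.lumnah@RANGERS.LAB
sqlcmd -S sql2016.rangers.lab -E -Q "SELECT SUSER_SNAME();"
```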

New job…new thinking?

So I recently started a new position as a SQL Server Solutions Architect at Rubrik Inc. We concentrate on data protection of your data center. I will be concentrating on SQL Server specific protection. Hit up the website or search YouTube for some videos if you want to know more.

Backup and recovery have always been something I am interested in. They were the first skills I learned when becoming a DBA. Before Powershell was a thing, I wrote my own database backup process using SSIS that was highly dynamic, configurable, and, most importantly, simple to support. With the introduction of the Data Domain, I learned how to optimize the performance of a backup and restore. Over my career, I have learned that backup and recovery, and data protection in general, is probably the single biggest job a DBA has.

In my new role, I am expected to keep current and, more importantly, learn new things. SQL Server 2016 introduced a feature called Stretch Database. Essentially, this feature lets you take the data in a table or tables and keep part of it local while the rest goes to Azure. Conceptually, this works similarly to table partitioning, but mechanically it is different. Table partitioning separates your table into partitions based on a partition function. All of your data stays in the table; it is just reorganized into partitions so queries can run more efficiently against smaller subsets of data. Stretch is similar in that you have a filter predicate that says what data stays local and what goes to Azure. The big difference is that the data going to Azure actually leaves the local database.

Now, why would you do this? A good example may be a large sales order system. For the most part, your queries run against the most recent data. By keeping that data local, queries can return fast. Queries that need to read the old data will have to reach into Azure to get it. Stretching a database allows you to place that old data on cheaper storage. Yes, this data will probably take longer to query, but that is by design: place the old, less-used data on cheap storage to free up the local, faster storage for the more often used data.
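As a sketch of what that filter looks like, stretching a table uses an inline table-valued function as the filter predicate. The table, column, and cutoff date below are made-up examples, not from a real system:

```sql
-- Rows older than 2017 become eligible for migration to Azure.
CREATE FUNCTION dbo.fn_StretchPredicate(@OrderDate datetime2)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS is_eligible
       WHERE @OrderDate < CONVERT(datetime2, '2017-01-01');
GO

ALTER TABLE dbo.SalesOrders
    SET (REMOTE_DATA_ARCHIVE = ON (
        FILTER_PREDICATE = dbo.fn_StretchPredicate([OrderDate]),
        MIGRATION_STATE = OUTBOUND));
```

MIGRATION_STATE = OUTBOUND is what actually starts moving eligible rows to Azure.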

I started to think about the implications for backup and recovery, and how this affects RPO and RTO.

Backup

Thankfully, to back up a database that has been stretched to Azure, existing native backup scripts do not need to change. This sounds great: I can offload a bunch of data to the cloud and nothing has to change from a backup and recovery perspective.

Well, not so fast.

When you do a native SQL backup, SQL Server performs what is called a shallow backup. This means the backup contains only the local data plus the data eligible for migration at the time the backup runs. Eligible data is data that has not yet been migrated, but will be migrated to Azure based on the filter predicate.

So what happens to the data that has already moved to Azure? Azure backs that data up for you automatically, with storage snapshots taken every 8 hours.

See this link for more details.

Restore

So I have a backup, or really backups, of my data; how do I do a restore? Well, it depends... Where is the failure?

If the failure you are recovering from is local,

  1. Restore your local database as you normally would from a native SQL backup.
  2. Run the stored procedure sys.sp_rda_reauthorize_db to reestablish the connection between the local database and the Azure stretch database.

If the failure is in Azure,

  1. Restore the Azure database in the Azure portal
  2. Run the stored procedure sys.sp_rda_reauthorize_db to reestablish the connection between the local database and the Azure stretch database.
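In both cases, the reauthorize step looks roughly like the sketch below; the credential name is a placeholder for whatever credential your database uses to talk to the Azure server:

```sql
USE [WideWorldImporters];
GO
-- Reconnect the restored local database to its remote Azure data.
EXEC sys.sp_rda_reauthorize_db
    @credential = N'<azure_stretch_credential>',
    @with_copy = 0;
```

Setting @with_copy = 1 instead would create a copy of the remote database to connect to, which is one of the details worth testing before you need it.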

While the above looks simple, it can be a bit more involved as you walk through the details here.

Ultimately, Stretch Database is a cool piece of technology that makes you think differently. It forces you to think in terms of a recovery strategy rather than a backup strategy. You have to test how to restore this type of database to understand the steps involved. You want a runbook that shows what you need to do to recover a database that uses Stretch, and you want to plan your backups to most effectively give you a successful recovery. Essentially, you need a recovery strategy, not a backup strategy.


Hmmm, sounds like something Paul Randal has been saying for a while. So basically the same thinking, just reinforced.

DSC Install of SQL Server

It has been a while since I have made a post, and I figured it was long overdue. For my first post in a while, I figured it would be about something I have been working on lately: the automation of installing and configuring SQL Server.

So the installation of SQL Server is now fairly straightforward. The wizard does a nice job of guiding you along the way; 2016 even includes best-practice suggestions for tempdb and instant file initialization. Along the way, Microsoft has given us ways to automate the installation of SQL Server. You can sysprep an instance, but this does not really automate the installation; it just helps create a template of an instance, and at the end of the day you still need to do things manually. You can also use a configuration file to assist here. That is a great step forward, but it does not allow for all of the things you need to do to configure a SQL Server.
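For context, a configuration-file install is just setup.exe pointed at an INI file; the file path below is a placeholder:

```
:: Unattended install driven by a configuration file.
setup.exe /ConfigurationFile=C:\Temp\ConfigurationFile.ini /Q /IAcceptSQLServerLicenseTerms
```

That gets the instance installed, but everything after the install still has to be scripted separately.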

Powershell does. Desired State Configuration (DSC) is functionality built into Powershell that allows for the installation and configuration of a SQL Server.

First, to get started, you need a base understanding of DSC. Without this, the rest of this post will be hard to follow, as DSC can be a difficult topic. The link below will take you to the videos on the Microsoft Virtual Academy site that Jeffrey Snover put together explaining how DSC works.

Getting Started with Powershell Desired State Configuration (DSC)

Also, the modules you will need to make DSC work in your environment can all be downloaded from the Powershell Gallery. I have also found that searching for a module on GitHub returns better documentation on how it works. This has been a great help in my learning of how to get DSC going.

For the example I have put together, we will use Push mode for our DSC script.

The snippet below uses a Powershell construct called a Configuration. This is similar to a Powershell function in its construction and in how it works.

Configuration SQLServerInstall
    {
        param
        (
            [string]$ComputerName,
            [int32]$MAXDOP,
            [int32]$MAXMemory
        )
        Import-DscResource -Module PSDesiredStateConfiguration
        Import-DscResource -Module xSQLServer
        Import-DscResource -Module SecurityPolicyDsc
        Import-DscResource -Module xPendingReboot

        Node $ComputerName
        {
            WindowsFeature NET-Framework-Core
            {
                Name = "NET-Framework-Core"
                Ensure = "Present"
                IncludeAllSubFeature = $true
            }
            xSqlServerSetup InstallSQL
            {
                DependsOn = '[WindowsFeature]NET-Framework-Core'
                Features = $Configuration.InstallSQL.Features
                InstanceName = $Configuration.InstallSQL.InstanceName
                SQLCollation = $Configuration.InstallSQL.SQLCollation
                SQLSysAdminAccounts = $Configuration.InstallSQL.SQLSysAdminAccounts
                InstallSQLDataDir = $Configuration.InstallSQL.InstallSQLDataDir
                SQLUserDBDir = $Configuration.InstallSQL.SQLUserDBDir
                SQLUserDBLogDir = $Configuration.InstallSQL.SQLUserDBLogDir
                SQLTempDBDir = $Configuration.InstallSQL.SQLTempDBDir
                SQLTempDBLogDir = $Configuration.InstallSQL.SQLTempDBLogDir
                SQLBackupDir = $Configuration.InstallSQL.SQLBackupDir

                SourcePath = $Configuration.InstallSQL.SourcePath
                SetupCredential = $Node.InstallerServiceAccount
            }
            xSQLServerNetwork ConfigureSQLNetwork
            {
                DependsOn = "[xSqlServerSetup]InstallSQL"
                InstanceName = $Configuration.InstallSQL.InstanceName
                ProtocolName = "tcp"
                IsEnabled = $true
                TCPPort = 1433
                RestartService = $true
            }
            xSQLServerConfiguration DisableRemoteAccess
            {
                SQLServer = $ComputerName
                SQLInstanceName = $Configuration.InstallSQL.InstanceName
                DependsOn = "[xSqlServerSetup]InstallSQL"
                OptionName = "Remote access"
                OptionValue = 0
            }
            UserRightsAssignment PerformVolumeMaintenanceTasks
            {
                Policy = "Perform_volume_maintenance_tasks"
                Identity = "Builtin\Administrators"
            }
            UserRightsAssignment LockPagesInMemory
            {
                Policy = "Lock_pages_in_memory"
                Identity = "Builtin\Administrators"
            }
            xPendingReboot PendingReboot
            {
                Name = $ComputerName
            }
            LocalConfigurationManager
            {
                RebootNodeIfNeeded = $True
            }
            xSQLServerAlwaysOnService EnableAlwaysOn
            {
                SQLServer = $ComputerName
                SQLInstanceName = $Configuration.InstallSQL.InstanceName
                DependsOn = "[xSqlServerSetup]InstallSQL"
                Ensure = "Present"
            }
            xSQLServerMaxDop SetMAXDOP
            {
                SQLInstanceName = $Configuration.InstallSQL.InstanceName
                DependsOn = "[xSqlServerSetup]InstallSQL"
                MaxDop = $MAXDOP
            }
            xSQLServerMemory SetMaxMemory
            {
                SQLInstanceName = $Configuration.InstallSQL.InstanceName
                DependsOn = "[xSqlServerSetup]InstallSQL"
                MaxMemory = $MAXMemory
                DynamicAlloc = $False
            }
        }
    }

To break down the script and the areas in the Configuration function, the names are thankfully fairly self-explanatory.
WindowsFeature – Ensures that Windows features are present, and initiates the installation if needed
xSqlServerSetup – This section installs SQL Server. If you look at the properties it asks for, they mirror most of the properties in a configuration file
xSQLServerNetwork – This section is used to enable protocols, like TCP, and configure which port SQL Server should use
xSQLServerConfiguration – Used to set values that you would normally set via sp_configure
UserRightsAssignment – Sets values in the Local Security Policy. In this case, I am setting who can Lock Pages in Memory and Perform Volume Maintenance Tasks
xPendingReboot – Checks to see if a reboot is required and reboots if needed
xSQLServerAlwaysOnService – Used to enable the AlwaysOn service
xSQLServerMaxDop – Used to set MAXDOP
xSQLServerMemory – Used to set server memory; in this case, max server memory

Any place you see $Configuration.InstallSQL.*, the value comes from a JSON file. I use this as a parameter file that feeds the script, which allows me to easily reuse it as many times as I need.

When the Powershell script runs, DSC will output a text file with a .mof extension. DSC will then push this configuration file to the server and proceed to configure the server as instructed. Then I use scripts that I have written or found over the years to finish the configuration. Ideally, using Powershell, I can configure the SQL Server any way I would like.
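To tie it together, compiling and pushing the configuration looks roughly like the sketch below. The JSON file name, the parameter values, and the output path are assumptions from my environment; the $Node.InstallerServiceAccount value also needs to come from DSC configuration data, which I am glossing over here.

```powershell
# Load the parameter file that feeds the $Configuration.InstallSQL.* values.
$Configuration = Get-Content -Raw -Path '.\SQLServerInstall.json' | ConvertFrom-Json

# Compile the MOF, apply the LCM settings, then push the configuration.
SQLServerInstall -ComputerName 'sql2016' -MAXDOP 4 -MAXMemory 28672 -OutputPath '.\MOF'
Set-DscLocalConfigurationManager -Path '.\MOF' -Verbose
Start-DscConfiguration -Path '.\MOF' -Wait -Verbose -Force
```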


I have posted my entire set of scripts for installing and configuring SQL Server on GitHub. You can find the scripts here, where you can see the full example. Enjoy.

T-SQL Tuesday #85

For the last T-SQL Tuesday of the year, this month is being hosted by Kenneth Fisher (b|t) and is about backup and recovery.

Any DBA knows that backups are crucial to the job. Without good backups, it is very difficult to ensure that the environment is protected from the dangers that lurk all around. Any DBA knows the BACKUP and RESTORE commands fairly well, and has scripts handy that can do a backup or a restore of a database, ready to go at the slightest whisper of a problem. However, there are newer commands available via Powershell to assist a DBA in getting a backup or a restore completed. These can possibly help you do those tasks faster, as you do not need to wait for SSMS to load. Let's take a look at these commands at a basic level. I am going to use WideWorldImporters and my SQL 2016 instance.

Restore Database

T-SQL

USE [master]
RESTORE DATABASE [WideWorldImporters]
FROM  DISK = N'E:\Backup\WideWorldImporters-Full.bak'
WITH  FILE = 1,
MOVE N'WWI_Primary' TO N'E:\SQLData\WideWorldImporters.mdf',
MOVE N'WWI_UserData' TO N'E:\SQLData\WideWorldImporters_UserData.ndf',
MOVE N'WWI_Log' TO N'E:\SQLLog\WideWorldImporters.ldf',
MOVE N'WWI_InMemory_Data_1' TO N'E:\SQLData\WideWorldImporters_InMemory_Data_1',
NOUNLOAD,
STATS = 5
GO

Powershell

$RelocateData1 = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_Primary', 'E:\SQLData\WideWorldImporters.mdf')
$RelocateData2 = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_UserData', 'E:\SQLData\WideWorldImporters_UserData.ndf')
$RelocateLog = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_Log', 'E:\SQLLog\WideWorldImporters.ldf')
$RelocateData3 = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_InMemory_Data_1', 'E:\SQLData\WideWorldImporters_InMemory_Data_1')
Restore-SqlDatabase -ServerInstance 'sql2016' -Database 'WideWorldImporters' -BackupFile 'E:\Backup\WideWorldImporters-Full.bak' -RelocateFile @($RelocateData1,$RelocateData2,$RelocateData3,$RelocateLog)

Backup Database

T-SQL

BACKUP DATABASE [WideWorldImporters]
TO  DISK = N'E:\Backup\WideWorldImporters-Full.bak'
WITH NOFORMAT,
NOINIT,
NAME = N'WideWorldImporters-Full Database Backup',
SKIP,
NOREWIND,
NOUNLOAD,
STATS = 10
GO

Powershell

Backup-SqlDatabase -ServerInstance 'sql2016' -Database 'WideWorldImporters' -BackupFile  'E:\Backup\WideWorldImporters-Full.bak'

As you can see, these basic commands are very similar. With some additional review and time, automating a backup or restore of a database, or a group of databases, becomes quite easy. Better yet, these are native T-SQL and Powershell statements.

What if you want to go deeper? What if you want to answer the following questions

  1. When was my last database backup?
  2. What is the list of the last backups that were done?
  3. When was the database restored?

DBAs have scripts that will answer these questions relatively quickly using T-SQL. If they do not, they will spend time writing the code themselves, or searching online for a query someone else has already published that answers them. What if you want to do this via Powershell? At this time, there is no native command provided by Microsoft to answer the above questions. However, the SQL community has taken it upon itself to create Powershell commands that answer the above questions and do much more.

Chrissy LeMaire (b|t), a Data Platform MVP out of Belgium, and a few other DBAs and MVPs have teamed up to create a suite of Powershell commands aimed at DBAs. I encourage you to check out their site, https://dbatools.io/, and download their module. The suite covers a bunch of common DBA tasks as well as database migration tasks. Here I will highlight three commands that answer the above questions.

When was my last database backup?

This is a simple question that we as DBAs get asked, and ask, a lot. With the dbatools Powershell module, it can be answered in one line.

Get-DbaLastBackup -SqlServer sql2016 | Out-GridView

The above simple command returns the data below.

capture1

What is the list of the last backups that were done?

This is another question we all need to answer from time to time. Generally, we need to answer it when we need to do a restore and want to know what backup files have been created recently and where they exist.

Get-DbaBackupHistory -SqlServer sql2016 -Databases WideWorldImporters | Out-GridView

The above one-liner returns the output below. It tells me that my latest backups were a transaction log backup and a full backup. From here I can go perform the restore if I so desire.

capture2

What is great about a command like this is that I can take its output, create subsequent restore commands in Powershell, and automate the restore.
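As a sketch of that idea, the history can feed directly into a restore plan. The -Last parameter and the property names here reflect the dbatools version I was using at the time, so check the current documentation:

```powershell
# List the files from the most recent backup chain, oldest first,
# as the input for building RESTORE commands.
Get-DbaBackupHistory -SqlServer sql2016 -Databases WideWorldImporters -Last |
    Sort-Object -Property Start |
    Select-Object -Property Database, Type, Start, Path
```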

When was the database restored?

Lastly, at times we need to know when a database was restored and from what file it was restored. Again, a Powershell one-liner from dbatools comes to the rescue to quickly and efficiently answer the question.

Get-DbaRestoreHistory -SqlServer sql2016 -Databases WideWorldImporters -Detailed | Out-GridView

Here is the output:

capture3

Powershell is a great tool for the DBA to become more efficient in the normal day-to-day work. The commands I have highlighted above just scratch the surface. To learn more about them, please read Chrissy's blog, which explains this functionality in much more depth, with videos.

The team putting together the dbatools suite is doing a great job coming up with commands that are useful to all DBAs. I hope you explore what they have to offer and incorporate the commands into your daily work.