DSC Install of SQL Server

It has been a while since I have made a post, and I figured one was long overdue. For my first post back, I wanted to write about something I have been working on lately: automating the installation and configuration of SQL Server.

So the installation of SQL Server is now fairly straightforward. The wizard does a nice job of guiding you along the way, and 2016 even includes best practice suggestions for tempdb and instant file initialization. Along the way, Microsoft has given us ways to automate the installation of SQL Server. You can sysprep an instance, but this does not really automate the installation; it just helps create a template of an instance, and at the end of the day you still need to do things manually. You can also use a configuration file to assist here. This is a great step forward, but it does not cover everything you need to do to configure a SQL Server.
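
For reference, a bare-bones unattended install driven by a configuration file can be kicked off from the command line. This is only a sketch; the setup media path and the .ini path below are placeholders, not values from my environment.

# A minimal sketch of an unattended, configuration-file-driven install.
# The setup media path and the configuration file path are placeholders.
& 'D:\setup.exe' /ConfigurationFile=C:\Install\ConfigurationFile.ini /IAcceptSQLServerLicenseTerms /Q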

Powershell does. Desired State Configuration (DSC) is functionality built into Powershell that allows for the installation and configuration of a SQL Server.

First, you need a base understanding of DSC. Without it, the rest of this post will be hard to follow, as DSC itself can be difficult to grasp. The link below will take you to the videos on the Microsoft Virtual Academy site that Jeffrey Snover put together explaining how DSC works.

Getting Started with Powershell Desired State Configuration (DSC)

Also, the modules you will need to make DSC work in your environment can all be downloaded from the Powershell Gallery. I have also found that searching for a module on GitHub often turns up better documentation on how it works, which has been a great help in learning how to get DSC going.
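
For example, pulling down the DSC resource modules used in the script later in this post takes one line from the Gallery (PSDesiredStateConfiguration ships with Powershell itself, so it does not need to be installed):

# Install the DSC resource modules used in the script below from the Powershell Gallery.
Install-Module -Name xSQLServer, SecurityPolicyDsc, xPendingReboot -Scope AllUsers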

For the example I have put together, we will use Push mode for our DSC script.

The snippet below uses a Powershell construct called a Configuration, which is similar to a Powershell function in how it is built and how it works.

Configuration SQLServerInstall
    {
        param
        (
            [string]$ComputerName,
            [int32]$MAXDOP,
            [int32]$MAXMemory
        )
        Import-DscResource -Module PSDesiredStateConfiguration
        Import-DscResource -Module xSQLServer
        Import-DscResource -Module SecurityPolicyDsc
        Import-DscResource -Module xPendingReboot

        Node $ComputerName
        {
            WindowsFeature NET-Framework-Core
            {
                Name = "NET-Framework-Core"
                Ensure = "Present"
                IncludeAllSubFeature = $true
            }
            xSqlServerSetup InstallSQL
            {
                DependsOn = '[WindowsFeature]NET-Framework-Core'
                Features = $Configuration.InstallSQL.Features
                InstanceName = $Configuration.InstallSQL.InstanceName
                SQLCollation = $Configuration.InstallSQL.SQLCollation
                SQLSysAdminAccounts = $Configuration.InstallSQL.SQLSysAdminAccounts
                InstallSQLDataDir = $Configuration.InstallSQL.InstallSQLDataDir
                SQLUserDBDir = $Configuration.InstallSQL.SQLUserDBDir
                SQLUserDBLogDir = $Configuration.InstallSQL.SQLUserDBLogDir
                SQLTempDBDir = $Configuration.InstallSQL.SQLTempDBDir
                SQLTempDBLogDir = $Configuration.InstallSQL.SQLTempDBLogDir
                SQLBackupDir = $Configuration.InstallSQL.SQLBackupDir

                SourcePath = $Configuration.InstallSQL.SourcePath
                SetupCredential = $Node.InstallerServiceAccount
            }
            xSQLServerNetwork ConfigureSQLNetwork
            {
                DependsOn = "[xSqlServerSetup]InstallSQL"
                InstanceName = $Configuration.InstallSQL.InstanceName
                ProtocolName = "tcp"
                IsEnabled = $true
                TCPPort = 1433
                RestartService = $true
            }
            xSQLServerConfiguration DisableRemoteAccess
            {
                SQLServer = $ComputerName
                SQLInstanceName = $Configuration.InstallSQL.InstanceName
                DependsOn = "[xSqlServerSetup]InstallSQL"
                OptionName = "Remote access"
                OptionValue = 0
            }
            UserRightsAssignment PerformVolumeMaintenanceTasks
            {
                Policy = "Perform_volume_maintenance_tasks"
                Identity = "Builtin\Administrators"
            }
            UserRightsAssignment LockPagesInMemory
            {
                Policy = "Lock_pages_in_memory"
                Identity = "Builtin\Administrators"
            }
            xPendingReboot PendingReboot
            {
                Name = $ComputerName
            }
            LocalConfigurationManager
            {
                RebootNodeIfNeeded = $True
            }
            xSQLServerAlwaysOnService EnableAlwaysOn
            {
                SQLServer = $ComputerName
                SQLInstanceName = $Configuration.InstallSQL.InstanceName
                DependsOn = "[xSqlServerSetup]InstallSQL"
                Ensure = "Present"
            }
            xSQLServerMaxDop SetMAXDOP
            {
                SQLInstanceName = $Configuration.InstallSQL.InstanceName
                DependsOn = "[xSqlServerSetup]InstallSQL"
                MaxDop = $MAXDOP
            }
            xSQLServerMemory SetMaxMemory
            {
                SQLInstanceName = $Configuration.InstallSQL.InstanceName
                DependsOn = "[xSqlServerSetup]InstallSQL"
                MaxMemory = $MAXMemory
                DynamicAlloc = $False
            }
        }
    }

To break down the resources used in the Configuration, the names are thankfully fairly self-explanatory.
WindowsFeature – Ensures that a Windows feature is present and initiates the installation if needed.
xSqlServerSetup – Installs SQL Server. If you look at the properties it asks for, they mirror most of the properties found in a configuration file.
xSQLServerNetwork – Enables protocols, like TCP, and configures which port SQL Server should use.
xSQLServerConfiguration – Sets values that you would normally set via sp_configure.
UserRightsAssignment – Sets values in the Local Security Policy. In this case, I am setting who can Lock Pages in Memory and Perform Volume Maintenance Tasks.
xPendingReboot – Checks to see if a reboot is required and reboots if needed.
xSQLServerAlwaysOnService – Enables the Always On feature.
xSQLServerMaxDop – Sets MAXDOP.
xSQLServerMemory – Sets the MIN and MAX memory values.

Any place you see $Configuration.InstallSQL.*, the value comes from a JSON file. I use this as a parameter file that feeds the script, which allows me to easily reuse the script as many times as I need.
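
To give a rough idea of what that looks like, here is a trimmed-down sketch of loading the parameter file. The path and the values shown in the comments are placeholders, not my actual environment.

# Read the JSON parameter file into the $Configuration variable referenced in the script above.
$Configuration = Get-Content -Raw -Path 'C:\DSC\SQLServerInstall.json' | ConvertFrom-Json

# The file itself contains something along these lines (values are placeholders):
# {
#     "InstallSQL": {
#         "Features": "SQLENGINE",
#         "InstanceName": "MSSQLSERVER",
#         "SQLCollation": "SQL_Latin1_General_CP1_CI_AS",
#         "SQLSysAdminAccounts": "CONTOSO\\SQLAdmins",
#         "InstallSQLDataDir": "E:\\SQLData",
#         "SQLUserDBDir": "E:\\SQLData",
#         "SQLUserDBLogDir": "E:\\SQLLog",
#         "SQLTempDBDir": "E:\\SQLTempDB",
#         "SQLTempDBLogDir": "E:\\SQLTempDB",
#         "SQLBackupDir": "E:\\Backup",
#         "SourcePath": "\\\\FileServer\\SQLInstall\\SQL2016"
#     }
# }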

When the Powershell script runs, DSC compiles the configuration into a text file with a MOF extension. DSC then pushes this configuration to the server and proceeds to configure it as instructed. From there, I use scripts that I have written or collected over the years to finish the configuration. Ideally, using Powershell, I can configure the SQL Server any way I would like.
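
As a minimal sketch of that flow (the computer name, values, and output path are placeholders, and the $ConfigurationData that supplies the Node settings is not shown), the compile and push look something like this:

# Compile the configuration into MOF files. $ConfigurationData (not shown here) supplies the
# node-level settings, such as InstallerServiceAccount, referenced as $Node.* in the script.
SQLServerInstall -ComputerName 'SQL2016' -MAXDOP 4 -MAXMemory 28672 `
    -ConfigurationData $ConfigurationData -OutputPath 'C:\DSC\SQLServerInstall'

# Apply the LocalConfigurationManager settings (RebootNodeIfNeeded) to the target node.
Set-DscLocalConfigurationManager -Path 'C:\DSC\SQLServerInstall' -Verbose

# Push the configuration to the server and watch it run.
Start-DscConfiguration -Path 'C:\DSC\SQLServerInstall' -Wait -Verbose -Force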

I have posted my entire set of scripts for installing and configuring SQL Server on GitHub. You can find the scripts here, where you can see the full example. Enjoy.

T-SQL Tuesday #85

The last T-SQL Tuesday of the year is being hosted by Kenneth Fisher (b|t) and is about backup and recovery.

Any DBA knows that backups are crucial to their job. Without good backups, it is very difficult to ensure that the environment is protected from the dangers that lurk all around. Any DBA knows the BACKUP and RESTORE commands fairly well and has scripts handy that can back up or restore a database at the slightest whisper of a problem. However, there are newer commands available via Powershell to assist a DBA in getting a backup or a restore completed. These can possibly help get those tasks done faster, as you do not need to wait for SSMS to load. Let's take a look at these commands at their basic level. I am going to use WideWorldImporters and my SQL 2016 instance.

Restore Database

T-SQL

USE [master]
RESTORE DATABASE [WideWorldImporters]
FROM  DISK = N'E:\Backup\WideWorldImporters-Full.bak'
WITH  FILE = 1,
MOVE N'WWI_Primary' TO N'E:\SQLData\WideWorldImporters.mdf',
MOVE N'WWI_UserData' TO N'E:\SQLData\WideWorldImporters_UserData.ndf',
MOVE N'WWI_Log' TO N'E:\SQLLog\WideWorldImporters.ldf',
MOVE N'WWI_InMemory_Data_1' TO N'E:\SQLData\WideWorldImporters_InMemory_Data_1',
NOUNLOAD,
STATS = 5
GO

Powershell

$RelocateData1 = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_Primary', 'E:\SQLData\WideWorldImporters.mdf')
$RelocateData2 = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_UserData', 'E:\SQLData\WideWorldImporters_UserData.ndf')
$RelocateLog = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_Log', 'E:\SQLLog\WideWorldImporters.ldf')
$RelocateData3 = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('WWI_InMemory_Data_1', 'E:\SQLData\WideWorldImporters_InMemory_Data_1')
Restore-SqlDatabase -ServerInstance 'sql2016' -Database 'WideWorldImporters' -BackupFile 'E:\Backup\WideWorldImporters-Full.bak' -RelocateFile @($RelocateData1,$RelocateData2,$RelocateData3,$RelocateLog)

Backup Database

T-SQL

BACKUP DATABASE [WideWorldImporters]
TO  DISK = N'E:\Backup\WideWorldImporters-Full.bak'
WITH NOFORMAT,
NOINIT,
NAME = N'WideWorldImporters-Full Database Backup',
SKIP,
NOREWIND,
NOUNLOAD,
STATS = 10
GO

Powershell

Backup-SqlDatabase -ServerInstance 'sql2016' -Database 'WideWorldImporters' -BackupFile  'E:\Backup\WideWorldImporters-Full.bak'

As you can see, these basic commands are very similar. With some additional review and time, automating the backup or restore of a database, or a group of databases, becomes quite easy. Better yet, these are native T-SQL and Powershell statements.
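
As one illustration of that idea, every user database on an instance can be backed up with a single pipeline. This is a sketch that assumes the SQL Server Powershell module is loaded and a default instance; adjust the provider path for a named instance.

# Back up every user database on the instance in one pipeline.
# The provider path and backup folder are placeholders for your environment.
Get-ChildItem 'SQLSERVER:\SQL\sql2016\DEFAULT\Databases' |
    Backup-SqlDatabase -BackupContainer 'E:\Backup'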

What if you want to go deeper? What if you want to answer the following questions?

  1. When was my last database backup?
  2. What is the list of the last backups that were done?
  3. When was the database restored?

DBAs have scripts that will answer these questions relatively quickly using T-SQL. If they do not, they will spend time writing the code themselves or searching online for a query someone else has already published that answers them. What if you want to do this via Powershell? At this time, there is no native command provided by Microsoft to answer the above questions. However, the SQL community has taken it upon itself to create Powershell commands that answer these questions and do much more.

Chrissy LeMaire (b|t), a Data Platform MVP out of Belgium, and a few other DBAs and MVPs have teamed up to create a suite of Powershell commands aimed at DBAs. I encourage you to check out their site, https://dbatools.io/ , and download their module. The suite covers a bunch of common DBA tasks as well as database migration tasks. Here I will highlight three commands that answer the above questions.

When was my last database backup?

This is a simple question that we as DBAs get asked, and ask ourselves, a lot. With the dbatools Powershell module, it can be answered in one line.

Get-DbaLastBackup -SqlServer sql2016 | Out-GridView

The above command returns the data below.

[Screenshot: Out-GridView output of Get-DbaLastBackup]

What is the list of the last backups that were done?

This is another question we all need to answer from time to time. Generally, we need to answer it when we are about to do a restore and need to know which backup files have been created recently and where they exist.

Get-DbaBackupHistory -SqlServer sql2016 -Databases WideWorldImporters | Out-GridView

This one-liner returns the output below. It tells me that my latest backups were a transaction log backup and a full backup. From here I can go and perform the restore if I so desire.

[Screenshot: Out-GridView output of Get-DbaBackupHistory]

What is great about a command like this is that I can take its output, build the subsequent restore commands in Powershell, and automate the restore, as in the sketch below.
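
Here is a rough sketch of that. I am assuming the history objects expose the backup file path in a Path property and that the -LastFull switch is available in your version of the module, so verify against the actual output before relying on it.

# Grab the most recent full backup of the database from the backup history.
# -LastFull and the Path property are assumptions for this sketch; check your module version.
$LastFull = Get-DbaBackupHistory -SqlServer sql2016 -Databases WideWorldImporters -LastFull

# Feed its file path into a restore. The target instance name is a placeholder.
Restore-SqlDatabase -ServerInstance 'sql2016' -Database 'WideWorldImporters' `
    -BackupFile $LastFull.Path -ReplaceDatabase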

When was the database restored?

Lastly, at times we need to know when a database was restored and from what file. Again, a Powershell one-liner from dbatools comes to the rescue to answer the question quickly and efficiently.

Get-DbaRestoreHistory -SqlServer sql2016 -Databases WideWorldImporters -Detailed | Out-GridView

Here is the output:

[Screenshot: Out-GridView output of Get-DbaRestoreHistory]

Powershell is a great tool for helping the DBA become more efficient in the normal day-to-day work. The commands I have highlighted above just scrape the surface. To learn more about them in depth, please read Chrissy's blog, which explains this new functionality in much more detail and with videos.

The team putting together the dbatools suite is doing a great job coming up with commands that are useful to all DBAs right now. I hope you explore what they have to offer and incorporate it into your daily work.

Backing up on-premise SQL Server databases to Azure

So SQL Server 2016 was officially released in June, and there is a boatload of new features. One of the first new features to review is the ability to back up your databases to Azure. Now, technically, this feature has been around since SQL 2012 SP1 CU2, but the main idea behind SQL Server 2016 is that it is bigger, better, and faster than previous versions. Going with that theme, 2016 has upgraded both the functionality and the performance of this feature. (more…)
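
As a quick taste of what that looks like (a sketch only; the storage account URL and the credential name are placeholders), a backup to an Azure blob container can be run with the same Backup-SqlDatabase cmdlet shown earlier:

# Back up straight to an Azure blob storage container.
# The container URL and the name of the SQL credential are placeholders.
Backup-SqlDatabase -ServerInstance 'sql2016' -Database 'WideWorldImporters' `
    -BackupContainer 'https://mystorageaccount.blob.core.windows.net/backups' `
    -SqlCredential 'AzureBackupCredential'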

T-SQL Tuesday #84

This month’s #tsql2sday is being hosted by Andy Yun (b|t) and is about helping new speakers.

I am taking a shot at my very first T-SQL Tuesday. This month it is all about helping new speakers, which is an interesting topic, as I am a new speaker myself. Just about a year ago I was voluntold to give a lightning talk at the NESQL user group. I was sitting in the back of the room, working away on something, while Mike Hillwig was going through the normal announcements at the start of the meeting. I was not paying close attention, as I was lost in thought. However, I noticed that the room went silent and I felt eyes on me. I looked up and saw Mike staring right at me with a big grin on his face. Then he just said, "…you're up next month…"

(more…)

My First…

This is a post of firsts, and hopefully the beginning of more to come.

I have been a DBA for the past 10 years. Since there is really no way to get formal training in this IT discipline, I, like many others, am self-taught. When I took the position as a DBA, it was after turning down the position initially. My impression of a DBA was that all they did was write reports, which did not seem appealing or challenging in any sense of the word. Once I chose to take the position at the urging of my CIO, I started to look at what a DBA actually did. My first task was to look at job postings on Monster.com, where I reviewed the job descriptions of DBAs at various companies. I started to see a common theme to the tasks and skill sets, and I began reading documentation on how to do basic DBA tasks: backups, job management, security, database design 101. Once I felt comfortable with each of those skills, I was hungry for more and kept taking in as much as I could over the years.

I later stumbled across PASS, which led me to local user groups. I attended the first of many NESQL user group meetings. These are generally one-hour sessions held once a month that cover one or two topics per meeting. The sessions, while informative and providing free pizza, taught me something more important: there were other DBAs out there who suffered from the same problems I did. I got to hear the stories of other DBAs in the same industry, or other industries, with the same struggles, and I got to hear other DBAs offer them solutions. There was a free exchange of knowledge that was very foreign to me. Most people do not like to share what they know, as they feel it gives them an advantage over their colleagues. This was just an awesome concept. I would absorb the knowledge I was hearing, take it back to the office, test it out, and implement it.

Fast forward through a few years and 3 jobs.

I am at a senior level now in my DBA career. However, I am still just as hungry to learn and get better as I was when I started. I still attend the NESQL group, and now I attend RISSUG as well (this one meets in RI at New England Tech, come on down; here is the link). I still love hearing the stories of other DBAs and their issues, trying to figure out how they apply to me, and the solutions that come of them. Now I try to assist and offer a solution rather than just sit on the sideline. I have met some great people in the user groups. These people have become my friends and mentors. Two of them, Mike Hillwig and Andy Mallon, have "volun-told" me to start speaking and blogging. This was never something high on my agenda. First, I am quite introverted, so talking to people and presenting in front of others is not something I like to do. I did my first lightning talk a month later and it went well. I enjoyed the experience, but was glad it was over and that I would not have to do it again. After that presentation, I was pulled aside by Mike and Andy and told, "Now make that into an hour presentation and submit for SQL Saturday Boston". The thought of this was a bit daunting. I had attended many of these events; if you have not and you are a DBA, you should. It is a free day of training from local speakers, Microsoft reps, and MVPs.

Come the next SQL Saturday, I am one of the first speakers of the day. I get to the room and get set up. The room holds maybe 12-15 people, and it was full. This shocked me because these people chose to see my presentation over someone else's, which amped up the pressure I was already feeling. A few minutes later, two MVPs that I look up to, Grant Fritchey and Jes Borland, entered the room and sat in the front row. This, of course, increased the pressure. I took a couple of breaths and settled in. An hour later, it was over and I felt even better than the last time. I got some great feedback from both Grant and Jes, which I later used to refine the topic for later presentations.

As I write this, I am thinking about another first: my first time at the PASS Summit in Seattle. The PASS Summit is an expanded version of a SQL Saturday event. It lasts three days, five if you do the pre-cons, and covers every SQL Server topic you can think of. I had grand intentions to attend every session I could. When I built my schedule, I double- and triple-booked myself in any session that looked remotely interesting. I intended to take in as much knowledge as I could. I was going to be a sponge.

Then I got there. I ended up missing the first half of the first day to take care of some responsibilities in the office. This shift in schedule opened up my morning, and I was able to take advantage of the time to sit with people from Microsoft and get 1-on-1 time asking questions about how Azure and SQL Database work. I must have spent about an hour asking any question I could, and they would just keep giving me answers. It filled in so many gaps I had and in turn gave me so many ideas. The more I asked, the more they answered. It was great. Later, I got great 1-on-1 time with the SQL CAT team to ask how to fix a problem I was having. What was challenging to me was easy for them, and they explained it in a way that made it easy for me to resolve my issue. I took in some sessions the rest of the week, but I stopped worrying about attending every one. I decided to be more practical and take the sessions that truly interested me, figuring I could catch the recordings of anything I missed later.

The other thing that was great about my first time at the Summit was the professional development sessions I took during the week. The only one I had planned to take was Brent Ozar's session on career internals, which was an amazing session on advancing your career. Right after that, I took a session by Edwin Sarmiento on what DBAs get out of blogging. This was also fantastic. The bigger take-away from these sessions, though, was the personal 1-on-1 time I got with both speakers. Brent stood there for at least 30 minutes answering questions from me and others. Edwin did the same with the group that stayed after, and then spent over an hour after that just chatting with me. This 1-on-1 time was very valuable and insightful.

In closing, the significant firsts in my DBA career have been incredible, and I am happy I took those first steps. Taking the initial step of becoming a DBA was the most crucial, as it provided so many more opportunities down the road. I have learned and benefited so much from these firsts, and I have also benefited greatly from the connections I have made. Now, I look to share more by hopefully blogging more and speaking at more SQL Saturdays and user group sessions.