Pester Your SQL Servers!

In March I did my first presentation at SQL Saturday. I’d say it went very well, and I was encouraged by the attendee feedback. My presentation was on infrastructure testing with the Pester PowerShell module. I didn’t go into depth on the advanced features of Pester (i.e., mocking) but instead focused on how you can write Pester tests to validate SQL Server. You can find a repo of my demo and slides here. Fair warning: my presentation is heavy on demos and talking, with only brief slides.

As far as infrastructure testing goes, here’s a simple example that tests if the SQL Server service is running:

Describe 'SQL Server Services' {
    It 'Should be running' {
        (Get-Service 'MSSQL$SQL1').Status | Should -Be 'Running'
    }
}

And here’s an example of querying SQL internals (it tests whether the max memory setting has been changed from its default value):

$SQLInstance = 'localhost\sql1'

$Query       = `
"SELECT [name], [value] 
FROM sys.configurations
WHERE [name] IN 
('max server memory (MB)')"

$ConfigQueryRes = Invoke-SQLCMD -ServerInstance $SQLInstance -Database 'master' -Query $Query

Describe "Instance Configuration on $SQLInstance" {

    It "Maximum Server Memory should be set" {
        $maxmem = ($ConfigQueryRes | Where-Object {$_.name -eq 'max server memory (MB)'}).value
        $maxmem | Should -Not -Be 2147483647
    }
}

Pester gives you the ability to break up your tests into many test scripts, thus avoiding a single lengthy PowerShell script. In my presentation I spoke about how you could build a library of scripts: one for instance configuration, one for logins, one for jobs, etc. The advantage of Pester is that you can write scripts, run them, and collect results all in a consistent way.

At my present employer, I’ve written well over 1,000 lines of Pester tests, spread across a dozen or so test files. However, running all of those tests is a simple one-liner: a single call to Invoke-Pester.
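As a sketch of what that looks like (the folder path here is hypothetical; point it at your own test library), Invoke-Pester will discover and run every *.Tests.ps1 file under a path:

```powershell
# Run every *.Tests.ps1 file under a test library folder
# and collect the results in a single object.
# 'C:\SQLTests' is a placeholder path, not from the original post.
$results = Invoke-Pester -Path 'C:\SQLTests' -PassThru

# Summarize the run.
"{0} passed, {1} failed" -f $results.PassedCount, $results.FailedCount
```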

Setting an AlwaysOn Secondary Database Owner to SA

I’ve spent some time in a couple AlwaysOn shops, and if you don’t manage it well, you can quickly accumulate secondary databases that are owned by DBA accounts. Naturally, secondaries are read-only, so you can’t simply change the owner to SA. One option is to fail over the AG and change the database owner, but that would require downtime. A better option is to prevent this situation altogether.

To start, SSMS allows you to script out the addition of a database to an AlwaysOn group. If you do so, you’ll find that under the hood the following actions are performed:

  1. Add the database to the AG on primary
  2. Perform a full backup on the primary
  3. Restore the full backup to the secondary
  4. Perform a log backup on the primary
  5. Restore the log backup to the secondary
  6. Begin data movement

We’re going to employ the EXECUTE AS statement, which I haven’t ever used outside of a certification exam… until now. Taking the script that SSMS generates, wrap the first restore statement (the full backup restore) in the EXECUTE AS statement. Something to the tune of:

:Connect ServerTwo

EXECUTE AS LOGIN = 'sa';

-- The database name and backup path are placeholders;
-- use the values from your SSMS-generated script.
RESTORE DATABASE [MyDatabase]
    FROM DISK = N'\\BackupShare\MyDatabase.bak'
    WITH NORECOVERY;

REVERT;

This will work even if the SA login is disabled, but it will require you to be a member of the sysadmin role. Once the database has been restored to the secondary, its owner will be established as SA. I like this solution because it doesn’t require me to run the entire process while logged in as some other login. I can use my usual login and still keep my name off of any audits.
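To confirm the result on the secondary, a quick check of database ownership works; this query is my addition, not from the original post:

```sql
-- List each database with its owning login;
-- the restored secondary should show 'sa'.
SELECT [name], SUSER_SNAME(owner_sid) AS [owner]
FROM sys.databases
ORDER BY [name];
```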

Who Is Using SSRS?

When I inherit an unruly SSRS environment (and they’re all unruly) I want to answer a few questions:

  1. What users are running reports?
  2. What reports are being executed?

The ReportServer database is fairly easy to understand, but the majority of the queries I run hit the ExecutionLog table or one of the ExecutionLog views. Microsoft has changed the schema of the ExecutionLog table over the years. They took the original ExecutionLog schema and created a view out of it, which tells us that there’s some integration point that couldn’t handle the schema change. They’ve also given us several views that perform some useful joins for us. Rather than re-invent the wheel, I almost always go straight to ExecutionLog2.

What users are running reports?

I use this query to quickly find the users that have executed a report in the last year. I change the time filter around, but I like one year in environments where year-end reports are used.

SELECT DISTINCT [UserName]
FROM [dbo].[ExecutionLog2]
WHERE TimeStart >= GETDATE()-365

I use the results of this query to start communicating with the users and explaining any changes I may make to the SSRS environment.
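The second question (which reports are being executed) can be answered from the same view. This query is my addition, not from the original post:

```sql
-- Count executions per report over the last year.
SELECT ReportPath, COUNT(*) AS Executions
FROM [dbo].[ExecutionLog2]
WHERE TimeStart >= GETDATE()-365
GROUP BY ReportPath
ORDER BY Executions DESC;
```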

Mirroring Endpoint Ownership

Creating a mirroring endpoint in SQL Server can sometimes do goofy things with permissions.

I’m facing an issue right now where two bad things are happening when one of our DBAs creates an endpoint:

  1. The creator’s account becomes the owner of the endpoint, which is not uncommon in SQL Server. However, like other ownerships, this will prevent you from dropping the owning login.
  2. The creator’s account is added as an individual login to the instance. Even if that account already has access through a domain group, it still gets added as an individual (bug?). We’ve got to jump through hoops to remove the login and it can’t be removed until the endpoint’s ownership is cleared up.

We can fix it with a one-liner:

-- [YourServiceAccount] is a placeholder; use your own service account.
ALTER AUTHORIZATION
ON ENDPOINT::[MirrorMirrorOnTheWall]
TO [YourServiceAccount];

Set the owner to an account that’s going to be around for a while (NOT A PERSON). If the account you chose doesn’t have sysadmin, you may also need to grant connect privileges on the endpoint:

-- [YourServiceAccount] is a placeholder; use the new endpoint owner.
GRANT CONNECT
ON ENDPOINT::[MirrorMirrorOnTheWall]
TO [YourServiceAccount];

With problem number one solved, we are no longer restricted from dropping the creator’s login (provided the login doesn’t own anything else). We can drop the login, but there’s a way to avoid this whole mess altogether.

Using EXECUTE AS LOGIN, we can create the endpoint as the login that should own it:

EXECUTE AS LOGIN = 'YourServiceAccount'; -- placeholder; use the login that should own the endpoint

    -- The port and role below are placeholders; use your own endpoint options.
    CREATE ENDPOINT MirrorMirrorOnTheWallFixed
        STATE = STARTED
        AS TCP (LISTENER_PORT = 5022)
        FOR DATABASE_MIRRORING (ROLE = ALL);

REVERT;

Creating the owning login and then running the above script will keep things tidy. The endpoint gets created with the owner you determine and no individual logins get added to the instance.
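To verify the result, or to audit existing instances for endpoints still owned by individual accounts, the catalog views can be joined; this query is my addition, not from the original post:

```sql
-- List each endpoint alongside its owning login.
SELECT e.name AS endpoint_name, p.name AS owner_login
FROM sys.endpoints AS e
JOIN sys.server_principals AS p
    ON e.principal_id = p.principal_id;
```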

Generating A Simple Markdown Readme From Powershell Comments

Some of the PowerShell functions and modules I write don’t warrant elaborate commenting. I’ve started using a standard commenting template, and rather than redo that commenting in markdown, I eventually wrote a simple script to generate markdown out of PowerShell comments.

I work in an environment that uses a central git repository, so simple generation of a README.md is an easy win. There’s a free module for generating markdown called platyPS, but it’s a bit overkill for my lowly DBA needs. platyPS generates one markdown file per function. Our git repo can only display one README.md per folder, so having one markdown file that covers multiple functions is useful.

We’ll generate a README.md off the comments within my Oz.psm1 module, which is defined as follows:

function New-Witch {
    <#
    .NOTES
    NAME:    New-Witch
    AUTHOR:  James Livingston
    CREATED: 1939

    .SYNOPSIS
    Creates a new wicked witch.
    #>
    'Hello, my pretty!'
}

function Remove-Witch {
    <#
    .NOTES
    NAME:    Remove-Witch
    AUTHOR:  James Livingston
    CREATED: 1939

    .SYNOPSIS
    Function to kill the wicked witch.
    #>
    'Ohhh, what a world!'
}

Specify the module to read and the output markdown file.

<#
 NAME:    Generate-Markdown.ps1
 AUTHOR:  James Livingston
 CREATED: 1/16/2017

 Reads the comments of a PowerShell module and generates a single markdown file off those comments.
#>
$ModulePath = '.\Oz.psm1'
$OutputFile = '.\README.md'

Import the module, capture the module name, and begin formatting our output string:

Import-Module $modulepath -Force

$modulename = (Split-Path -Path $ModulePath -Leaf).Split('.')[0]

$output = @()
$output += "# $modulename"
$output += ''

Here, I let Get-Command and Get-Help do the heavy lifting. Calling Get-Help on a cmdlet will return the standard comments within that cmdlet’s definition. Comments like .NOTES, .SYNOPSIS, or .DESCRIPTION. Loop through the commands within the module and process the help results:

foreach($cmdlet in (Get-Command -Module $modulename)) {
    $helpresults = Get-Help $cmdlet

    $name   = $helpresults.Name
    $synop  = $helpresults.Synopsis
    $notes  = $helpresults.alertSet.alert[0].Text.Split("`n")

    #Add the results to the output array
    $output += "## $name"
    $output += ''

    #I use multiple lines in .NOTES, so process each line.
    foreach ($line in $notes) {
        #The added '  ' is used to start a new line in markdown.
        $output += ($line + '  ')
    }

    $output += ''
    $output += "$synop"
    $output += ''
}

Finally, write the output array to your markdown file.

$output | Out-File -FilePath $OutputFile -Force -Encoding ascii

And here's the payoff:

# Oz

## New-Witch

NAME: New-Witch
AUTHOR: James Livingston

Creates a new wicked witch.

## Remove-Witch

NAME: Remove-Witch
AUTHOR: James Livingston

Function to kill the wicked witch.

Return Of The Blog!

I’m inspired. I just returned from PASS Summit, and I’ve come back with an unexpected benefit.
I expected to learn. I expected to network. I expected to have a little bit of fun.
What I did not expect was to come home highly motivated. People around me say the motivation doesn’t last forever, but it has lasted long enough for me to start a blog. So for now, I’m here to share.

Looking forward to riding the motivation into the future.