Test Website SSL Certificates Continuously with PowerShell and Pester

One of the most common problems that our teams deal with is ensuring that SSL certificates are working correctly. We’ve all had that urgent call in telling us that the web site is down or some key API or authentication function is offline – only to find out it was caused by an expired certificate.

An easy way to prevent this situation is to set up a task that continuously tests your SSL endpoints (internal and external web apps and sites, REST APIs etc.) and warns you if:

  • The certificate is about to expire (within x days).
  • The SSL endpoint is not using safe SSL protocols (e.g. TLS 1.2).
  • The certificate is not using SHA-256.

This seemed like a good task for Pester (or Operation Validation Framework). So, after a bit of digging around I found this awesome blog post from Chris Duck showing how to retrieve the certificate and SSL protocol information from an SSL endpoint using PowerShell.

Chris’ post contained a PowerShell function, Test-SslProtocol, that does exactly this.

So that was the hard part done, all I needed was to add this function to some Pester tests.
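Chris’ full function is worth reading at the source; a minimal sketch of the same approach (my reconstruction, not his exact code – property names and structure are assumptions) looks like this:

```powershell
function Test-SslProtocol {
    param (
        [Parameter(Mandatory = $true)]
        [string] $ComputerName,

        [int] $Port = 443
    )

    $protocols = @('Ssl2', 'Ssl3', 'Tls', 'Tls11', 'Tls12')
    $result = [ordered]@{ ComputerName = $ComputerName; Port = $Port }
    $certificate = $null

    foreach ($protocol in $protocols) {
        $tcpClient = New-Object System.Net.Sockets.TcpClient($ComputerName, $Port)
        $sslStream = New-Object System.Net.Security.SslStream($tcpClient.GetStream())
        try {
            # Attempt a handshake using only this one protocol version
            $sslStream.AuthenticateAsClient($ComputerName, $null, $protocol, $false)
            $result[$protocol] = $true
            # Capture the remote certificate from any successful handshake
            $certificate = $sslStream.RemoteCertificate
        } catch {
            $result[$protocol] = $false
        } finally {
            $sslStream.Dispose()
            $tcpClient.Dispose()
        }
    }

    if ($certificate) {
        $cert2 = [System.Security.Cryptography.X509Certificates.X509Certificate2]$certificate
        $result.Certificate        = $cert2
        $result.SignatureAlgorithm = $cert2.SignatureAlgorithm.FriendlyName
    }

    [PSCustomObject]$result
}
```

The key idea is looping over the protocol versions and attempting a separate TLS handshake for each, so the returned object reports which protocols the endpoint accepts alongside the certificate details.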

Note: If you are running these tests on an operating system older than Windows 10 or Windows Server 2016 then you will need to install the PowerShell Pester module by running this command in an Administrator PowerShell console:

Install-Module -Name Pester

So after a little bit of tinkering I ended up with a set of tests that I combined into the same file as Chris’ function from earlier. I called the file SSL.tests.ps1. I used the file extension .tests.ps1 because that is the file extension Pester looks for when it runs.

The tests are located at the bottom of the file below the Test-SslProtocol function.
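The tests look something like the following sketch (the endpoint list, property names and the 14-day threshold are mine, chosen to match the results described below; Pester v3 `Should Be` syntax):

```powershell
$WebSites = @('google.com', 'bing.com', 'yahoo.com')

Describe 'SSL endpoint certificates' {
    foreach ($site in $WebSites) {
        Context $site {
            $result = Test-SslProtocol -ComputerName $site -Port 443

            It 'certificate uses SHA-256' {
                $result.Certificate.SignatureAlgorithm.FriendlyName | Should Be 'sha256RSA'
            }

            It 'certificate does not expire within 14 days' {
                ($result.Certificate.NotAfter - (Get-Date)).Days | Should BeGreaterThan 14
            }

            It 'supports TLS 1.2' {
                $result.Tls12 | Should Be $true
            }
        }
    }
}
```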

So, now to test these SSL endpoints all I need to do is open a PowerShell console, change to the folder containing my SSL.tests.ps1 file and run Invoke-Pester:

cd C:\SSLTests\
Invoke-Pester

This is the result:


This shows that the SSL certificates used by google.com, bing.com and yahoo.com are all valid SHA-256 certificates and none of them will expire within the next 14 days.

All I would then need to do is put this in a scheduled task that runs every hour or so and performs some action when the tests fail.
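For example, a scheduled task along these lines (the path, task name and hourly interval are illustrative) would run the tests continuously:

```powershell
# Run the Pester SSL tests every hour via the Windows Task Scheduler
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -Command "Invoke-Pester -Script C:\SSLTests\SSL.tests.ps1"'

$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Hours 1)

Register-ScheduledTask -TaskName 'Test SSL Endpoints' -Action $action -Trigger $trigger
```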

At this point you will still need some mechanism to notify someone when the tests fail. One method is to write an event to the Windows Event Log and then use Microsoft Operations Management Suite (or SCOM) to monitor for this event and send an e-mail or other alert to the appropriate administrators.
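A sketch of that event-log approach (the event source name and event ID are arbitrary choices of mine):

```powershell
# Register the event source once (requires an elevated console)
New-EventLog -LogName Application -Source 'SSLTests' -ErrorAction SilentlyContinue

# Run the tests and capture the results object
$results = Invoke-Pester -Script 'C:\SSLTests\SSL.tests.ps1' -PassThru

# Write an error event if any test failed, for OMS/SCOM to pick up
if ($results.FailedCount -gt 0) {
    Write-EventLog -LogName Application -Source 'SSLTests' -EntryType Error `
        -EventId 1000 -Message "$($results.FailedCount) SSL endpoint test(s) failed."
}
```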

For an example showing how to use OMS to monitor custom events created by failed Pester and OVF tests, see my previous article here.

Potential Improvements

There are a number of ways you could improve this process – our teams have implemented several of them. If you’re considering adopting it, you might also want to consider the following:

  1. Put the Test-SslProtocol function into a PowerShell module that you can share easily throughout your organization.
  2. Put your tests into source control and have the task clone the tests directly from source control every time they are run – this allows tests to be stored centrally and can be change tracked.
  3. Parameterize the tests so that you don’t have to hard code the endpoints to test in the script file. Parameters can be passed into Pester tests fairly easily.
  4. Use something like Jenkins, SCOM or Splunk to run the tests continuously.
  5. Run the tests in an Azure Automation account in the Cloud.
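On improvement 3, Pester (v3/v4) accepts test parameters via a hashtable passed to Invoke-Pester. A sketch, with an assumed `$WebSites` parameter and made-up endpoint names:

```powershell
# SSL.tests.ps1 - accept the endpoint list as a parameter instead of hard-coding it
param (
    [string[]] $WebSites = @('google.com', 'bing.com', 'yahoo.com')
)

# ... Test-SslProtocol function and the tests using $WebSites go here ...

# Elsewhere, invoke the tests against a different set of endpoints:
Invoke-Pester -Script @{
    Path       = '.\SSL.tests.ps1'
    Parameters = @{ WebSites = @('myapp.contoso.com', 'api.contoso.com') }
}
```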

Really, the options for implementing this methodology are nearly limitless. You can engineer a solution that will work for you and your teams, using whatever tools are at your disposal.

At the end of the day, the goal here should be:

  • Reduce the risk that your internal or external applications or websites are using bad certificates.
  • Reduce the risk that an application or website will be deployed without a valid certificate (write infrastructure tests before you deploy your infrastructure – TDD for operations).
  • Reduce the risk you’ll get woken up in the middle of the night with an expired certificate.

So, in this holiday season, I hope this post helps you ensure your certificates won’t expire in the next two weeks and that you won’t get called in to fix a certificate problem when you should be lying on a beach in the sun (in the southern hemisphere, anyway).

Have a good one!


Pester as an Operation Validation Framework

In this latest video on Channel 9, Jeffrey Snover (the grand wizard of PowerShell) suggests what might be on the horizon in Windows Server 2016. He says they are looking at using Pester (or a form of it) to let you create operational validation tests for your servers and environment, so that after any environmental changes are made the environment is validated automatically. This sounds like a fantastic idea to me and such an obvious fit for Pester. After doing a bit of digging around, it seems this idea has been around for a while – see this post here for an example of how it can be used in practice.

Of course, there does feel like a little bit of overlap here with DSC, but I’m sure the implementation will play well with DSC. All of these new ideas and technologies (Nano, Containers, DSC, operational Pester tests etc.) are just more tools in the “Infrastructure as Code” tool belt. So I’m very happy.

I suggest watching the whole video (found here) as it is really interesting, but if you want to just jump to the bit about Pester, it starts at about 11:48. I am really eager to see where Microsoft is going with this stuff in Windows Server 2016. Roll on TP4!

Comparing Objects using JSON in PowerShell for Pester Tests

Recently I spent the good part of a weekend putting together Pester tests (click here if you aren’t familiar with Pester) for my LabBuilder PowerShell module – a module that builds a set of Virtual Machines based on an XML configuration file. In the module I have several cmdlets that take an XML configuration file (sample below) and return an array of hash tables, as well as some hash table properties containing other arrays – basically a fairly complex object structure.

A Pester Test config file for the LabBuilder module


In the Pester Tests for these cmdlets I wanted to ensure the object that was returned exactly matched what I expected. So in the Pester Test I programmatically created an object that matched what the Pester Test should expect the output of the cmdlets would be:

$ExpectedSwtiches = @(
  @{ name="General Purpose External"; type="External"; vlan=$null; adapters=[System.Collections.Hashtable[]]@(
    @{ name="Cluster"; macaddress="00155D010701" },
    @{ name="Management"; macaddress="00155D010702" },
    @{ name="SMB"; macaddress="00155D010703" },
    @{ name="LM"; macaddress="00155D010704" }
  ) },
  @{ name="Pester Test Private Vlan"; type="Private"; vlan="2"; adapters=@() },
  @{ name="Pester Test Private"; type="Private"; vlan=$null; adapters=@() },
  @{ name="Pester Test Internal Vlan"; type="Internal"; vlan="3"; adapters=@() },
  @{ name="Pester Test Internal"; type="Internal"; vlan=$null; adapters=@() }
)

What I needed to do was make sure the objects were the same. At first I tried the Compare-Object cmdlet – this wasn’t useful in this situation because it doesn’t do any sort of deep property comparison. What was needed was to serialize the objects and then perform a simple string comparison. The ConvertTo-Json cmdlet seemed to be just what was needed. I also decided to use the [String]::Compare() method instead of the PowerShell -eq operator because -eq seems to have issues with Unicode strings.

The Pester test that I first tried was:

Context "Valid configuration is passed" {
  $Switches = Get-LabSwitches -Config $Config
  It "Returns Switches Object that matches Expected Object" {
    [String]::Compare(($Switches | ConvertTo-Json),($ExpectedSwtiches | ConvertTo-Json)) | Should Be 0
  }
}

This initially seemed to work, but if I changed any of the object properties below the root level (e.g. the adapter name property) the comparison still reported the objects were the same when they weren’t. The documentation states that the ConvertTo-Json cmdlet has a Depth parameter, defaulting to 2, which limits the depth to which an object structure is converted. In my case the object was actually 4 levels deep, so I needed to add a Depth parameter to the ConvertTo-Json calls:

[String]::Compare(($Switches | ConvertTo-Json -Depth 4),($ExpectedSwtiches | ConvertTo-Json -Depth 4)) | Should Be 0

This then did pretty much exactly what I wanted. However, I also needed the comparison to be case-insensitive, so I added the boolean ignoreCase parameter to the [String]::Compare() call:

[String]::Compare(($Switches | ConvertTo-Json -Depth 4),($ExpectedSwtiches | ConvertTo-Json -Depth 4),$true) | Should Be 0

The end result was a deep object comparison between a reference object and the object returned by the cmdlet being tested. It is by no means perfect: if the properties or the contents of any arrays in the object are out of order, the comparison will report differences. But because we control the format of these objects this shouldn’t be a problem, and it should enable some very strict cmdlet tests.

The final Pester test in Visual Studio 2015 (with POSH tools)

Edit: after writing a number of Pester tests using this approach, I realized it could be simplified slightly by replacing the generation of the comparison object with the actual JSON output produced by the reference object, embedded inline in a variable. For example:

Performing the object comparison using JSON in a variable in the test.

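In code, that looks something like the following (the JSON content and variable name here are illustrative, not taken from the actual LabBuilder tests):

```powershell
# Hypothetical example: expected output captured as JSON in a here-string
$ExpectedSwitchesJson = @'
[
    {
        "name":  "Pester Test Private",
        "type":  "Private",
        "vlan":  null,
        "adapters":  [ ]
    }
]
'@

It "Returns Switches Object that matches Expected JSON" {
    [String]::Compare(($Switches | ConvertTo-Json -Depth 4), $ExpectedSwitchesJson, $true) |
        Should Be 0
}
```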

The JSON can be written by hand (before writing the function itself) to stick to the Test Driven Development methodology, or it can be generated from the object the function being tested creates (once it is working correctly) and then written to a file using:

Set-Content -Path "$($ENV:Temp)\Switches.json" -Value ($Switches | ConvertTo-Json -Depth 4)

The $Switches variable contains the actual object produced by the working command being tested.

A Word of Caution about CRLF

I have noticed that when opening the JSON file in something like Notepad++ and copying the JSON to the clipboard (to paste into my Pester test) that an additional CRLF appears at the bottom. You need to ensure you don’t include this at the bottom of your variable too – otherwise the comparison will fail and the objects will appear to be different (when they aren’t).
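An alternative defensive measure (my suggestion, not part of the original approach) is to trim trailing whitespace from both strings before comparing, so a stray CRLF at the end of the here-string can’t cause a false mismatch – assuming the expected JSON lives in a variable such as $ExpectedSwitchesJson:

```powershell
# TrimEnd() removes any trailing CR/LF from both sides of the comparison
[String]::Compare(
    ($Switches | ConvertTo-Json -Depth 4).TrimEnd(),
    $ExpectedSwitchesJson.TrimEnd(),
    $true) | Should Be 0
```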

This is what the end of the JSON variable definition should look like:

Good JSON CRLF Formatting

And this is what it should not look like (the arrow indicates the location of the extra CRLF that should be removed):

Bad JSON CRLF formatting

Note: I could have used the Export-CliXml and Import-CliXml cmdlets instead to perform the object serialization and comparison, but these cmdlets write the content to disk and also generate much larger strings, which would take much longer to compare and end up with a more complicated test.

Well, hopefully someone else will find this useful!