Get the BIOS GUID of a Hyper-V VM

I’ve just spent the last few hours looking into how I can get the BIOS GUID from a Hyper-V VM from inside the Host OS. I needed this so I could use it to pre-stage devices in Windows Deployment Services. I could have used the MAC address of course, but I decided I wanted to use the BIOS GUID instead.

So after a fair bit of hunting all I could turn up was an older VBS script. I decided this wasn’t ideal and so went about investigating how I might do this in PowerShell (this is a PowerShell blog mainly after all). Well after a few minutes I came up with this (rather long) command:

$VMName = 'My VM'
(Get-CimInstance -Namespace Root\Virtualization\V2 -ClassName Msvm_VirtualSystemSettingData -Filter "ElementName = '$VMName'").BiosGUID

It uses WMI/CIM, but does seem to work nicely (don’t forget to set the name of the VM):
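If you use this often, the one-liner above can be wrapped in a small function. This is just a convenience sketch of my own (the function name is mine, not part of any module) and it must be run on the Hyper-V host:

```powershell
# Hypothetical wrapper around the WMI/CIM query above - run on the Hyper-V host.
function Get-VMBiosGuid {
    param (
        [Parameter(Mandatory = $true)]
        [String] $VMName
    )
    # Query the Hyper-V virtualization namespace for the VM's settings object
    (Get-CimInstance -Namespace Root\Virtualization\V2 `
        -ClassName Msvm_VirtualSystemSettingData `
        -Filter "ElementName = '$VMName'").BiosGUID
}

Get-VMBiosGuid -VMName 'My VM'
```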


Good night!



Nano Server TP4

Just a quick one for Friday. After downloading the new Windows Server 2016 TP4 ISO, I quickly fired up my New-NanoServerVHD script to see how it went. Unfortunately I ran straight into a bug in the Convert-WindowsImage script. The bug in this script only occurs when the WIM file being converted only contains a single image – which as of TP4 includes the NanoServer.wim.

If you try to run the New-NanoServerVHD script using the unfixed version of the Convert-WindowsImage script and TP4, you’ll run into the following error message:

ERROR  : The variable cannot be validated because the value $null is not a valid value for the Edition variable


So, after reporting the error to the original script creator I went ahead and fixed the problem myself and uploaded a working version to GitHub (until it has been fixed in the official version). You can download my fixed version from here.

Installing Nano Server TP4

So, after fixing the bug in the Convert-WindowsImage.ps1 file, here are some updated instructions on using this script to quickly create a new Nano Server TP4 VHD or VHDx.

Create a New Nano Server VHD

It is fairly straight forward to install and use:

  1. Create a Working Folder on your computer. In the case of this example I used c:\Nano.
  2. Download the New-NanoServerVHD.ps1 to the Working Folder.
  3. Download the Convert-WindowsImage.ps1 (download here) to the Working Folder.
  4. Download the Windows Server 2016 Technical Preview ISO (download here) to the Working Folder.
  5. Open an Administrative PowerShell window.
  6. Change directory to the Working Folder (cd c:\nano).
  7. Execute the following command (customizing the parameters to your needs):
.\New-NanoServerVHD.ps1 `
-ServerISO 'c:\nano\10586.0.151029-1700.TH2_RELEASE_SERVER_OEMRET_X64FRE_EN-US.ISO' `
-DestVHD c:\nano\NanoServer01.vhdx `
-VHDFormat VHDX `
-ComputerName NANOTEST01 `
-AdministratorPassword 'P@ssword!1' `
-Packages 'Containers','OEM-Drivers','Guest','IIS','DNS' `
-IPAddress ''
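Once the VHDX has been produced, you can boot it straight away in a new VM to check it. This is just a sketch of what that might look like – the VM name, memory and switch name below are assumptions you’ll want to adjust for your own host:

```powershell
# Sketch: boot the newly created VHDX in a Generation 2 Hyper-V VM.
# 'NANOTEST01', 1GB and 'External' are example values - adjust for your host.
New-VM -Name 'NANOTEST01' -MemoryStartupBytes 1GB -Generation 2 `
    -VHDPath 'c:\nano\NanoServer01.vhdx' -SwitchName 'External'
Start-VM -Name 'NANOTEST01'
```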

Available Packages in TP4

There are a bunch of new packages that are now available in TP4 for integrating into your Nano Server builds. I’m not quite sure of the exact purpose of some of them, but I’ve listed them here:

  • Compute: Hyper-V Server
  • OEM-Drivers: Standard OEM Drivers
  • Storage: Storage Server
  • FailoverCluster: FailOver Cluster Server
  • ReverseForwarders: ReverseForwarders to allow some older App Servers to run
  • Guest: Hyper-V Guest Tools
  • Containers: Support for Hyper-V and Windows containers
  • Defender: Windows Defender
  • DCB: Unsure
  • DNS: DNS Server
  • DSC: PowerShell Desired State Configuration Support
  • IIS: Internet Information Server (Web Server)
  • NPDS: Unsure
  • SCVMM: System Center VMM
  • SCVMM-Compute: System Center VMM Compute

Over and out.


PowerShell Language Support in Visual Studio Code

I feel like my birthday has come early: Microsoft has added PowerShell language support in Visual Studio Code:


I mainly use PowerShell ISE with ISE Steroids for most of my coding, but I find when I’m needing to switch between working on multiple files of different types Visual Studio Code is so much faster and slicker. Visual Studio 2015 with PoSH tools would be the ideal way to go, but the PoSH extension is quite slow with any files larger than 20KB. So I’m really excited with what this can do.


Windows 10 Build 10586 – PowerShell Problems

PowerShell Direct Broken

Unfortunately my Sunday afternoon of planned study has been slightly derailed, as last night I just upgraded my primary work machine to Windows 10 Build 10586. Everything went fine with the upgrade and seemed to be working just perfectly. However, when I started to get to work with my Hyper-V lab machines today I ran into a significant bug with PowerShell on this build:

PowerShell Direct no longer connects to any of my VMs. It pauses for approximately 30 seconds and then reports a rather mysterious error:

An error has occurred which Windows PowerShell cannot handle. A remote session might have ended.

But it was working yesterday!

Passing correct or incorrect credentials for the VM has no effect – the error message is always the same.

Fortunately connecting via plain old PowerShell Remoting still works fine, but I have many scripts that are dependent on using PowerShell Direct that are no longer functioning – including my LabBuilder project, which I use every day to get my Hyper-V lab up and running for my studies.
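For anyone unfamiliar with the difference, the two connection methods look almost identical – the only change is which parameter identifies the target. The VM name below is just an example:

```powershell
# PowerShell Direct - connects via the VMBus by VM name, no network needed.
# This is what broke for me on build 10586:
Enter-PSSession -VMName 'NANOTEST01' -Credential (Get-Credential)

# Plain PowerShell Remoting - connects over the network, still works fine:
Enter-PSSession -ComputerName 'NANOTEST01' -Credential (Get-Credential)
```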

I’ve logged this issue in Windows Connect right here. So if you’re being affected by this issue please go and vote it up to see if it can’t be resolved quickly. This was a fantastic feature that was only added recently and it is sad to see it being broken so soon!

Encrypting Credentials in DSC MOF Files

Anyone who has put properly encrypted credentials in DSC configuration files knows that they need to be encrypted using a certificate that was issued to the Node that will be reading the configuration. This is usually fairly straight forward, but some care needs to be taken when generating the certificate for the Node to use.

I have an automated process designed for my Lab where any new VMs that get built are automatically issued with a self-signed certificate that will be used to encrypt the DSC config files. This certificate is automatically downloaded to the host (via PowerShell Direct of course) and then used to encrypt the DSC config files for that node. All completely seamless and automatic. Until build 10586, that is, when these certificates could no longer be used to encrypt the MOF file. Instead I get this error:

ConvertTo-MOFInstance : System.ArgumentException error processing property 'Password' OF TYPE 'MSFT_Credential': Certificate '8E474886A6AA72859BDC3C2FBEEFAAD7E089A5DD' cannot be used for encryption. Encryption certificates 
must contain the Data Encipherment or Key Encipherment key usage, and include the Document Encryption Enhanced Key Usage (

Ok, so this isn’t the end of the world and it is pretty clear what has changed here. After looking at my existing self-signed certificates, I found they didn’t include the EKU (Enhanced Key Usage) of Document Encryption.


My previous certificates – now useless because the Document Encryption EKU is missing.

It seems this is now required to encrypt credentials in MOF Files. I guess this makes sense and I’m sure not that many people are going to run into the problem. But in case you do, you’ll need to reissue these certificates including the following EKU:

Document Encryption (

You also need to ensure the Key Usage contains either Data Encipherment or Key Encipherment.
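A quick way to check whether a certificate in the local store is suitable is to filter on the EKU friendly name. This is just a sketch – the store path and filter are assumptions, so adjust them to wherever your DSC certificates live:

```powershell
# Sketch: list machine certificates that include the Document Encryption EKU,
# to verify a reissued certificate will be accepted for MOF encryption.
Get-ChildItem -Path Cert:\LocalMachine\My |
    Where-Object { $_.EnhancedKeyUsageList.FriendlyName -contains 'Document Encryption' } |
    Select-Object -Property Thumbprint, Subject
```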

Finally, I have one strong recommendation related to the topic of encrypting DSC credentials: Don’t use the built-in PowerShell cmdlet New-SelfSignedCertificate to create self-signed certificates for this purpose. It creates certificates that are not compatible for other reasons (I won’t go into detail but you can look the issue up I’m sure). Instead I strongly recommend you use this script on MSDN Script Center.

Decryption Failed

Update 2015-12-18: Installing Windows Management Framework (WMF) 5.0 RTM on the DSC node resolves the Decryption Failed error described below. So if you’re experiencing this issue, install this update on any DSC nodes experiencing this problem.

Edit: Karl in his comment on this post mentioned a problem he was having where the DSC node was failing to decrypt any credentials provided in DSC MOF files created on build 10586. He was receiving a Decryption Failed error when the MOF was being applied to the node:


I hadn’t noticed this issue because I hadn’t been working on DSC for a week, but when I tried to apply a rebuilt MOF file I experienced the same issue.

It was also reported that the password property format of the MSFT_Credential object in the MOF seems to have changed in one of the recent releases from:

instance of MSFT_Credential as $MSFT_Credential2ref
  Password = "...Base64Password...";
  UserName = "LABBUILDER.COM\\Administrator";

to:

instance of MSFT_Credential as $MSFT_Credential2ref
  Password = "-----BEGIN CMS-----\n...base64Password...\n-----END CMS-----";
  UserName = "LABBUILDER.COM\\Administrator";

After a full day of investigating this issue, I can confirm it has been caused by a change that Microsoft has made in the PSDesiredStateConfiguration module supplied with this build (specifically in the Get-EncryptedPassword function, in case you’re interested). This issue has been reported to Microsoft on PowerShell Connect here. Please go and upvote it if you’re having this problem (even if you’re not). Karl has also posted this issue on Stack Overflow.

In the meantime I have posted a workaround (roll back to a previous version of the PSDesiredStateConfiguration module) on the Stack Overflow page – so if you want to work around this problem, go and take a look.

DSC is Practically Broken in 10586

Update: After finishing this last post I’ve run into some critical problems with DSC on build 10586. Specifically, when I build a DSC configuration on this machine and include the PSDesiredStateConfiguration resource (by way of Import-DSCResource cmdlet) the MOF file that is created references a 1.0 version of the module – which doesn’t exist:


Version 1.0 isn’t on the machine!

Applying the MOF file to any node immediately throws an error because, of course, this module version doesn’t exist (1.1 is the earliest version of this module that is on any of the nodes).

However, if I force the module version to 1.1 in the Import-DSCResource cmdlet then the MOF file that is created has the correct module version and can be applied to the node without any issue:


Forcing the Module Version.

But of course going through all my config files and forcing the module version to 1.1 is a very unsatisfactory solution. Also, I’m not sure if it is just the PSDesiredStateConfiguration resource that has this problem or all modules. I haven’t had time to investigate this further yet.
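For reference, this is roughly what pinning the version looks like in a configuration. The configuration and node names are just placeholders for illustration:

```powershell
Configuration VersionPinned {
    # Explicitly pinning the module version stops the generated MOF from
    # referencing the non-existent 1.0 version of the module on build 10586
    Import-DscResource -ModuleName PSDesiredStateConfiguration -ModuleVersion 1.1

    Node 'localhost' {
        # A trivial resource just so the configuration compiles to a MOF
        Log ExampleLog {
            Message = 'Module version pinned to 1.1'
        }
    }
}
```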

If you are suffering from any of these issues in build 10586, please let me know.

Thanks for reading!

Pester as an Operation Validation Framework

In this latest video on Channel 9, Jeffrey Snover (the grand wizard of PowerShell) suggests what might be on the horizon in Windows Server 2016. In it he says they are looking at using Pester (or a form of it) to allow you to create Operational Validation tests for your servers and environment, so that after any environmental changes are made the environment is validated automatically. This sounds like a fantastic idea to me and such an obvious fit for Pester. After doing a bit of digging around it seems like this idea has been around for a while – see this post here for an example of how it can be used in practice.
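To give a flavor of what an operational validation test might look like, here is a hypothetical sketch in current Pester (v3) syntax – the test names and the DNS service are my own example, not anything from the video:

```powershell
# Hypothetical operational validation tests for a DNS server role.
Describe 'DNS Server Operational Tests' {
    It 'The DNS service should be running' {
        (Get-Service -Name DNS).Status | Should Be 'Running'
    }
    It 'The server should resolve its own hostname' {
        { Resolve-DnsName -Name $env:COMPUTERNAME -ErrorAction Stop } |
            Should Not Throw
    }
}
```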

Of course there does feel like there is a little bit of an overlap here with DSC, but I’m sure the implementation will play well with DSC. All of these new ideas and technologies (Nano, Containers, DSC, Operational Pester tests etc.) are just more tools in the “Infrastructure as Code” tool belt. So I’m very happy.

I suggest watching the whole video (found here) as it is really interesting, but if you want to just jump to the bit about Pester, it starts at about 11:48. I am really eager to see where Microsoft is going with this stuff in Windows Server 2016. Roll on TP4!

WFAS Firewall Rules Group vs. DisplayGroup

Recently I’ve been helping resolve a couple of issues with the behavior of the xFirewall resource in the xNetworking DSC Module. One of these was trying to implement both the Group and DisplayGroup parameters when creating a new Firewall Rule.


A Firewall Rule Group/Display Group.

The obvious assumption is that the Group and DisplayGroup parameters behave in a similar fashion to the Name and DisplayName parameters. Unfortunately, this is not the case and it took me quite a lot of digging to figure out how they actually work.

Why is this Important?

This became an issue for me while trying to determine how to best implement these parameters in the xFirewall resource. It became clear that it is not actually possible to set the DisplayGroup parameter using any of the *-netfirewallrule cmdlets. So adding this parameter to the resource caused a lot of problems and was also causing quite a bit of confusion, especially as the relationship between Group and DisplayGroup is not clearly documented anywhere I could find.

You can’t even set the DisplayGroup parameter via NETSH or in the WFAS (Windows Firewall with Advanced Security) UI. In fact the WFAS UI only shows the DisplayGroup, and it is labeled as Group – the actual Group is hidden completely:


WFAS UI showing the DisplayGroup (labeled as Group)

The Relationship

So, onto the actual reason for this post: What is the relationship between Group and DisplayGroup and how does DisplayGroup get set? I’m hoping this will help someone else who is trying to understand this undocumented behavior.

There are two possible values that the DisplayGroup can have and these can never be set directly. They are:

  1. If Group does not start with an @ then the DisplayGroup will be the same as Group.
  2. If Group starts with an @ then the DisplayGroup will be a value pulled from a DLL or Windows Universal App resource. This creates the rule as a predefined rule.

For example, here is the DisplayGroup being pulled from a DLL:


And here is one being pulled from a Universal App Resource:


So, what is the take away from all this?

The DisplayGroup can’t be set manually by you and will never be different to the Group unless the Group starts with an @.
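The first case is easy to demonstrate with the NetSecurity cmdlets. The rule name, group and port below are just example values:

```powershell
# Create a rule with a plain (non-@) Group - the DisplayGroup ends up
# identical to the Group, since it can never be set directly.
New-NetFirewallRule -DisplayName 'My Test Rule' -Group 'My Test Group' `
    -Direction Inbound -Action Allow -Protocol TCP -LocalPort 8080

# Inspect the resulting DisplayGroup - it matches the Group
(Get-NetFirewallRule -DisplayName 'My Test Rule').DisplayGroup
```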

Note: A predefined rule will prevent most settings of the rule from being changed in the WFAS UI, although they still can be changed in PowerShell:


A predefined rule can’t be changed via the WFAS UI.

And one final piece of info: The Group (and by extension the DisplayGroup) can’t be changed for an existing rule. You can only change it by deleting and recreating the rule!

How to tell if a PowerShell variable has not been Declared

In most situations in PowerShell, I am really only interested if a variable has a value or not (e.g. not null). Checking for this is easy:

if ($null -eq $MyVariable) {
    Write-Host -Object '$MyVariable is null'
} else {
    Write-Host -Object '$MyVariable has a non-null value'
}

But what if I want to know if a variable is not declared at all? The method of doing that is not so obvious and being PowerShell there are many ways of doing it. By far the clearest I think is to use the Test-Path cmdlet using the Variable provider:

if (Test-Path -Path Variable:\MyVariable) {
    Write-Host -Object '$MyVariable is declared'
} else {
    Write-Host -Object '$MyVariable is not declared'
}

If there is a cleaner or officially recommended way of doing this I’d be most keen to hear about it.
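One alternative I’m aware of is Get-Variable, which returns the PSVariable object even when the variable’s value is $null, and returns nothing (or throws) when it isn’t declared at all. A sketch:

```powershell
# Get-Variable returns a PSVariable object for any declared variable
# (even one whose value is $null); with -ErrorAction SilentlyContinue it
# returns nothing when the variable is not declared.
if (Get-Variable -Name MyVariable -ErrorAction SilentlyContinue) {
    Write-Host -Object '$MyVariable is declared'
} else {
    Write-Host -Object '$MyVariable is not declared'
}
```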

File Server Resource Manager (FSRM) Classifications DSC Resource


I’ve been spending a bit of time lately working on some issues and improvements on the xNetworking DSC Resource so haven’t been spending as much time working on the series of File Server Resource Manager (FSRM) DSC Modules as I’d like. That said, I have managed to complete another module. This one is used for configuring Classification Properties and Property Values, Classification Configuration and Classification Rules.

If you missed any of the previous FSRM DSC Modules:


This module contains the following resources:

  • cFSRMClassification: configures FSRM Classification settings.
  • cFSRMClassificationProperty: configures FSRM Classification Property Definitions.
  • cFSRMClassificationPropertyValue: configures FSRM Classification Property Definition Values. This resource only needs to be used if the Description of a Classification Property Definition Value must be set.
  • cFSRMClassificationRule: configures FSRM Classification Rules.

The purpose of the resources should be fairly self-explanatory, as long as you have a basic understanding of how FSRM Classifications are used.

Installing the Resource

If you have installed WMF 5.0 you can just download this directly from the PowerShell Gallery by running this command:

Install-Module -Name cFSRMClassifications

Otherwise you’ll need to download this from the Microsoft Script Center here and unzip it into your PowerShell modules path.
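Either way, once installed you can confirm the resources are visible to DSC. This is just a quick sanity-check sketch (WMF 5.0):

```powershell
# Sketch: list the DSC resources the cFSRMClassifications module exposes,
# to confirm the module landed in a valid modules path.
Get-DscResource -Module cFSRMClassifications |
    Select-Object -Property Name, Version
```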

Using the Resource

As per the last post on these resources, rather than go into detail on using this resource, I thought I’d try and keep it short and just provide a link to the documentation. This covers the parameters available in the resources as well as some usage examples.

If you need some additional guidance or other specific examples, please feel free to let me know and I’ll do my best to help you out.

Hopefully this resource finds some use out there, but either way it has been extremely helpful to me in really imprinting the underlying FSRM features and usage into my own mind.


If you’re interested in contributing to this resource, providing feedback or raising issues or requesting features, please feel free (anything is appreciated). You’ll find the resource GitHub repository here where you can fork, issue pull requests and raise issues/feature requests.