Advanced Auditing with PowerShell Desired State Configuration (DSC)

Greetings interweb. It’s been a while, but I’m finally back with a new video.

This video focuses on Desired State Configuration (DSC) and how to configure Advanced Auditing using DSC. You can certainly configure Advanced Auditing using a group policy object (GPO), but my use case is a public-facing web server that sits in a DMZ outside the trusted side of your network. In that case, you probably don’t want that public-facing server on your domain, but you still want to audit it, patch it, and run all of the other security processes you would run on any other server on the trusted side of the network.

For this exercise we will install DSC on a domain-joined server (the Pull Server) and configure the public-facing server in the DMZ to pull its configuration from the Pull Server.

Part 1: The Pull Server

All of this code is executed on the domain-joined Pull server.

Step 1: Install the required DSC modules; we need the three below. Install-Module pulls all of them from the PowerShell Gallery.

These two are for DSC itself:

Install-Module -Name PSDscResources

Install-Module -Name xPSDesiredStateConfiguration


This module allows us to configure Advanced Auditing with DSC:

Install-Module -Name AuditPolicyDsc

Step 2: You need a certificate on the Pull Server and on any client that will connect to it. The certificate is the mechanism used to authenticate the client, since the client is not in the domain.

In my case, I just installed Certificate Services on my Pull Server. It’s literally the default installation; all you need is a Root CA. You could also manually create a certificate without installing Certificate Services, but I found that more problematic than it was worth. Certificate Services was the path of least resistance for me.
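If you do want to try the manual route instead, a self-signed certificate is one line of PowerShell. This is just a sketch; the DNS name is a placeholder for your Pull Server’s name:

```powershell
# Create a self-signed SSL certificate in the local machine store.
# 'dscpull.contoso.local' is a placeholder - use your Pull Server's name.
New-SelfSignedCertificate -DnsName 'dscpull.contoso.local' `
    -CertStoreLocation 'Cert:\LocalMachine\My'
```

Keep in mind the clients will need to trust whatever certificate you use, which is why a Root CA ends up being less hassle.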

Certificate Services Step by Step:

Step 3: Now we need to export the certificate so we can import it on the DSC client side. You’ll need to import the certificate on any non-domain DSC client. I changed some of the information to fit my naming scheme; other than that, the code is almost identical to the original.
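The export/import boils down to something like this sketch (the subject filter and file paths are placeholders for my naming scheme):

```powershell
# On the Pull Server: find the certificate and export its public key to a .cer file
$cert = Get-ChildItem -Path Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like '*dscpull*' }   # placeholder subject filter
Export-Certificate -Cert $cert -FilePath 'C:\temp\DscPull.cer'

# On the non-domain client: import the .cer into the trusted root store
Import-Certificate -FilePath 'C:\temp\DscPull.cer' `
    -CertStoreLocation 'Cert:\LocalMachine\Root'
```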

Step 4: Now we can install the DSC Pull Server on the Pull Server. Almost all of this code is identical to what you will find on MSDN, with a few modifications to match my naming scheme. See the video above for a more detailed explanation of what’s going on here.
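The MSDN sample boils down to something like the sketch below. The paths are the DscService defaults; the registration key is any GUID you generate, and the certificate thumbprint comes from the certificate you set up in Step 2:

```powershell
configuration SetupDscPullServer {
    param (
        [string]$NodeName = 'localhost',
        [string]$CertificateThumbPrint,   # thumbprint of the cert from Step 2
        [string]$RegistrationKey          # any GUID; clients present this to register
    )
    Import-DscResource -ModuleName xPSDesiredStateConfiguration

    Node $NodeName {
        WindowsFeature DSCServiceFeature {
            Ensure = 'Present'
            Name   = 'DSC-Service'
        }

        xDscWebService PSDSCPullServer {
            Ensure                   = 'Present'
            EndpointName             = 'PSDSCPullServer'
            Port                     = 8080
            PhysicalPath             = "$env:SystemDrive\inetpub\PSDSCPullServer"
            CertificateThumbPrint    = $CertificateThumbPrint
            ModulePath               = "$env:ProgramFiles\WindowsPowerShell\DscService\Modules"
            ConfigurationPath        = "$env:ProgramFiles\WindowsPowerShell\DscService\Configuration"
            State                    = 'Started'
            UseSecurityBestPractices = $true
            DependsOn                = '[WindowsFeature]DSCServiceFeature'
        }

        File RegistrationKeyFile {
            Ensure          = 'Present'
            Type            = 'File'
            DestinationPath = "$env:ProgramFiles\WindowsPowerShell\DscService\RegistrationKeys.txt"
            Contents        = $RegistrationKey
        }
    }
}

# Compile the MOF, then push it to configure the Pull Server itself
SetupDscPullServer -CertificateThumbPrint '<thumbprint>' -RegistrationKey (New-Guid).Guid
Start-DscConfiguration -Path .\SetupDscPullServer -Wait -Verbose
```

Hang on to the registration key; the clients will need it later.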

Once the DSC installation is complete, check the web address for the Pull Server. It should be https://&lt;your server name&gt;:8080/PSDSCPullServer.svc/. If you get some XML output, your Pull Server is good to go… theoretically.

Step 5: Create the MOF files for the DSC client. This code is pretty much the same as the example in the module’s GitHub repo.
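The AuditPolicyDsc example looks roughly like this sketch; the CSV path and output path are placeholders for my environment:

```powershell
configuration AuditPolicyFromCsv {
    Import-DscResource -ModuleName AuditPolicyDsc

    Node 'localhost' {
        # Apply an entire audit policy from a CSV in auditpol backup format
        AuditPolicyCsv AuditPolicy {
            IsSingleInstance = 'Yes'
            CsvPath          = 'C:\scripts\AuditPolicy.csv'   # placeholder path
        }
    }
}

# Compile the MOF to a folder of your choosing
AuditPolicyFromCsv -OutputPath 'C:\DSC\MOF'
```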

Step 6: We need to package the MOFs and DSC resources and place them in the correct file structure on the Pull Server so the clients can pull what’s needed. The $MOFPath in this script is where you created the MOF files in the script above; you may need to change it if you used a different location.
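The packaging step amounts to copying the compiled MOFs into the Pull Server’s configuration folder and generating a checksum beside each one. A sketch, assuming the default DscService paths:

```powershell
$MOFPath    = 'C:\DSC\MOF'   # where the MOFs were compiled (placeholder)
$ConfigPath = "$env:ProgramFiles\WindowsPowerShell\DscService\Configuration"

# Copy the compiled MOFs to the Pull Server's configuration folder
Copy-Item -Path "$MOFPath\*.mof" -Destination $ConfigPath

# Generate the .checksum files clients use to detect configuration changes
New-DscChecksum -Path $ConfigPath -Force
```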

DSC Pull Server mission complete!

Part 2: The DSC Client

Step 1: Configure the Local Configuration Manager on the client. Before you run the code below, it’s worth testing connectivity to the Pull Server website. In my case, I use the IP address of the Pull Server because the client is not on the domain, so DNS is not going to work unless you add a manual entry to the Hosts file on the client.
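A sketch of the LCM meta-configuration; the IP address, registration key, and configuration name are placeholders for my environment:

```powershell
# Quick connectivity test first - you should get XML back:
# Invoke-WebRequest -Uri 'https://192.168.1.10:8080/PSDSCPullServer.svc/'

[DSCLocalConfigurationManager()]
configuration PullClientConfig {
    Node 'localhost' {
        Settings {
            RefreshMode        = 'Pull'
            ConfigurationMode  = 'ApplyAndAutoCorrect'
            RebootNodeIfNeeded = $true
        }

        ConfigurationRepositoryWeb PullServer {
            ServerURL          = 'https://192.168.1.10:8080/PSDSCPullServer.svc'  # placeholder IP
            RegistrationKey    = '<registration key GUID from the Pull Server>'
            ConfigurationNames = @('AuditPolicyFromCsv')   # placeholder config name
        }
    }
}

# Compile the meta-MOF and apply it to the local LCM
PullClientConfig -OutputPath 'C:\DSC'
Set-DscLocalConfigurationManager -Path 'C:\DSC' -Verbose
```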

Step 2: Run Get-DscConfiguration; if it comes back OK, the client is properly registered with the Pull Server.

Step 3: Now that the Pull Server and the client are working, we need to figure out the format of the CSV file we will use to configure Advanced Auditing. See the video to understand how I figured out the format; my CSV is linked below.
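For reference, the CSV uses the same format auditpol itself produces when you run `auditpol /backup /file:<path>`. A sketch with one example row (the Logon subcategory, set to Success and Failure):

```csv
Machine Name,Policy Target,Subcategory,Subcategory GUID,Inclusion Setting,Exclusion Setting,Setting Value
,System,Audit Logon,{0CCE9215-69AE-11D9-BED3-505054503030},Success and Failure,,3
```

The easiest way to build your own is to configure the policy once by hand and back it up with auditpol, rather than typing the GUIDs out yourself.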

Step 4: Once you have the CSV file complete and in the correct file location, run the following to refresh the DSC configuration.
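On the client, the refresh is a single cmdlet:

```powershell
# Pull and apply the latest configuration from the Pull Server immediately
Update-DscConfiguration -Wait -Verbose
```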

Run this to verify the audit policy is exactly the way you configured it in the CSV.
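The built-in auditpol utility shows the effective policy, which you can eyeball against the CSV:

```powershell
# Dump the effective audit policy for every category/subcategory
auditpol /get /category:*
```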

Hope you enjoyed that tutorial… may the PowerShell force be with you all. 🙂





Convert text files to PDF then merge PDFs in bulk with PowerShell and iTextSharp

To demonstrate this process, I will show you how I export all the PowerShell help and convert it to PDF files that I use instead of looking at the help files in the PowerShell Console. I realize you can view the PowerShell help files in the console and online. The purpose of this script is to demonstrate how I solved what I viewed as a problem for me. You may be just fine looking through the help files in the console or online, as with any PowerShell script anyone hacks together, your mileage may vary.

I have split the code into three functions: one to export the help files to text files, one to convert those text files to PDFs, and one to merge all the PDFs in a given folder into one PDF. This process requires the iTextSharp .NET library, which you can download from NuGet. This is probably one of the more basic implementations of the iTextSharp library; it has a lot of functionality that I didn’t need to accomplish my mission. You can find more technical information on iTextSharp on the project’s site.

The Problem

For me, I hate looking through the help files in the console… for a few reasons.

  1. By default, the font is too small for me to see well. Old people problems, I know; it is what it is. I find reading a PDF file much easier on the eye. I dump all the help files on a network-accessible share that I can quickly reach from any machine on my network.
  2. I can highlight, comment on, and bookmark the PDF files as needed. This may require the full version of Acrobat Pro or an equivalent, I’m not sure. My organization has Acrobat Pro so this isn’t an issue for me. Using highlighting, comments and bookmarks, I can make the help files more of a quick reference; it just works for me.
  3. I frequently work on servers that are not accessible to the larger internet, meaning I can’t update help files on those machines unless I do it manually. Instead, I keep these PDF files on an external hard drive that I carry around anyway.

I know the help files are updated occasionally, but not often enough to make much of a difference; I usually update my help files a couple of times a year. All that said, below is the code I use to make all the txt files and convert them to PDFs. I have also combined the help for each cmdlet in a module into a single PDF, which I have made available for download. Hit the downloads link and use them if you think you can get some benefit from them.

This script exports all the help for each cmdlet to the specified file path.
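A sketch of the export function; the function name and output path are my placeholders, and I’m exporting full help for every cmdlet on the machine:

```powershell
function Export-HelpToText {
    param ([string]$OutPath = 'C:\PSHelp')   # placeholder output folder

    # Make sure the destination folder exists
    New-Item -ItemType Directory -Path $OutPath -Force | Out-Null

    Get-Command -CommandType Cmdlet | ForEach-Object {
        # Full help (syntax, parameters, examples) for each cmdlet to its own txt
        Get-Help -Name $_.Name -Full |
            Out-File -FilePath (Join-Path $OutPath "$($_.Name).txt")
    }
}
```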


The script below will convert files to PDF.
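A minimal sketch of the conversion, assuming iTextSharp 5.x and a DLL path of your choosing; a monospaced font keeps the help files’ column alignment intact:

```powershell
# Load the iTextSharp library (placeholder path to the DLL)
Add-Type -Path 'C:\libs\itextsharp.dll'

function Convert-TextToPdf {
    param ([string]$TextFile, [string]$PdfFile)

    $doc    = New-Object iTextSharp.text.Document
    $stream = [System.IO.File]::Create($PdfFile)
    [iTextSharp.text.pdf.PdfWriter]::GetInstance($doc, $stream) | Out-Null

    $doc.Open()
    # Courier preserves the console help's fixed-width layout
    $font = [iTextSharp.text.FontFactory]::GetFont('Courier', 10)
    $text = Get-Content -Path $TextFile -Raw
    $doc.Add((New-Object iTextSharp.text.Paragraph($text, $font))) | Out-Null
    $doc.Close()   # closing the Document flushes and closes the output stream
}
```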


This script will merge all PDFs in a given folder into one PDF. You would think it would take a long time with many PDF files, but I created a 20,000-page PDF file in a second or so.
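The merge leans on iTextSharp’s PdfCopy class, which appends whole documents to an output file; a sketch, with the function name being my placeholder:

```powershell
function Merge-PdfFolder {
    param ([string]$Folder, [string]$OutFile)

    $doc    = New-Object iTextSharp.text.Document
    $stream = [System.IO.File]::Create($OutFile)
    $copy   = New-Object iTextSharp.text.pdf.PdfCopy($doc, $stream)
    $doc.Open()

    Get-ChildItem -Path $Folder -Filter '*.pdf' | ForEach-Object {
        $reader = New-Object iTextSharp.text.pdf.PdfReader($_.FullName)
        $copy.AddDocument($reader)   # append every page of this PDF
        $reader.Close()
    }

    $doc.Close()   # finalizes the merged PDF and closes the output file
}
```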


This script is not part of any of the functions above, but I figured I would include it. It loops through each folder recursively and merges all the PDF files in each folder.
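The recursive variant is a short wrapper; `Merge-PdfFolder` below is a placeholder name for whatever you called your merge function, and the root path is an assumption:

```powershell
# Walk the folder tree and produce one merged PDF per folder
Get-ChildItem -Path 'C:\PSHelp' -Directory -Recurse | ForEach-Object {
    $out = Join-Path $_.FullName ("{0}_Merged.pdf" -f $_.Name)
    Merge-PdfFolder -Folder $_.FullName -OutFile $out   # placeholder function name
}
```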


Automate Creating Lab Virtual Machines in Azure with PowerShell

As you may or may not know, I recently decommissioned my old Dell PowerEdge 1950 server that I used for a few lab virtual machines. While experimenting with PowerShell on these virtual machines, I have found myself in situations where it was easier to delete the virtual machines and re-create them than to troubleshoot something I fouled up in the Registry. After the second time rebuilding the lab VMs through the Azure website, I decided to script the process.

The script below takes input from a CSV file and creates a virtual machine in your Azure subscription for each row in the CSV file. My example creates two virtual machines, but you can obviously add as many as you need.
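A sketch of the loop, assuming the Az module is installed and you are already signed in with Connect-AzAccount; the CSV column names (Name, ResourceGroup, Location, Image, Size) and the CSV path are my assumptions:

```powershell
# Prompt once for the local admin credential used on every VM
$cred = Get-Credential

Import-Csv -Path 'C:\scripts\LabVMs.csv' | ForEach-Object {
    # One New-AzVM call per CSV row
    New-AzVM -ResourceGroupName $_.ResourceGroup `
             -Name              $_.Name `
             -Location          $_.Location `
             -Image             $_.Image `
             -Size              $_.Size `
             -Credential        $cred
}
```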

For a production environment in Azure, I would suggest snapshotting the virtual machines; there is a good write-up of the process available. In my case, for a lab, snapshots use more storage, which costs more $$.



Hey PowerShell… Text me if my Domain Admins Group changes

This is why I Love PowerShell… It’s simple, yet functional.

From an administrative perspective, I think we can all agree that any change to your Domain Admins group without your knowledge would be of interest to you. If you’re in a large organization with access to enterprise management tools, you probably have some widget that fires off a message to you or a group of people when a change is detected… or maybe you don’t.

If you’re an admin at a small business and maybe even some medium sized businesses, you may not have access to those enterprise management tools and widgets. Turns out, we can use PowerShell to monitor any group for us and notify us when a change occurs. It’s actually pretty simple.

You can even have PowerShell send you a text message… which is pretty cool.

I’m using the script to keep an eye on my Domain Admins group, but you could easily adapt it to monitor services or processes. For example, you might want to monitor your Exchange Server’s Transport service: if it stops for whatever reason, send yourself an email and a text message.

First, we have to get all the members of the Domain Admins group and export them to an XML file.
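The baseline export is one pipeline; the file path is a placeholder:

```powershell
# Snapshot the current group membership to an XML baseline
Get-ADGroupMember -Identity 'Domain Admins' |
    Select-Object -ExpandProperty SamAccountName |
    Export-Clixml -Path 'C:\scripts\DomainAdmins.xml'
```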

This is the script we’ll run on a schedule.
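A sketch of the scheduled script: compare the live membership to the baseline and, if anything changed, mail the diff to both a normal address and a carrier’s email-to-SMS gateway. The SMTP server, addresses, and file path are placeholders:

```powershell
$BaselinePath = 'C:\scripts\DomainAdmins.xml'
$baseline = Import-Clixml -Path $BaselinePath
$current  = Get-ADGroupMember -Identity 'Domain Admins' |
    Select-Object -ExpandProperty SamAccountName

# Any difference at all means the group changed
$diff = Compare-Object -ReferenceObject $baseline -DifferenceObject $current
if ($diff) {
    # Sending to a carrier's email-to-SMS gateway turns this into a text message
    Send-MailMessage -SmtpServer 'mail.contoso.local' `
        -From    'alerts@contoso.local' `
        -To      'admin@contoso.local', '5551234567@txt.att.net' `
        -Subject 'Domain Admins group changed!' `
        -Body    ($diff | Out-String)

    # Re-baseline so we only alert once per change
    $current | Export-Clixml -Path $BaselinePath
}
```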

These are the Action arguments for the scheduled task.
-NoLogo -NonInteractive -WindowStyle Hidden -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\AD_Audit.ps1"


Visual Studio Code and Visual Studio Team Services Integration

For whatever reason, I seem to be migrating most of my coding activity from the PowerShell ISE to Visual Studio Code. I’m not really sure why, other than the look and feel seems more pleasing than the ISE… to me, anyway. Anywho, I have wanted to hack out some sort of code backup and version control solution for all of my code for a while now, and moving everything to Visual Studio Code seems like the prime time to get that done as well.

I know nothing about Git or Visual Studio Team Services; I have zero experience with either. So, in order to get this set up, I found myself searching the YouTube for a tutorial… I didn’t find much. After a couple of hours of Googling and YouTube video watching, I finally got it to work. To save you the time and effort, I thought it would be good to make a video myself.