PowerShell System Monitoring Part 3: Server Status Report

If you’ve been following this series, then you know that we’re on a mission to create a poor man’s monitoring system from just PowerShell scripts and a web hosting engine. So far we’ve created HTML pages that show the warnings and errors from a group of servers’ event logs. We’ve also made a report that displays all of the server services set to Automatic but not running.

The scripts that we’ve written so far use parameters to specify their output file paths and to take in a list of servers to scan. Later in the series, these options will help us create logical groups of systems that make our monitoring system easy to use. For example, all Exchange servers or all Active Directory servers will be grouped together on separate pages. One of the scripts that we will write will create the lists of servers that belong in each group. Those groups, in turn, will be used to feed the scripts that collect our data and create the HTML reports. It sounds confusing, and it is; that’s why I’m writing about it one part at a time. It will all come together in the end, though.
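To make the grouping idea concrete, here’s a rough sketch of how one of those parameterized collection scripts might be called once per group. The script name, parameter names, and paths below are hypothetical placeholders for illustration, not the actual scripts from the earlier posts:

# Hypothetical illustration only: the script name, parameters, and paths
# are placeholders, not the real scripts from earlier in this series.
.\Get-ServiceReport.ps1 -ServerList 'C:\Monitor\Lists\ExchangeServers.txt' -OutputPath 'C:\inetpub\wwwroot\monitor\exchange'
.\Get-ServiceReport.ps1 -ServerList 'C:\Monitor\Lists\ADServers.txt' -OutputPath 'C:\inetpub\wwwroot\monitor\ad'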

Connectivity, CPU load, and memory and storage consumption are the basic metrics required for any monitoring system worth its salt. I could sit here and bang out the code needed to pull the data we’re after from WMI/CIM and output it to HTML easily enough. One of the great things about PowerShell, though, is that it’s ubiquitous and you don’t always have to reinvent the wheel yourself. In this case, somebody else has already cranked out 99% of the code we need, so we’ll just modify theirs.

Mike Galvin, Dan Price, and Bhavik Solanki have written a wonderful PowerShell script called Windows Server Status Monitor. It uses WMI to pull CPU, memory, storage, and online status information from a list of servers and can display the results as a color-coded HTML file. It also includes an alerting feature with emailed results, and it can run once or continuously. You can download the script from Mike’s blog, GitHub, or the PowerShell Gallery.

My personal project requires trending data: I need to see these performance metrics over time. Mike’s script can output a CSV file instead of HTML, but it deletes the files it creates each time it runs. With just a little tweak to the CSV output filename, we can ensure each file is unique and therefore not deleted. If you need trending as well, make this change. Under the comment ## Setting the location of the report output, find $OutputFile and set the variable to something like:

$OutputFile = "$OutputPath\WinServ-Status-Report" + "_" + (Get-Date -Format M-dd-yyyy-h-m) + ".csv"
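If you want to see what the timestamp portion will look like before editing the script, you can run the Get-Date piece on its own in any PowerShell session:

Get-Date -Format M-dd-yyyy-h-m    # returns a string like 6-05-2025-2-37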

This change will create a CSV file that includes the date, hour, and minute in the filename. You could also include seconds or even milliseconds if you’re going to run this in a continuous loop. Please be aware that you’ll need to monitor the folders you are outputting these files to. If you’re running WinServ-Status in a 5-minute loop against 20 groups of systems, you’ll be making 240 CSV files per hour, or 5,760 per day. The files are small, but they’ll add up over time.
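A scheduled cleanup can keep that growth under control. Here’s a minimal sketch, assuming the CSV reports land in C:\Monitor\Reports (an assumed path; swap in your own output folder) and that 30 days of history is enough:

# Minimal cleanup sketch: delete CSV reports older than 30 days.
# 'C:\Monitor\Reports' is an assumed path; point this at your own output folder.
Get-ChildItem -Path 'C:\Monitor\Reports' -Filter 'WinServ-Status-Report_*.csv' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item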

When we put all of this together a few posts down the line, you’ll see that we run WinServ-Status.ps1 twice: once to create the HTML file and again to make a CSV. As with the other scripts in this series, you’ll want to save them to C:\Program Files\WindowsPowerShell\Scripts\ to make them easier to run as a scheduled task or to call from any PowerShell session. If the “Scripts” folder isn’t in that path, just make it yourself. If you install the WinServ-Status script from the PowerShell Gallery or from GitHub, it will end up in that folder by default.
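As a preview, those two runs might look something like the lines below. The -List, -O, and -csv parameter names reflect my reading of the WinServ-Status documentation, so verify them against the version you download:

# Preview sketch: parameter names are based on my reading of the
# WinServ-Status documentation; verify against the version you download.
# First run: generate the HTML status page.
& 'C:\Program Files\WindowsPowerShell\Scripts\WinServ-Status.ps1' -List 'C:\Monitor\Lists\ExchangeServers.txt' -O 'C:\inetpub\wwwroot\monitor\exchange'
# Second run: generate the CSV for trending (using the filename tweak above).
& 'C:\Program Files\WindowsPowerShell\Scripts\WinServ-Status.ps1' -List 'C:\Monitor\Lists\ExchangeServers.txt' -O 'C:\Monitor\Reports' -csv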

In the next article in this series we’ll build our lists of servers to run the data collection scripts against. After that, we’ll get IIS set up to host the HTML reports we are creating, followed by setting up the scheduled tasks that run everything, and finally we’ll look at how to display all of this data in a sensible manner.