Monday, August 15, 2016

Combining DSC with ELK for effective infrastructure monitoring

DSC and event logs

DSC (Desired State Configuration) is the management platform in Windows PowerShell that enables deploying and managing configuration data for software services and managing the environment in which these services run.
DSC provides a set of Windows PowerShell language extensions, new Windows PowerShell cmdlets, and resources that you can use to declaratively specify how you want your software environment to be configured. It also provides a means to maintain and manage existing configurations. When a DSC configuration runs against a target system, it first checks whether the target already matches the desired state of each configured resource. If it doesn't, DSC brings the resource into that state.
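
To make the declarative style concrete, here is a minimal sketch of a DSC configuration. The node name, folder path, and configuration name are hypothetical examples, not part of any real environment:

```powershell
# Minimal DSC configuration: ensure a folder exists on the target node.
# "localhost" and C:\Demo are illustrative placeholders.
Configuration DemoConfig
{
    Node "localhost"
    {
        File DemoFolder
        {
            Ensure          = "Present"
            Type            = "Directory"
            DestinationPath = "C:\Demo"
        }
    }
}

# Compile the configuration to a MOF document and apply it
DemoConfig -OutputPath .\DemoConfig
Start-DscConfiguration -Path .\DemoConfig -Wait -Verbose
```

Each run of Start-DscConfiguration is what produces the event log entries we will be collecting below.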

But what if there is an error in the last DSC run? How can we get live monitoring and alerting for DSC? Well, that's what we'll be solving in this post. We'll use the combination of event logs and the ELK stack to create an infrastructure monitoring system for our environments. DSC logs every detail of its execution to the Windows event logs. These logs can be found in Event Viewer by navigating to the channel Applications and Services Logs -> Microsoft -> Windows -> Desired State Configuration.

We’ll use Winlogbeat in combination with Logstash to retrieve these logs and push them to an Elasticsearch instance. Later, using Kibana, we can create queries and visualizations to build a DSC dashboard for effective monitoring.

Installing and configuring ELK on Windows

Before starting the installation, we need to download the following software:

  • Elasticsearch
  • Logstash
  • Kibana
  • Winlogbeat

Download and extract each of these to its respective folder under a folder named “ELK”. I have some additional beats installed on my machine, but for this demo we only need Winlogbeat. We also need Java installed on the machine, since Elasticsearch requires it.

To run ELK, we need to start the services for Elasticsearch, Logstash, and Kibana. These services can be started by running the .bat files for each of them.


To install and configure Elasticsearch, navigate to the bin directory of elasticsearch and run the service.bat file with the argument “install”.
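
On my setup, the install and start steps look like the following sketch (the path depends on where you extracted Elasticsearch; service.bat is the service wrapper shipped with Elasticsearch 2.x):

```powershell
# Register Elasticsearch as a Windows service, then start it
cd .\elasticsearch\bin      # adjust to your extraction folder
.\service.bat install
.\service.bat start
```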

Once the installation is complete, you can test whether everything is working by invoking the Elasticsearch endpoint (by default http://localhost:9200). If everything is set up correctly, you should see a JSON result like:

  "name" : "Rebel",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.3.1",
    "build_hash" : "bd980929010aef404e7cb0843e61d0665269fc39",
    "build_timestamp" : "2016-04-04T12:25:05Z",
    "build_snapshot" : false,
    "lucene_version" : "5.5.0"
  "tagline" : "You Know, for Search"


We will make use of Winlogbeat to send events to Logstash. On receiving these events, Logstash will forward them to Elasticsearch using the elasticsearch output plugin.

To install the beats input plugin, run the logstash-plugin.bat file with the argument “install logstash-input-beats”.
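
As a command-line sketch (again, the path depends on where you extracted Logstash):

```powershell
# Install the beats input plugin for Logstash
cd .\logstash\bin           # adjust to your extraction folder
.\logstash-plugin.bat install logstash-input-beats
```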

The next step is to configure Logstash to listen on port 5044 for incoming Beats connections and index the events into Elasticsearch. You can do this by using a Logstash configuration file. Create a logstash.conf file in the logstash/bin directory with the contents below.

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ""
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Now you can start the logstash service with the configuration by passing this file name to the logstash.bat file.

.\logstash.bat -f logstash.conf


To send the DSC event logs to Logstash, we need to configure Winlogbeat to pull log information from the DSC operational channel. To do this, open the winlogbeat.yml file in the winlogbeat directory and add the lines for event_logs as given below.
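
A minimal event_logs entry for the DSC operational channel might look like this (channel name as it appears in Event Viewer; verify it against your winlogbeat.yml before use):

```yaml
winlogbeat:
  event_logs:
    - name: Microsoft-Windows-DSC/Operational
```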

Change the output to logstash instead of the default elasticsearch option.
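
Sketched in the Winlogbeat 1.x YAML layout, and assuming Logstash runs on the same machine on the port configured earlier, the output section would look roughly like this (comment out the elasticsearch output if it is present):

```yaml
output:
  logstash:
    hosts: ["localhost:5044"]
```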

Install the Winlogbeat service by running the install-service-winlogbeat.ps1 script in the same directory.
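
From an elevated PowerShell prompt in the winlogbeat directory, the install and start steps are roughly (the service name "winlogbeat" is what the shipped script registers, assuming defaults):

```powershell
# Register the Winlogbeat Windows service using the shipped script
.\install-service-winlogbeat.ps1

# Start the newly registered service
Start-Service winlogbeat
```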


Run the kibana.bat file in the bin directory under the kibana folder to start the service. Once the service is started, the Kibana site can be accessed in the browser (by default at http://localhost:5601).

You can configure the winlog index pattern by creating a new index pattern winlogbeat-* as given below. 

Once the index is created, you can now go ahead and create queries and visualizations from those queries for your DSC dashboard.

For example:

You can create a visualization for the logs per computer by following the steps below.
  • Click on the Discover tab and create a query with the fields computer name and log level. You can do this by adding the computer_name and level fields as given below.

  • On the right side, you should be able to see the filtered results with the column names now.
  • Next, save the search by choosing the save search option in the top bar and give it an appropriate name.

  • Next, click on the Visualize tab, add a new line chart, choose “From a saved search” as the source, and pick the search you saved.

  • Click on X-Axis and choose “Date Histogram” as the aggregation.

  • Click on Add sub-buckets, click on Split Lines, and choose “Terms” as the sub-aggregation with the field “computer_name”.

  • Click the play button to see the graph. Save the visualization with a proper name.
  • Now click on the dashboard and add this visualization on the dashboard to see the widget in action.
