Digital Transformation Specialists


Part Two - Measuring our Digital Transformation using Splunk

Welcome to part two of "Measuring our Digital Transformation using Splunk" series. 

In part one, I set the stage for what digital transformation means and how we could get started, using the example of collecting and analyzing eNPS scores to measure corporate culture. We performed all of this analysis and measurement through the Splunk platform.

In part two, I would like to go a little deeper and describe another transformational use case: regularly measuring the level of Cloud automation over time. I will then demonstrate how Splunk can easily be configured with a scripted input to measure this use case and serve as your portal for all your Business Intelligence requirements.

What value is there in measuring Cloud automation?

When creating Cloud resources, many I.T. departments manually create them through the available portal services of their Cloud provider, such as the Azure Portal, the AWS Management Console or the Google Cloud Platform console.

The rationale I.T. Administrators give for manually creating Cloud resources is that it lets them better visualize and control the available configuration options as they build out corporate Cloud services such as networks, virtual machines and security groups.

However, by investing in more platform education and taking advantage of Cloud resource automation, we can achieve many benefits, including:

  1. Simulation of production scenarios. Automation allows I.T. Administrators to rapidly deploy and decommission isolated production environments for the purposes of testing.

  2. Rapid scale-out. Automation allows for the quick scale-out of infrastructure as necessary based on service demand, which can be challenging to perform if resources are manually created.

  3. Environment Standardization. Automation allows for a standardization of infrastructure builds for predictable performance and operational management.

  4. Foundation for self-service functionality. Building automation of Cloud resources can serve as the foundation for higher-level self-service orchestration for infrastructure deployment.

  5. Reduced I.T. operational overhead. Automation gives I.T. Administrators more time to focus on other I.T. services that offer higher business value, such as I.T. service intelligence and insights.

There are even more automation benefits if we expand the context to include cloud-native application deployments using web app and container architectures. However, for the purposes of this blog, I will only focus on automation advantages as they relate to typical cloud virtual machine legacy workloads.

How can we measure Cloud automation using Splunk?

For virtual machine-based workloads provisioned in Microsoft Azure, we can assign tags to all the resources we create, such as Resource Groups and Virtual Machines. Attaching tags to resources gives us metadata to logically organize them into a taxonomy for categorized searches. The following steps illustrate how:

1) Install Azure CLI on your Splunk server

The first thing we need to do is install Azure CLI on the Splunk search head. In my case, I am running Splunk on Ubuntu Linux, but Azure CLI also supports Windows, macOS and many other flavours of Linux.

Please refer to Microsoft's Azure CLI installation documentation, which quickly walks you through installing Azure CLI for your platform and authorizing it to connect to your Azure subscription.
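On Ubuntu, the installation and authorization boil down to two commands. This is a sketch based on Microsoft's documented Debian/Ubuntu installer; check the documentation for your own platform:

```
# Microsoft's one-line installer script for Debian/Ubuntu.
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash

# Authorize the CLI against your Azure subscription.
az login
```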
With Azure CLI, we can script the creation of Azure resources and ensure that appropriate tags are assigned. The following are sample commands for creating an Azure resource group named webappgrp01-rg with the tag "auto-script=Yes."

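A minimal sketch of such commands; the location, VM name and image below are illustrative assumptions, not values from the original post:

```
# Create a tagged resource group (location is an assumption;
# adjust to your region).
az group create \
  --name webappgrp01-rg \
  --location eastus \
  --tags "auto-script=Yes"

# The same tag can be applied to resources created inside the
# group, for example a virtual machine.
az vm create \
  --resource-group webappgrp01-rg \
  --name webapp01-vm \
  --image Ubuntu2204 \
  --tags "auto-script=Yes"
```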

2) Installing the Resource Graph extension

Now that we have installed the Azure CLI base and built our scripts to create Azure resources, we need to install the Resource Graph extension. We use the Resource Graph API to query Azure for the tags assigned to each resource.

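Installing the extension is a one-liner:

```
az extension add --name resource-graph
```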

3) Query Azure for Resources Tags using Azure CLI

We can now run a command to query our Azure subscription for all resources and their tags:

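One possible form of that query; the projected columns are an assumption, tuned to what we need for the tag analysis:

```
az graph query -q "Resources | project name, type, tags"
```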

The output will be in JSON format, which we will need to tell Splunk about later when we configure it to run this command as a scripted input.

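Illustratively, the records look something like the following (the resource names are made up, and the exact envelope varies by CLI version; newer versions wrap the results in a "data" object):

```json
[
  {
    "name": "webappgrp01-rg",
    "type": "microsoft.resources/resourcegroups",
    "tags": { "auto-script": "Yes" }
  },
  {
    "name": "legacy01-vm",
    "type": "microsoft.compute/virtualmachines",
    "tags": null
  }
]
```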

The next step is to create a bash script for the command, making sure that it runs as an appropriately privileged user via the su command. (This works because, in our environment, Splunk runs as root.)

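A sketch of such a script; "azureuser" is an assumed account name for a user who has already run az login:

```
#!/bin/bash
# azureAutoScript.sh - run the Resource Graph query as a user
# that is authorized with Azure CLI ("azureuser" is an assumption).
su - azureuser -c 'az graph query -q "Resources | project name, type, tags" --output json'
```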

We will call our script "azureAutoScript.sh." Make sure to test this bash script by running it on the command-line. If all works out, then proceed to the next step.

4) Configure Splunk for scripted inputs

We now need to configure Splunk to regularly monitor/run this script and index the data that it outputs. We do this by setting up a scripted input.


  • Create a directory for the app under: $SPLUNK_HOME/etc/apps/<appNAME> (in this case we will call our app "azureAutoCheck")

  • Create a bin folder under your app directory:

  • Copy your bash script (azureAutoScript.sh) to the bin directory:

  • Create a default folder under your app directory, and create an inputs.conf file in it that defines the scripted input:
  • Create a local folder under your app directory:

  • Create a props.conf file in the local directory:

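The steps above can be sketched as a single shell session. The interval, index and sourcetype values in inputs.conf are assumptions; adjust them to your environment:

```shell
#!/bin/bash
# Sketch of the Splunk app layout for the scripted input.
# Assumes $SPLUNK_HOME points at your Splunk install (defaulting
# here to /tmp/splunk for a dry run); the app name "azureAutoCheck"
# and script name "azureAutoScript.sh" follow this blog's example.
SPLUNK_HOME="${SPLUNK_HOME:-/tmp/splunk}"
APP="$SPLUNK_HOME/etc/apps/azureAutoCheck"

mkdir -p "$APP/bin" "$APP/default" "$APP/local"

# Copy the query script into the app's bin directory
# (skipped silently if the script is not in the current directory).
cp azureAutoScript.sh "$APP/bin/" 2>/dev/null || true

# Scripted input: run the script every hour (3600 s) and tell
# Splunk the output is JSON via the sourcetype.
cat > "$APP/default/inputs.conf" <<'EOF'
[script://./bin/azureAutoScript.sh]
interval = 3600
sourcetype = _json
index = main
disabled = 0
EOF

# props.conf: timestamp each event at index time, since the JSON
# produced by the script carries no time field of its own.
cat > "$APP/local/props.conf" <<'EOF'
[_json]
DATETIME_CONFIG = CURRENT
EOF
```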

The DATETIME_CONFIG = CURRENT setting tells Splunk to assign the current date and time to each event as it is ingested. This is necessary because our bash script does not include any time data in its JSON output.

5) Restart Splunk and search for your source information:

Restart Splunk: 
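A restart picks up the new app and its scripted input (the path assumes a standard $SPLUNK_HOME install):

```
$SPLUNK_HOME/bin/splunk restart
```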

And start exploring our data:
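A simple search confirms the events are arriving. The index and sourcetype names are assumptions; use whatever you set in inputs.conf:

```
index=main sourcetype=_json
```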

6) Build your data using Splunk's Search Processing Language (SPL)

We can now use Splunk SPL to search to find all resources with the tag named "auto-script" and transform the data into a tabular format:
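One possible shape of this search; the field name tags.auto-script assumes Splunk auto-extracted the nested JSON tags:

```
index=main sourcetype=_json
| spath
| table name, type, tags.auto-script
```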

Next, we fill out all the NULL values of the auto-script field with "No" and count all the "Yes" and "No" based on the time they were ingested:
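A sketch of that stage, continuing the same assumed field names; fillnull turns untagged (NULL) resources into "No", and timechart counts each value per day:

```
index=main sourcetype=_json
| rename tags.auto-script AS auto_script
| fillnull value="No" auto_script
| timechart span=1d count BY auto_script
```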

We are now able to calculate percentages for each "Yes" and "No":
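Extending the same pipeline, an eval over the timechart columns yields the percentages:

```
index=main sourcetype=_json
| rename tags.auto-script AS auto_script
| fillnull value="No" auto_script
| timechart span=1d count BY auto_script
| eval total='Yes' + 'No'
| eval pct_yes=round('Yes' / total * 100, 1)
| eval pct_no=round('No' / total * 100, 1)
```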

Finally, we rename fields and display the percentage of resources that were created through automation over time:
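The final stage is a rename and table over the computed fields; the display names here are illustrative:

```
index=main sourcetype=_json
| rename tags.auto-script AS auto_script
| fillnull value="No" auto_script
| timechart span=1d count BY auto_script
| eval total='Yes' + 'No'
| eval pct_yes=round('Yes' / total * 100, 1)
| eval pct_no=round('No' / total * 100, 1)
| rename pct_yes AS "% Automated", pct_no AS "% Manual"
| table _time, "% Automated", "% Manual"
```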

7) Create Dashboard B.I. Reports based on data

Creating visualizations now becomes effortless. By clicking on the Visualization tab, we can select a Splunk Gauge Visualization with defined thresholds to visualize our data. (You will need to install the Number Display Viz app from Splunkbase.)

Further format customizations allow us to add Sparklines (or trend lines) under the gauge plus many other formatting requirements:

Now we create a final visualization which we can save as a dashboard.

Define KPIs and ROIs Metrics

We can use these same procedures to create KPI thresholds and ROI calculations, which can also be placed on the dashboard for consumption by executive leadership.

Conclusion

What we have attempted to demonstrate in this blog is the versatility of the Splunk platform as a configurable portal for all your data ingestion, manipulation and Business Intelligence requirements. 

And by using key features such as add-on apps and customized scripted inputs, we have also shown that Splunk use cases are limited only by the business user's imagination.

In our particular example, we used Splunk and defined KPIs to measure I.T.'s ability to automate Azure Cloud resources. This use case was one of the metrics we used to gauge our digital transformation progress, which we can then use to report back to executive leadership.

If you have any questions or want to talk further about how PiiVOT can help your organization with digital transformation or have questions about what Splunk can do for you, please reach out to me at PiiVOT, or to your local Splunk representative.