Automate.Insights Part 2: Practical deep dive (standalone)

This is part 2 of a three-part blog series. To jump to either of the other posts, use the links below.

In my previous post, I talked in broad strokes about Automate.Insights and what its potential applications could be for a software team. Here, I am going to go a little more in depth on how it is used.

I decided to try spinning up an ELK (Elasticsearch, Logstash, Kibana) stack to show off the capabilities of Automate.Insights' tools. There is a strong community cookbook available, but for the simple purposes of this setup I decided to write my own standalone cookbooks. Let's get started.

Prerequisites 

  • Workstation with 12 GB RAM or greater (or adjust the memory allocation in the Vagrantfile)
  • ChefDK (0.6.0 or later)
  • Vagrant
  • VirtualBox
  • Basic understanding of Chef
  • A dummy chef-repo
  • The knife-topo plugin installed

 

For the sake of this walk-through, you can download the tutorial repo from our GitHub. I have included a Vagrantfile, along with a dummy knife.rb to use with a chef-zero server.
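For reference, a dummy knife.rb for chef-zero looks roughly like this (a sketch, not necessarily the repo's exact file; chef-zero accepts any client key, and 8889 is its default port):

 # knife.rb (sketch) -- point knife at the chef-zero server on the
 # host-only network address used throughout this post
 chef_server_url "http://10.0.1.1:8889"
 node_name       "dummy"
 client_key      "dummy.pem"
 cookbook_path   ["cookbooks", "berks-cookbooks"]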

First, you will want to spin up your Vagrant instance(s). This will spin up three Ubuntu 14.04 instances on your machine, each configured with a private IP address. You will only need the first server (ai-elkstack-1, IP address 10.0.1.2) for a standalone ELK stack, so you can specify just that one if you would like.

 vagrant up 
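If you only want the standalone server, name the machine explicitly:

 vagrant up ai-elkstack-1

For reference, a Vagrantfile along these lines will produce the three boxes described above (a sketch, not necessarily the repo's exact file; the per-VM memory value in particular is an assumption you may need to adjust):

 # Vagrantfile (sketch) -- three Ubuntu 14.04 boxes on a private network.
 # Three VMs at 4 GB each is where the 12 GB workstation guidance comes from.
 Vagrant.configure("2") do |config|
   config.vm.box = "ubuntu/trusty64"
   (1..3).each do |i|
     config.vm.define "ai-elkstack-#{i}" do |node|
       node.vm.network "private_network", ip: "10.0.1.#{i + 1}"
       node.vm.provider "virtualbox" do |vb|
         vb.memory = 4096  # assumed allocation; lower this on smaller machines
       end
     end
   end
 end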

Next, to spin up a chef-zero server on your workstation, simply run

 chef-zero -d -H10.0.1.1 

This will run a chef-zero server as a background process on your workstation. If you need to kill that chef-zero server later, run

 ps aux | grep chef-zero 

and kill the listed process ID for the chef-zero server.
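If your system has pkill, the same thing can be done in one step:

 pkill -f chef-zero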

Next, you will need to pull in your cookbook dependencies and upload them to the chef-server. I have included a Berksfile in the repo that will allow you to do this. To pull and upload, run

berks vendor
knife cookbook upload --all --cookbook-path berks-cookbooks/

This should give you all the cookbooks that you need. 
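For the curious, a Berksfile along these lines is all it takes (a sketch; the cookbook names come from the run list used later in this post, and the paths are assumptions about the repo layout):

 # Berksfile (sketch) -- resolve the standalone cookbooks plus any
 # community dependencies declared in their metadata
 source "https://supermarket.chef.io"

 cookbook "df_java",          path: "cookbooks/df_java"
 cookbook "df_elasticsearch", path: "cookbooks/df_elasticsearch"
 cookbook "df_kibana",        path: "cookbooks/df_kibana"
 cookbook "df_logstash",      path: "cookbooks/df_logstash"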

If you try to run knife cookbook upload without specifying the cookbook path, you will get several errors saying that the cookbooks cannot be found. This is because the default cookbook path is cookbooks/, and the vendored dependencies don't exist there. If you follow the procedure above, you will not encounter these errors.

Standalone Server

For this, we are going to show an example of an ELK server piping syslogs to Elasticsearch for processing.

First, you will want to spin up an ELK stack using the cookbooks from our dummy repo.

knife bootstrap 10.0.1.2 -x vagrant -P vagrant --sudo -N df_box_elkstack --bootstrap-version 12.0.3 -r "recipe[df_java],recipe[df_elasticsearch],recipe[df_kibana],recipe[df_kibana::kibana_nginx],recipe[df_logstash],recipe[df_logstash::logstash_forwarder]"

This run list is

  • installing Elasticsearch, Logstash, and Kibana from packages
  • installing nginx as a reverse proxy for the Kibana UI
  • installing and configuring logstash-forwarder to ship syslogs to Logstash (its config is sketched just after this list)
  • parsing the forwarded logs on the Logstash side using regular expressions
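The forwarder itself just ships log lines over TLS. Its config, which the df_logstash::logstash_forwarder recipe templates out, is shaped roughly like this (a sketch; the port, certificate path, and file list are assumptions, not the repo's exact values):

 {
   "network": {
     "servers": [ "localhost:5043" ],
     "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
     "timeout": 15
   },
   "files": [
     { "paths": [ "/var/log/syslog", "/var/log/auth.log" ],
       "fields": { "type": "syslog" } }
   ]
 }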

Note 1: There have been some issues with package installation timing out (logstash and java). You simply need to rerun the bootstrap command if it fails.

Note 2: There can be an issue on bootstrap if you already have entries for the private IP addresses in your ~/.ssh/known_hosts file. If you receive an error similar to the following on bootstrap, delete the offending entries from your known_hosts file.

 
Connecting to 10.0.1.2
ERROR: Net::SSH::HostKeyMismatch: fingerprint 90:00:17:0e:a6:79:5a:0f:aa:97:ae:43:61:67:cb:07 does not match for "10.0.1.2"
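The quickest way to clear a stale entry is with ssh-keygen:

 ssh-keygen -R 10.0.1.2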

When you have successfully converged your node, simply type 10.0.1.2 into your browser to verify that everything is working.

(Kibana 4 general UI look. You can see the Time-field dropdown showing it is connected to Logstash.) 

 

To see visualizations, simply choose the @timestamp field in the Time-field drop-down, then click the Discover tab. There, you will see a visualized log of activity.

Now that we know there is a working ELK stack server, you will want to upload it to the Automate.Insights UI, which turns your existing infrastructure into a parsable format. Run

knife topo export df_box_elkstack --topo elkstack > elkstack.json

This in turn

  • reads the node data from the chef-server
  • names your node topology elkstack
  • exports all the data in an Automate.Insights-friendly JSON format (sketched below)
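If you open elkstack.json, the exported topology is shaped roughly like this (a sketch; the exact fields depend on your knife-topo version, but the node name and run list are the ones we bootstrapped above):

 {
   "name": "elkstack",
   "nodes": [
     {
       "name": "df_box_elkstack",
       "run_list": [
         "recipe[df_java]",
         "recipe[df_elasticsearch]",
         "recipe[df_kibana]",
         "recipe[df_kibana::kibana_nginx]",
         "recipe[df_logstash]",
         "recipe[df_logstash::logstash_forwarder]"
       ],
       "normal": { }
     }
   ]
 }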

Once this is done you can upload it to the Automate.Insights server. 

First, create a new business system. We will call this one df_elkblog, but you can call it whatever you'd like.

Then, you will want to select that business system and add a new blueprint.

Next, click the "Prime topology from Chef" option, and load the JSON file that you just exported.

Going through the various loading options, you can make adjustments to your topology as you see fit, including the software types, node names, and attributes that you want available for adjustment.

(You have a chance to define which recipes make up which software, for a more human-readable format.)

(Selecting the node name is as straightforward as it gets; it's a nice feature.)

(This page is for adjusting software versions. Our cookbooks are simple and don't expose this as an option.)

(It is wise to select which attributes you want more immediate access to and which will change quickly. You can search for the ones you want very easily in the UI.)

(Example of a completed field.)

 

Then voilà, we have a software system with a fully constructed run list, with attributes, that you can export back to Chef when ready. To do so, input the host node that you intend to push this topology to.

(Please note! I have the OpenSSL certs for the server specifically tied to the 10.0.1.2 IP address. If you do not pick that host IP, the TLS handshake will not work and your installation will fail.)

Next, you will click Export to Chef and copy the result to a JSON file in your dummy chef-repo. We decided to name it elkstack_standalone.json.

(Ignore the cluster name; this is just a dummy image.)

Once that has downloaded, it is time to import your topology into a format that your chef-server can work with. Run

knife topo import elkstack_standalone.json

Note: I have stored a copy of a working topology file at topologies/elkstack_standalone.json; if yours somehow does not work properly, you can just use that.

Behind the scenes, knife-topo will create

  • A topology data bag containing the topology data
  • A topology cookbook containing the node attributes (both are easy to verify, as shown below)
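You can sanity-check both against your chef-zero server with standard knife commands (the "topologies" data bag name is knife-topo's convention; treat the exact names as assumptions that may vary by version):

 knife data bag list               # the topology data bag
 knife data bag show topologies    # should list the elkstack item
 knife cookbook list               # the generated topology cookbook appears here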

With a freshly created Vagrant instance, we can now just run

knife topo create elkstack --bootstrap --sudo -xvagrant -Pvagrant

(Note: There have been some issues with the logstash package installation timing out. You simply need to rerun the last command and confirm when it prompts you.)

And there you go. Using the topology cookbook, we are able to spin up an exact copy of the ELK stack server we already created. You can validate it by following the previous steps in Kibana.

In the next segment, we will build on a more real-world example with multiple nodes. Click here to jump to that tutorial.