Friday 18 September 2015

Developer’s Guide to Installing Elasticsearch, Logstash and Kibana


In this post I will talk about how to perform log analytics using Elasticsearch, Logstash and Kibana. To start with, we will see how to install these tools on Windows.

Prerequisites:
  1. elasticsearch-1.4.4
  2. kibana-4.0.1-windows
  3. logstash-1.5.0.rc2


Install Elasticsearch on Windows
Elasticsearch is a search engine platform which lets us store documents in an indexed form and provides APIs for full-text search. In recent times it has become very popular in the developer community because it is open source, scalable and easy to use.

Installing Elasticsearch is very easy; here are the steps.

For this demo, we are going to use “elasticsearch-1.4.4”. Unzip and extract the content to a suitable directory.
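To start Elasticsearch, run the batch file from the bin folder of the extracted directory (the D:\ELK path below is just an example location; adjust it to wherever you extracted the archive):

>cd D:\ELK\elasticsearch-1.4.4
>bin\elasticsearch.bat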



This will start the Elasticsearch service at http://localhost:9200.
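To verify, open http://localhost:9200 in a browser. Elasticsearch answers with a small JSON status document similar to the one below (the node name is assigned randomly and the exact fields can vary by build):

{
  "status" : 200,
  "name" : "<node name>",
  "version" : {
    "number" : "1.4.4",
    ...
  },
  "tagline" : "You Know, for Search"
}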

Install Logstash on Windows

Logstash is a useful utility when it comes to working with logs. It has built-in features to read from various file formats and perform operations on the data. One of its best features is that you can read logs in a known format (e.g. Apache logs, syslog, etc.) and push them into Elasticsearch.
Unzip the downloaded “logstash-1.5.0.rc2” in any folder.

To be able to run Logstash from any directory, add its bin folder to the PATH environment variable.

>set LOGSTASH_HOME=D:\ELK\logstash-1.5.0.rc2
>set PATH=%PATH%;D:\ELK\logstash-1.5.0.rc2\bin

And that's it, Logstash is ready to use.
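As a quick smoke test (a minimal sketch using the classic stdin-to-stdout pipeline), run the command below, type a line such as hello world and Logstash should echo it back as an event; press Ctrl+C to stop:

>logstash -e "input { stdin { } } output { stdout { } }"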



Install Kibana 4 on Windows

Kibana is a browser-based visualization tool which allows us to create beautiful dashboards and reports on top of Elasticsearch data.

Here we are going to use “kibana-4.0.1-windows” as it is compatible with the Elasticsearch release we are using.
Prior to Kibana 4 we needed to have a separate web server running, but Kibana 4 comes with one embedded.
Unzip the “kibana-4.0.1-windows” file to any location.

Kibana configuration is very easy: simply edit config/kibana.yml to point it at the Elasticsearch URL and you are done.

Open config/kibana.yml and update property elasticsearch_url: "http://localhost:9200".
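The relevant portion of config/kibana.yml then looks like the snippet below (port and host are shown with their 4.0.1 defaults; normally only elasticsearch_url needs to change):

# config/kibana.yml
port: 5601
host: "0.0.0.0"
elasticsearch_url: "http://localhost:9200"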

To start Kibana, execute
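(The D:\ELK path below is just an example location; adjust it to wherever you extracted Kibana.)

>cd D:\ELK\kibana-4.0.1-windows
>bin\kibana.bat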

 

The server gets started and you can see the GUI at http://localhost:5601/



Developer’s Guide to Analysing Logs Using Elasticsearch, Logstash and Kibana


In this post I will talk about how to analyse logs using Elasticsearch, Logstash and Kibana.


Prerequisites:
· Installation of Elasticsearch, Logstash and Kibana is complete as per the previous article above.


Load logs data into Elasticsearch using Logstash
We are going to write a Logstash configuration which reads data from an Apache log file.


Create a sample log file as below and save it in the “D:” directory.

ApacheLogs.log
71.141.244.242 - kurt [18/May/2011:01:48:10 -0700] "GET /admin HTTP/1.1" 301 566 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3"
134.39.72.245 - - [18/May/2011:12:40:18 -0700] "GET /favicon.ico HTTP/1.1" 200 1189 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; InfoPath.2; .NET4.0C; .NET4.0E)"


The next thing is to write the Logstash conf file as shown below.

Name this file "logstash-apache.conf" and save it in the bin folder of the Logstash installation.


input {
  # Read the Apache log file from the beginning instead of tailing only new lines
  file {
    type => "apache"
    path => [ "D:/ApacheLogs.log" ]
    start_position => "beginning"
  }
}

filter {
  # Parse each line with the combined Apache log pattern into structured fields
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Use the timestamp from the log line as the event's @timestamp
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  # Index the parsed events into Elasticsearch running on localhost
  elasticsearch { host => localhost }
  # Also print each event to the console for debugging
  stdout { codec => rubydebug }
}


The above configuration reads the Apache log file from the given path, parses each line into the Apache combined log fields and adds the resulting events to Elasticsearch.

Now let's execute this config using Logstash and insert records into Elasticsearch.
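Run the following from the bin folder of the Logstash installation (this assumes the config file was saved there, as described above):

>cd D:\ELK\logstash-1.5.0.rc2\bin
>logstash agent -f logstash-apache.conf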



Logstash will now start reading from the file and inserting the data into Elasticsearch.

Now go to the Google Chrome "Sense" extension and execute the following query to view the log data indexed above.
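For example, the following search request returns the documents indexed by Logstash (logstash-* matches the daily indices that the elasticsearch output creates by default):

GET logstash-*/_search
{
  "query": {
    "match_all": {}
  }
}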

Now go to the Kibana dashboard and configure the indices and the timestamp attribute: on the Settings tab, set the index pattern (typically "logstash-*") and choose "@timestamp" as the time-field name.


Now simply go to the Visualize tab to create the graphs you like and save them. Then view the saved visualizations on the Dashboard tab.