In this article I will show how to analyze logs using Elasticsearch, Logstash, and Kibana.
Prerequisites:
· Installation of Elasticsearch, Logstash, and Kibana is complete, as described in the previous article here.
Load log data into Elasticsearch using Logstash
We are going to write a Logstash configuration that reads data from an Apache log file.
Create a sample log file as shown below and save it at the root of the "D:" drive.
ApacheLogs.log
71.141.244.242 - kurt [18/May/2011:01:48:10 -0700] "GET /admin HTTP/1.1" 301 566 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3"
134.39.72.245 - - [18/May/2011:12:40:18 -0700] "GET /favicon.ico HTTP/1.1" 200 1189 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; InfoPath.2; .NET4.0C; .NET4.0E)"
Next, write the Logstash conf file shown below. Name it "logstash-apache.conf" and save it in the bin folder of the Logstash installation directory.
input {
  file {
    type => "apache"
    path => [ "D:/ApacheLogs.log" ]
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
The above configuration reads the Apache log file from the given path, parses each line into structured fields using the COMBINEDAPACHELOG grok pattern, and indexes the result into Elasticsearch.
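To get a feel for what the COMBINEDAPACHELOG pattern pulls out of each line, here is a rough Python sketch. The regex below is a simplified approximation of the grok pattern, not its exact definition, but it extracts the same kinds of fields from the sample log line above.

```python
import re

# Simplified approximation of Logstash's COMBINEDAPACHELOG grok pattern.
COMBINED = re.compile(
    r'(?P<clientip>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) (?P<httpversion>[^"]+)" '
    r'(?P<response>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# First record from the sample ApacheLogs.log file.
line = ('71.141.244.242 - kurt [18/May/2011:01:48:10 -0700] '
        '"GET /admin HTTP/1.1" 301 566 "-" '
        '"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.3) '
        'Gecko/20100401 Firefox/3.6.3"')

fields = COMBINED.match(line).groupdict()
print(fields['clientip'], fields['verb'], fields['request'], fields['response'])
# → 71.141.244.242 GET /admin 301
```

Logstash stores each of these named fields on the event, which is what makes them filterable and chartable in Kibana later.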
Now let's execute this configuration using Logstash and insert records into Elasticsearch.
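From the Logstash bin folder, the config can be run with a command along these lines (the exact invocation may vary slightly with your Logstash version and installation path):

```shell
# Run from the bin folder of the Logstash installation directory
logstash -f logstash-apache.conf
```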
Logstash will start reading from the file and inserting data into Elasticsearch.
Now go to the Google Chrome "Sense" extension and execute the following command to view the log data.
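A simple search like the one below should return the indexed log events. Note that by default Logstash writes to date-stamped indexes named logstash-YYYY.MM.DD, so the wildcard pattern used here is an assumption based on that default:

```
GET /logstash-*/_search
```

The same query can also be run from the command line with curl against http://localhost:9200 if you prefer not to use Sense.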
Now go to the Kibana dashboard and configure the index pattern and timestamp attribute as shown in the screenshot below.
Finally, go to the Visualize tab to create graphs as you like and save them. You can then view the saved visualizations in the Dashboard tab.