

"the swiss-army knife for logs."

ELK stack = Elasticsearch, Logstash and Kibana


  • logstash-1.4.0 <== central log facility to collect and process events ==> get started
  • elasticsearch-1.0.1 <== log search engine (based on Apache Lucene) and persistent log data storage <== can be integrated with back-ends like Hadoop etc...
  • Kibana <== log visualisation
  • Lumberjack (now called logstash-forwarder, now called Filebeat) OR Redis <== (optional) log data broker (ships events)
  • Filebeat: encryption and client-certificate authentication optional; documentation.
  • more Beats (e.g. Metricbeat, Packetbeat)
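The optional encryption and client-certificate authentication mentioned for Filebeat map to TLS settings on the receiving Logstash `beats` input. A minimal receiving-side sketch (port, certificate paths and verify mode are assumptions, not from this page):

```
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash.crt"   # assumed path
    ssl_key => "/etc/pki/tls/private/logstash.key"         # assumed path
    ssl_verify_mode => "force_peer"                        # require client certificates
  }
}
```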

howto: log shipment

  • ship them yourself and import into Logstash using a file watch
  • rsync over ssh to a file which Logstash watches (rsync's update feature will append new lines to the file, or should...)
  • Lumberjack (over SSL)
  • other SSL options using: amqp, stomp
  • logtail - Perl, from flat files to Redis
  • syslog/rsyslog
  • Logstash-to-Logstash forwarding ( => "hostname-of-centralized-log-server"). Logstash does the job as a log shipper, but you might consider replacing it with Lumberjack / Logstash Forwarder
  • Woodchuck

etc... (see log-shippers)
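For the Logstash-to-Logstash forwarding option above, a sketch of the shipping node's output using the `lumberjack` output plugin (port number and certificate path are invented placeholders):

```
output {
  lumberjack {
    hosts => ["hostname-of-centralized-log-server"]
    port => 5043
    ssl_certificate => "/etc/ssl/certs/lumberjack.crt"  # placeholder path
  }
}
```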

  • see also NXLOG as a syslog receiver outputting JSON to Logstash (reduces load and avoids UDP drops)
  • scaling ELK: HAProxy.
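As a sketch of the HAProxy scaling idea, a TCP front-end that round-robins shippers across two Logstash nodes (addresses and port are made up for illustration):

```
listen logstash
    bind *:5044
    mode tcp
    balance roundrobin
    server logstash1 192.0.2.10:5044 check
    server logstash2 192.0.2.11:5044 check
```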

About Redis

Redis (REmote DIctionary Server) is a networked, in-memory, key-value data store with optional durability. "Redis does not support encryption. In order to implement setups where trusted parties can access a Redis instance over the internet or other untrusted networks, an additional layer of protection should be implemented, such as an SSL proxy."
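Using Redis as the broker between shippers and the central Logstash works as a list: the shipper pushes events onto a Redis list and the indexer pops them off. A minimal sketch with the `redis` output and input plugins (host name and key are assumptions):

```
# shipper node
output {
  redis {
    host => "redis.example.com"   # assumed broker host
    data_type => "list"
    key => "logstash"
  }
}

# central/indexer node
input {
  redis {
    host => "redis.example.com"
    data_type => "list"
    key => "logstash"
  }
}
```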

logstash configuration example

Made up of three parts:

  • a file input that follows the log
  • a grok filter that parses its contents into a structured event
  • an output, e.g. to elasticsearch_http. Furthermore you can use Kibana or its native UI to explore those logs.


input {
  file {
    path => "/var/log/apache.log"
    type => "apache-access"
    start_position => "beginning"
  }
}

filter {
  if [type] == "apache-access" {
    grok {
      match => [ "message", "%{COMBINEDAPACHELOG}" ]
    }
  }
}

output {
  elasticsearch_http {
    host => "servername"
    port => 80
    index => "app token"
  }
}

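To see what the grok filter above produces, here is a rough Python approximation of the %{COMBINEDAPACHELOG} pattern (the real grok pattern is more permissive; this simplified regex and the sample log line are just for illustration):

```python
import re

# Simplified stand-in for Logstash's COMBINEDAPACHELOG grok pattern;
# each named group becomes a field of the structured event.
PATTERN = re.compile(
    r'(?P<clientip>\S+) \S+ (?P<auth>\S+) \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) HTTP/(?P<httpversion>\S+)" '
    r'(?P<response>\d{3}) (?P<bytes>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
        '"GET /apache_pb.gif HTTP/1.0" 200 2326 '
        '"http://www.example.com/start.html" "Mozilla/4.08"')

event = PATTERN.match(line).groupdict()
print(event["clientip"], event["verb"], event["response"])  # → 127.0.0.1 GET 200
```

In Logstash these fields (clientip, verb, response, ...) end up on the event and become searchable in Elasticsearch/Kibana.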

rough notes


Example log shipment


Page last modified on January 13, 2016, at 03:46 AM