
Logstash overview in Elasticsearch

Introduction

Logstash collects data from many sources, transforms it, and sends it on for analysis. It turns messy, inconsistent data into clean, structured events that are ready to use.

You want to gather logs from different servers into one place.
You need to change or clean data before saving it.
You want to send data to Elasticsearch for searching.
You want to combine data from many sources into one stream.
You want to monitor your systems by analyzing collected data.
Syntax
input {
  plugin_name {
    # plugin settings
  }
}

filter {
  plugin_name {
    # plugin settings
  }
}

output {
  plugin_name {
    # plugin settings
  }
}

A Logstash pipeline has three main sections: input to collect data, filter to transform it, and output to send it on.

Each part uses plugins that do specific jobs like reading files or sending data to Elasticsearch.
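You can try a small pipeline without writing a config file at all: Logstash accepts the configuration on the command line with the -e flag. A minimal sketch, assuming Logstash is installed and you run the command from its install directory:

bin/logstash -e 'input { stdin {} } output { stdout {} }'

Type a line and press Enter; Logstash prints it back as a structured event. Press Ctrl+C to stop.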

Examples
This example reads a system log file and prints the data to the screen in a readable format.
input {
  file {
    path => "/var/log/syslog"
    start_position => "beginning"
  }
}

output {
  stdout {
    codec => rubydebug
  }
}
This example gets data from Beats, parses Apache logs, and sends the data to Elasticsearch.
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{COMMONAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
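The elasticsearch output accepts more settings than just hosts. For example, an index option can write events into a date-based index instead of the default; a sketch, where the index name is only an example:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "apache-logs-%{+YYYY.MM.dd}"
  }
}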
Sample Program

This simple Logstash config reads input from the keyboard, adds a greeting field, and prints the result.

input {
  stdin {}
}

filter {
  mutate {
    add_field => { "greeting" => "Hello, Logstash!" }
  }
}

output {
  stdout {
    codec => rubydebug
  }
}
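To try this sample, save it to a file (say, hello.conf — the name is arbitrary) and start Logstash with it, assuming Logstash is installed:

bin/logstash -f hello.conf

Type a few lines; each one comes back as an event with the extra greeting field added by the mutate filter.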
Important Notes

Logstash runs as a service and reads its configuration from files.
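Before restarting the service, you can check a configuration file for syntax errors with the --config.test_and_exit flag. The path below is a common default for Linux packages, but yours may differ:

bin/logstash -f /etc/logstash/conf.d/pipeline.conf --config.test_and_exit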

Filters can change, add, or remove data fields to make data useful.
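For example, the mutate filter can rename, convert, and drop fields in one place. A sketch, where the field names are hypothetical:

filter {
  mutate {
    rename => { "msg" => "message" }
    convert => { "status" => "integer" }
    remove_field => [ "debug_info" ]
  }
}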

Outputs send data to many places like files, Elasticsearch, or other systems.
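A pipeline can also send each event to several destinations at once by listing multiple plugins in the output section — for instance, to Elasticsearch and to a local file. The host and path here are examples:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  file {
    path => "/var/log/logstash/events.log"
  }
}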

Summary

Logstash collects data from many sources and prepares it for analysis.

It uses input, filter, and output sections to organize data flow.

It is useful for monitoring, searching, and understanding data from systems.