Logs are one of the most important sources of data for debugging, monitoring, and performance analysis in modern application development, particularly in microservices and distributed systems. When applications run across multiple servers or containers, their logs become dispersed, which makes tracing a problem difficult. Centralized logging solves this by gathering logs from all services and storing them in one place for search, analysis, and visualization.

The ELK Stack, which consists of Elasticsearch, Logstash, and Kibana, is one of the most widely used centralized logging solutions. This article offers a step-by-step tutorial on how to use the ELK stack for centralized logging, complete with examples, real-world use cases, advantages, and best practices.

What is ELK Stack?
ELK stands for:

  • Elasticsearch → Search and analytics engine
  • Logstash → Data processing and pipeline tool
  • Kibana → Visualization and dashboard tool

These three components work together to collect, process, store, and visualize logs.

How ELK Stack Works
Flow of Data

  • Applications generate logs
  • Logs are collected using agents (e.g., Filebeat)
  • Logstash processes and transforms logs
  • Elasticsearch stores and indexes logs
  • Kibana is used to search and visualize logs

This pipeline enables real-time log monitoring and analysis.

Step 1: Install Elasticsearch

Elasticsearch is responsible for storing and indexing logs.

sudo apt update
sudo apt install elasticsearch

Note: the elasticsearch package is not in the default Ubuntu repositories; add the Elastic APT repository (as described in Elastic's installation documentation) before running apt install.

Start and enable the service:

sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch

Explanation

  • Elasticsearch runs as a service
  • It stores logs in a searchable format
  • It provides fast querying capabilities
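Once the service is running, it is worth confirming that Elasticsearch answers on its default HTTP port (9200). A minimal check, assuming a local single-node install without security enabled (a secured cluster would require credentials and HTTPS):

```shell
# Query the Elasticsearch root endpoint; a running node answers with a JSON
# document containing the cluster name and version.
# The fallback message covers the case where the service is not reachable yet.
curl -s http://localhost:9200 || echo "Elasticsearch is not reachable on localhost:9200"
```

If the call succeeds, the response includes the cluster name and version details.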

Step 2: Install Logstash
Logstash collects and processes logs before sending them to Elasticsearch.
sudo apt install logstash

Create a Logstash Configuration
input {
  beats {
    port => 5044
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}


Explanation

  • Input listens for logs from Beats (like Filebeat)
  • Filter parses logs into structured JSON
  • Output sends logs to Elasticsearch
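Before restarting the service, the pipeline file can be syntax-checked. A sketch, assuming a default APT install where the Logstash binary lives under /usr/share/logstash and its settings under /etc/logstash:

```shell
# --config.test_and_exit parses the pipeline configuration and exits without
# processing any events; it reports "Configuration OK" when the file is valid.
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit \
  || echo "Logstash is not installed at this path, or the configuration is invalid"
```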

Step 3: Install Kibana
Kibana provides a UI to visualize and analyze logs.
sudo apt install kibana


Start the service:
sudo systemctl start kibana
sudo systemctl enable kibana


Access Kibana:
http://localhost:5601

Explanation

  • Kibana connects to Elasticsearch
  • It allows searching logs using queries
  • It provides dashboards and visualizations
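As with Elasticsearch, you can confirm Kibana is up from the command line. A minimal check, assuming the default port 5601 and no authentication:

```shell
# Kibana exposes a status API; a running instance returns a JSON status document.
curl -s http://localhost:5601/api/status || echo "Kibana is not reachable on localhost:5601"
```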

Step 4: Install Filebeat (Log Shipper)
Filebeat is used to collect logs from applications and send them to Logstash.
sudo apt install filebeat

Configure Filebeat
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log

output.logstash:
  hosts: ["localhost:5044"]


Start Filebeat:
sudo systemctl start filebeat
sudo systemctl enable filebeat


Explanation

  • Filebeat reads log files
  • Sends logs to Logstash
  • Lightweight and efficient
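Filebeat ships with built-in self-checks that are useful after editing its configuration. A sketch, assuming Filebeat was installed from the APT package so the binary is on the PATH:

```shell
# "test config" validates filebeat.yml; "test output" attempts a connection
# to the configured Logstash endpoint (localhost:5044 in this setup).
sudo filebeat test config && sudo filebeat test output \
  || echo "Filebeat is not installed, or one of the checks failed"
```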

Step 5: Generate Logs from Application
Example in a .NET application:
using Serilog;
using Serilog.Formatting.Compact; // from the Serilog.Formatting.Compact NuGet package

// Write each event as a single JSON line so the Logstash json filter
// (configured in Step 2) can parse the "message" field successfully.
Log.Logger = new LoggerConfiguration()
    .WriteTo.File(new CompactJsonFormatter(), "logs/app.log", rollingInterval: RollingInterval.Day)
    .CreateLogger();

Log.Information("Application started");


Explanation

  • Logs are written to a file
  • Filebeat reads the file
  • Logs are sent through the ELK pipeline
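Since the Logstash json filter from Step 2 parses each log line as JSON, every line the application writes must be valid JSON. The snippet below simulates one such line (the field names are illustrative, not exactly what Serilog emits) and runs it through a JSON parser, which is effectively the check the filter performs:

```shell
# A structured log line is a single JSON object; an invalid line would cause
# the Logstash json filter to tag the event with _jsonparsefailure.
echo '{"timestamp":"2024-01-01T00:00:00Z","level":"Information","message":"Application started"}' \
  | python3 -m json.tool
```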

Step 6: Visualize Logs in Kibana

  • Open Kibana dashboard
  • Create an index pattern (e.g., app-logs-*)
  • Use Discover tab to search logs
  • Build dashboards for monitoring

Example Queries

  1. Search errors: level: "Error"
  2. Filter by date: @timestamp > now-1h
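The same data can also be queried directly from Elasticsearch, bypassing Kibana. A hedged example using the search API against the daily indices created in Step 2 (the index pattern and field names depend on what your logs actually contain):

```shell
# Ask Elasticsearch for events from the last hour whose level field is "Error".
curl -s -X GET "http://localhost:9200/app-logs-*/_search" \
  -H 'Content-Type: application/json' \
  -d '{
        "query": {
          "bool": {
            "must":   [ { "match": { "level": "Error" } } ],
            "filter": [ { "range": { "@timestamp": { "gte": "now-1h" } } } ]
          }
        }
      }' || echo "Elasticsearch is not reachable on localhost:9200"
```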

Real-World Use Cases

  • Monitoring microservices logs in production
  • Debugging distributed systems
  • Tracking user activity
  • Detecting security issues

Advantages of ELK Stack

  • Centralized log management
  • Real-time log analysis
  • Powerful search capabilities
  • Scalable and flexible

Disadvantages of ELK Stack

  • Requires setup and maintenance
  • Can consume significant resources
  • Learning curve for beginners

Best Practices

  • Use structured logging (JSON format)
  • Rotate logs to avoid disk issues
  • Secure Elasticsearch with authentication
  • Monitor ELK performance
  • Use dashboards for better insights

Summary
Centralized logging with the ELK stack is essential for modern applications. It enables developers and DevOps teams to collect, process, and analyze logs efficiently. By combining Elasticsearch, Logstash, and Kibana, you can build a robust logging system that improves debugging, monitoring, and overall system stability.
