DockerLab: Building a Containerized ELK Stack #
The ELK Stack (Elasticsearch, Logstash, Kibana) is a powerful open-source platform for log analysis, visualization, and monitoring. By containerizing ELK using Docker and Docker Compose, you can create a portable, scalable, and resource-efficient homelab that simplifies experimentation and deployment.
Why Log Analysis Matters #
Logs are system-generated records of events from the operating system, services, and applications running on your machine. Analyzing logs helps:
- Identify security threats and malicious processes
- Detect performance bottlenecks
- Gain insights into system and application behavior
- Support SOC (Security Operations Center) workflows
Why ELK Stack? #
The ELK Stack offers a complete solution for log management and analysis:
- Elasticsearch – A distributed search and analytics engine that stores logs centrally.
- Logstash – Ingests, transforms, and forwards logs with custom filters.
- Kibana – Provides powerful visualization and dashboards for real-time insights.
Unlike paid SIEM tools, ELK is free, open-source, and customizable, making it ideal for homelabs, SOC practices, and cybersecurity enthusiasts.
Why Containerization? #
Running ELK on Docker containers offers several advantages:
- Portability – Deploy on any system without complex setups
- Resource Efficiency – Containers consume fewer resources than VMs
- Scalability – Easily scale with orchestration tools like Kubernetes
- Infrastructure as Code (IaC) – Reproducible deployments using YAML
Step-by-Step: Containerized ELK Stack Setup #
1. Install Docker & Docker Compose #
On Ubuntu or another common Linux distribution, install Docker Engine and the Docker Compose plugin first.
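A minimal sketch using Docker's official convenience script (one of several install routes; see Docker's installation docs for distribution-specific packages):
```bash
# Download and run Docker's convenience script (installs Engine + Compose plugin)
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# Optional: run Docker without sudo (log out and back in afterwards)
sudo usermod -aG docker $USER
```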
Then verify the installation:
```bash
docker run hello-world
```
A successful run confirms Docker is installed correctly.
2. Set Up the Project Directory #
Create a new directory for the project:
```bash
mkdir elk && cd elk
```
Download the official .env and docker-compose.yml files from Elastic's repo into this directory:
- .env
- docker-compose.yml
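After this step the directory holds the two downloaded files; the logstash.conf created in step 6 will sit alongside them, since the compose file mounts it from `./`:
```text
elk/
├── .env
├── docker-compose.yml
└── logstash.conf   # created in step 6
```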
3. Configure Environment Variables #
Set your desired ELK version in .env (pin a specific release rather than latest for stability):
```ini
STACK_VERSION=8.15.0
```
Then define secure passwords:
```ini
ELASTIC_PASSWORD=yourpassword
KIBANA_PASSWORD=yourpassword
```
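The Logstash service added in the next step also references ${MEM_LIMIT}. Elastic's example .env defines it along with a few other settings; a sketch of the relevant entries (verify the names against your downloaded copy):
```ini
# These entries mirror Elastic's example .env; adjust values to taste
CLUSTER_NAME=docker-cluster
LICENSE=basic
ES_PORT=9200
KIBANA_PORT=5601
MEM_LIMIT=1073741824   # ~1 GiB per container
```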
4. Add the Logstash Service #
Edit docker-compose.yml and add a Logstash service under the existing services: key. Note that the image tag reuses ${STACK_VERSION} from .env, and the UDP protocol suffix in the port mapping must be lowercase:
```yaml
  logstash:
    depends_on:
      es01:
        condition: service_healthy
      kibana:
        condition: service_healthy
    image: docker.elastic.co/logstash/logstash:${STACK_VERSION}
    labels:
      co.elastic.logs/module: logstash
    user: root
    volumes:
      - logstashdata01:/usr/share/logstash/data
      - certs:/usr/share/logstash/certs
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf:ro
    environment:
      - NODE_NAME=logstash
      - xpack.monitoring.enabled=false
      - ELASTIC_USER=elastic
      - ELASTIC_PASSWORD=${ELASTIC_PASSWORD}
      - ELASTIC_HOSTS=https://es01:9200
    command: logstash -f /usr/share/logstash/pipeline/logstash.conf
    ports:
      - "5044:5044/udp"
    mem_limit: ${MEM_LIMIT}
```
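YAML indentation mistakes are easy to introduce here. Before continuing, you can have Compose validate and print the merged configuration:
```bash
docker compose config
```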
5. Define Volumes #
Add persistent storage under the top-level volumes: key of docker-compose.yml (the certs, esdata*, and kibanadata entries ship with Elastic's file; logstashdata01 is the new addition):
```yaml
volumes:
  certs:
    driver: local
  esdata01:
    driver: local
  esdata02:
    driver: local
  esdata03:
    driver: local
  kibanadata:
    driver: local
  logstashdata01:
    driver: local
```
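Named volumes survive docker compose down, which is what preserves indexed data between restarts. Compose prefixes volume names with the project name (by default the directory name, elk):
```bash
docker volume ls
docker volume inspect elk_esdata01
```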
6. Create logstash.conf #
In the project directory, create logstash.conf defining the pipeline's input, filter, and output stages. The ${ELASTIC_PASSWORD} reference below is resolved from the container environment set in the compose file:
```conf
input {
  udp {
    host => "0.0.0.0"
    port => 5044
  }
}

filter {
  # Add parsing filters here if needed
}

output {
  elasticsearch {
    index => "logstash-%{+YYYY.MM.dd}"
    hosts => ["https://es01:9200"]
    user => "elastic"
    password => "${ELASTIC_PASSWORD}"
    ssl_enabled => true
    ssl_certificate_authorities => ["/usr/share/logstash/certs/ca/ca.crt"]
  }
}
```
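The empty filter block is a placeholder. As one illustration, a minimal sketch that parses JSON-formatted messages into structured fields using the bundled json filter (skip_on_invalid_json leaves plain-text events untouched):
```conf
filter {
  json {
    source => "message"
    skip_on_invalid_json => true
  }
}
```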
7. Launch the ELK Stack #
From the project directory, run in detached mode:
```bash
docker compose up -d
```
Or point Compose at the file explicitly:
```bash
docker compose -f /path/to/docker-compose.yml up -d
```
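Once docker compose ps shows the containers as healthy, you can smoke-test the pipeline end to end. This sketch assumes the stack runs locally and that Elasticsearch's port 9200 is published as in Elastic's stock compose file; -k skips CA verification for brevity:
```bash
# Send a test event to the Logstash UDP input
echo '{"message":"hello from dockerlab"}' | nc -u -w1 localhost 5044

# Confirm a daily logstash-* index was created
curl -k -u elastic:yourpassword "https://localhost:9200/_cat/indices/logstash-*?v"
```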
Next Steps #
- Configure Beats or syslog agents to forward logs (see the example below)
- Add custom Logstash filters for parsing structured and unstructured data
- Explore Kibana dashboards for visualization
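For a quick syslog-style test without installing Beats, util-linux's logger can send datagrams straight to the UDP input (host and port match the compose port mapping above):
```bash
logger --udp --server 127.0.0.1 --port 5044 "test message from dockerlab"
```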
Reference Files #
- .env
- docker-compose.yml
- logstash.conf
Conclusion #
This Dockerized ELK Stack provides a powerful, scalable, and portable log-analysis homelab. It's perfect for learning log analytics, SOC workflows, and cybersecurity monitoring without expensive SIEM tools.
Stay tuned for the next article, where we dive into real-world log analysis with Kibana visualizations.