Docker Log Management Using EFK Stack

Kalana Geesara
4 min read · Aug 18, 2021

Nowadays, most applications are deployed as Docker containers. When applications are dockerized, it is important to have a log management system in place to identify and debug the issues or bugs that arise in the system. Several tech stacks are available for this purpose. In the early days, most of us used the ELK stack, which is based on Logstash. But due to various reasons, such as its higher memory consumption, Logstash has been replaced by technologies like Fluentd, which consumes less memory and supports hundreds of plugins.

In this blog, we discuss setting up the EFK stack locally for Docker log management. EFK stands for Elasticsearch, Fluentd, and Kibana. Let’s have a brief introduction to the services that are used.

1. Elasticsearch

Elasticsearch is a service capable of storing, searching, and analyzing large amounts of data. It can act as a database, since data is stored in the form of indices, documents, and fields. It can also act as a search engine, as it searches and analyzes data using filters and patterns. It is an open-source tool built on Apache Lucene, and every feature of Elasticsearch is exposed as a REST API.
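
For example, once Elasticsearch is up you can talk to it directly over HTTP (assuming it is listening on its default port, 9200):

# List all indices in the cluster, with headers
curl 'http://localhost:9200/_cat/indices?v'

# Check overall cluster health as pretty-printed JSON
curl 'http://localhost:9200/_cluster/health?pretty'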

2. Kibana

Kibana is an open-source analytics and visualization platform designed to work with Elasticsearch. You use Kibana to search, view, and interact with data stored in Elasticsearch indices. You can easily perform advanced data analysis and visualize your data in a variety of charts, tables, and maps. Kibana makes it easy to understand large volumes of data. Its simple, browser-based interface enables you to quickly create and share dynamic dashboards that display changes to Elasticsearch queries in real time.

3. Fluentd

Fluentd is an open-source data collector for a unified logging layer. Fluentd allows you to unify data collection and consumption for better use and understanding of data. Fluentd treats logs as JSON, a popular machine-readable format. It is written primarily in C with a thin Ruby wrapper that gives users flexibility. According to the official Fluentd site, it is called a unified logging layer because it can collect logs from various sources. More details can be found here.
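
To make that concrete, here is roughly what a single access-log line from an httpd container looks like once Docker’s fluentd logging driver hands it to Fluentd: a tagged JSON record. All values below are illustrative:

tag: httpd.access
record:
{
  "container_id": "9fe2c7f5b8a1...",
  "container_name": "/efk_httpd_1",
  "source": "stdout",
  "log": "172.18.0.1 - - [18/Aug/2021:10:15:30 +0000] \"GET / HTTP/1.1\" 200 45"
}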

Local Setup

Here, we will discuss only the local setup of the EFK stack.

(Image: project architecture)

Prerequisites

1. Docker

That’s all you need.

(Image: project folder structure)

Step 01: Create the Docker Compose file to set up the services

The docker-compose file consists of four services: httpd, elasticsearch, fluentd, and kibana.

The httpd service creates the Docker container whose logs we are going to monitor. If you want to run the same image standalone with docker run, you can use the following command.

docker run -d --log-driver=fluentd --log-opt fluentd-address=localhost:24224 --log-opt tag="httpd.access" -p 81:80 httpd

The fluentd service is built from a Dockerfile, which we will discuss in the next step.

The elasticsearch and kibana services are both built from the 7.13.4 image versions. You can change the versions as you like in your implementation.
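
As a reference, a minimal docker-compose.yml along these lines could look as follows (service names, ports, and paths are illustrative; adjust them to your own project):

version: "3"
services:
  httpd:
    image: httpd
    ports:
      - "81:80"
    logging:
      driver: "fluentd"
      options:
        # The Docker daemon on the host connects to Fluentd,
        # so localhost works because port 24224 is published below
        fluentd-address: localhost:24224
        tag: httpd.access
    depends_on:
      - fluentd

  fluentd:
    build: ./fluentd
    volumes:
      - ./fluentd/conf:/fluentd/etc
    ports:
      - "24224:24224"
      - "24224:24224/udp"
    depends_on:
      - elasticsearch

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.13.4
    environment:
      # Required for a local single-node cluster
      - discovery.type=single-node
    ports:
      - "9200:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:7.13.4
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch

Note that httpd depends on fluentd: the fluentd logging driver connects when the container starts, so Fluentd must already be listening or the httpd container will fail to start.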

Step 02: Create the Fluentd Dockerfile
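
The Dockerfile only needs to extend the official Fluentd image and install the Elasticsearch output plugin. A minimal sketch (the base image tag is one known-working choice; any recent v1.x tag should do):

# Illustrative base image tag
FROM fluent/fluentd:v1.12-debian-1

# Plugin installation must run as root in the official image
USER root
RUN gem install fluent-plugin-elasticsearch --no-document
USER fluent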

Step 03: Create the Fluentd Config file
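
A minimal fluent.conf for this setup could look like the following sketch (the index prefix and flush interval are illustrative choices):

# Accept events forwarded by the Docker fluentd logging driver
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

# Ship everything tagged httpd.* (e.g. httpd.access) to Elasticsearch
<match httpd.*>
  @type elasticsearch
  host elasticsearch
  port 9200
  logstash_format true
  logstash_prefix fluentd
  flush_interval 5s
</match>

With logstash_format true, the plugin writes to daily indices named fluentd-YYYY.MM.DD, which is the index you will look for in Kibana later.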

The above config file consists of the following directives:

  1. The source directive determines the input sources. There are various input plugins, such as tail, http, and forward, which can be used according to the use case (a tail example is sketched after this list).
  2. The match directive determines the output destinations. The match directive looks for events with matching tags and processes them. The most common use of the match directive is to output events to other systems. For this reason, the plugins that correspond to the match directive are called output plugins.
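
For instance, if Fluentd were reading an Apache log file directly instead of receiving forwarded events, a tail source could look like this (paths are illustrative):

<source>
  @type tail
  path /var/log/httpd/access.log
  pos_file /var/log/fluentd/httpd-access.log.pos
  tag httpd.access
  <parse>
    @type apache2
  </parse>
</source>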

Step 04: Run the Services

Use the command docker-compose up -d to bring up all the services, then visit http://localhost:5601/.
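
For example (the fluentd service name matches the compose file sketched earlier):

# Start the whole stack in the background
docker-compose up -d

# Generate an access-log entry in the httpd container
curl http://localhost:81/

# Optional: confirm that Fluentd received and forwarded the event
docker-compose logs fluentd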

Step 05: Set up Kibana to visualize the logs

First, check whether the index has been created under Index Management.

(Image: create index)

If the index has not been created, send a request to the httpd container using curl http://localhost:81/

After that, create an index pattern using the above index.

(Image: create index pattern)

After creating the index pattern, go to Discover to analyze the logs.

(Image: Docker logs in Kibana)

Happy analyzing and debugging.

You can find the project repo here.
