Setting up ELK with Spring Boot Microservice

One of the most important phases in IT is the post-production phase, and one of its major challenges is identifying issues in production. When multiple applications spit out logs on different systems, it is important to collate them in one place for the IT team to manage. This is where ELK comes to the rescue. In this document we will cover what ELK is and how to aggregate logs from different microservices and push them to one common location.

What is ELK?

ELK is an acronym for Elasticsearch, Logstash and Kibana. It is open-source software maintained by Elastic.

Elasticsearch is an Apache Lucene based search engine that stores, searches and analyzes huge volumes of data in near real time. Elasticsearch can be installed on premises or consumed as a SaaS application.

Logstash is the log aggregator. It has a pipeline that takes input, filters the data and sends output. Logstash can collect logs from various sources using different input plugins and emit the output in the desired form.

Kibana is the software used to visualize Elasticsearch data. It is a separate web application that connects to Elasticsearch rather than a plugin inside it.

Elasticsearch and Kibana can be deployed as a cloud service hosted in AWS or GCP, or installed on on-premises infrastructure. In this document we will use the official Docker images of ELK and set them up on EC2.

Design Architecture

[Architecture diagram: microservices → syslog driver → Logstash → Elasticsearch → Kibana]

In the above design, the different microservices emit logs. A syslog driver pushes the logs generated by each microservice to Logstash, which filters them and forwards them to Elasticsearch. All the aggregated logs are then visible in Kibana.

Setting up of ELK in EC2

We will be setting up ELK on an EC2 Ubuntu machine using the official Docker images. Log in to the EC2 server and create a directory ‘elk’ under /home/ubuntu/.

Install Docker on the EC2 instance by following the steps mentioned here:

https://docs.docker.com/install/linux/docker-ce/ubuntu/
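Since the compose file below is driven by docker-compose, that needs to be installed as well. One quick route on Ubuntu is sketched below; the linked Docker docs remain the authoritative reference for the repository-based install:

# Install Docker via the official convenience script
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# Install docker-compose from the Ubuntu repositories
sudo apt-get update && sudo apt-get install -y docker-compose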

  • Navigate into the elk directory and create a file docker-compose.yml:
version: '2'
services:
    elasticsearch:
        image: docker.elastic.co/elasticsearch/elasticsearch:6.3.2
        ports:
            - '9200:9200'   # REST API
            - '9300:9300'   # transport protocol
    kibana:
        image: docker.elastic.co/kibana/kibana:6.3.2
        ports:
            - '5601:5601'   # Kibana UI
        depends_on:
            - elasticsearch
    logstash:
        image: docker.elastic.co/logstash/logstash:6.3.2
        ports:
            - '25826:25826'                 # pipeline input (published as TCP by default)
        volumes:
            - $PWD/elk-config:/elk-config   # holds the Logstash pipeline file
        command: logstash -f /elk-config/logstash.config
        depends_on:
            - elasticsearch
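The compose file mounts ./elk-config into the container and starts Logstash with -f /elk-config/logstash.config, but the pipeline file itself is not shown in the original post. A minimal sketch, created as elk-config/logstash.config next to docker-compose.yml, could listen on port 25826 and forward everything to the elasticsearch container (the hostname is the service name from the compose file; add filters as needed):

input {
    tcp {
        port => 25826    # matches the telnet test later in this post
    }
    udp {
        port => 25826    # Logback's SyslogAppender sends syslog over UDP
    }
}
output {
    elasticsearch {
        hosts => ["elasticsearch:9200"]    # service name resolves inside the compose network
    }
}

Note that Docker publishes port 25826 as TCP by default; if the UDP input is used, add '25826:25826/udp' to the logstash ports list as well.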
  • Elasticsearch uses an mmapfs directory by default to store its indices. The default operating system limit on mmap counts is likely to be too low, which may result in out-of-memory exceptions.

On Linux, you can increase the limit by running the following command as root:

sudo sysctl -w vm.max_map_count=262144
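This setting does not survive a reboot; to make it permanent, it can also be appended to /etc/sysctl.conf (standard sysctl behavior):

echo 'vm.max_map_count=262144' | sudo tee -a /etc/sysctl.conf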
  • Run docker-compose up to spin up all the ELK containers.
  • Validate that Kibana is up by hitting port 5601. You should see the page below:

[Screenshot: Kibana landing page]

  • Set up the index pattern in Kibana. With the default Logstash Elasticsearch output, indices are named logstash-YYYY.MM.DD, so an index pattern of logstash-* should match them.
  • telnet [IP of logstash] [port of logstash] and enter any text.
    • Ex: telnet 52.207.254.8 25826

You should see the text appear in Kibana. That means the connectivity is set up for ELK.

Next we will see how to push logs from the microservices to ELK.

Set up the syslog driver:

In order to send the logs from the microservices hosted in EC2, we can use the syslog driver to push the logs to Logstash.

I am using the project below for the logs. We will be running this project in EC2.

https://github.com/jokumar/task-planner

  • We need to make a change in rsyslog.conf on the Ubuntu machine:
sudo vi /etc/rsyslog.conf
  • Uncomment the lines that enable UDP and TCP syslog reception; a sketch of the typical entries follows below.
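On a typical Ubuntu install these are the reception entries near the top of the file. The exact syntax varies by rsyslog version (older releases use $ModLoad imudp / $UDPServerRun 514 instead):

# provides UDP syslog reception
module(load="imudp")
input(type="imudp" port="514")

# provides TCP syslog reception
module(load="imtcp")
input(type="imtcp" port="514")

After saving the change, restart the daemon with sudo systemctl restart rsyslog so the inputs are picked up.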

Now add the below lines to the logback.xml of the Spring Boot project:

<appender name="SYSLOG" class="ch.qos.logback.classic.net.SyslogAppender">
    <syslogHost>{logstash host}</syslogHost>
    <port>{logstash port, e.g. 25826}</port>
    <facility>LOCAL1</facility>
    <suffixPattern>[%thread] %logger %msg</suffixPattern>
</appender>
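Defining the appender alone is not enough; it also has to be attached to a logger. A minimal root-logger reference, using standard Logback configuration, looks like this:

<root level="INFO">
    <appender-ref ref="SYSLOG" />
</root>

Note that Logback's SyslogAppender transmits over UDP, which is why the Logstash pipeline sketched earlier also listens on a UDP input.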

 

The above setup will push the logs to Logstash.
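Any log written through SLF4J now flows through this appender. As a quick illustration, here is a hypothetical controller (not part of the task-planner project) whose log line would end up in Kibana:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class PingController {

    private static final Logger log = LoggerFactory.getLogger(PingController.class);

    @GetMapping("/ping")
    public String ping() {
        // this line travels syslog -> Logstash -> Elasticsearch and shows up in Kibana
        log.info("ping received");
        return "pong";
    }
}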

 

If the project is run as a Docker container, then instead we need to add the syslog driver options to the docker run command:

--log-driver syslog --log-opt syslog-address=tcp://{logstashhost}:{logstashport}
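Putting it together, a full command might look like the sketch below. The image name task-planner:latest is an assumption for illustration, and the address reuses the telnet example from earlier:

docker run --log-driver syslog \
    --log-opt syslog-address=tcp://52.207.254.8:25826 \
    task-planner:latest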

 

On starting the server and hitting the API, you can see the logs in Kibana:

[Screenshot: application logs showing up in Kibana]
