I had to prepare an Elasticsearch training that contains a theoretical and a hands-on part. For the hands-on part, we want the attendees to have the following software running locally on their laptops:

  • An Elasticsearch node
  • Kibana to build dashboards
  • A Jupyter notebook in order to use Python to play with Elasticsearch
  • A Node-RED instance in order to use JavaScript to play with Elasticsearch
  • A Grafana instance

Thanks to Docker and docker-compose, all of these can be set up in a few minutes.

#COMPOSE ELK5

version: '2'
services:

##############################
  nodered:
    image: nodered/node-red-docker:0.19.2
    container_name: nodered
    ports:
      - "1880:1880"

##############################
  monitordocker:
    image: snuids/monitordocker:v0.4.5
    container_name: monitordocker
    links:
      - esnode1
    environment:
      - ELASTIC_ADDRESS=esnode1:9200
      - POLLING_SPEED=30
      - PYTHONUNBUFFERED=0
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    restart: always

##############################
  cerebro:
    image: lmenezes/cerebro:0.8.1
    container_name: cerebro
    ports:
      - "9000:9000"
    links:
      - esnode1

##############################
  esnode1:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.4.1
    ports:
      - "9200:9200"
    container_name: esnode1

##############################
  kibana:
    image: snuids/kibanatraffic:v6.4.1
    container_name: kibana
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_URL=http://esnode1:9200
      - TZ=Europe/Paris
    restart: always

#############################
  anaconda:
    image: snuids/anaconda:v0.0.1
    container_name: anacondab
    ports:
      - "8888:8888"
    volumes:
      - /Users/snuids/Documents:/opt/notebooks/snuids

#############################
  grafana:
    image: grafana/grafana:5.3.2
    ports:
      - "3001:3000"

Save the text above in a file named docker-compose.yml and start the containers using the following command:

docker-compose up -d

Depending on your operating system, the Elasticsearch node may fail to start because some operating system parameters do not fulfil the Elasticsearch requirements.

Use the following command to check that all the containers are running:

docker ps -a

To check the logs of a stopped container, simply use:

docker logs NAMEOFTHECONTAINER

So if the Elasticsearch node did not start, simply type:

docker logs esnode1

in order to figure out what’s wrong.

A classic issue is a too-small value of the “vm.max_map_count” kernel parameter. On Linux, simply issue the following command (as root) in order to set it properly:

sysctl -w vm.max_map_count=262144
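Note that this change is lost after a reboot. To make it permanent, the same value can be persisted in /etc/sysctl.conf (or a file under /etc/sysctl.d/) and reloaded with “sysctl -p”:

```conf
# /etc/sysctl.conf
vm.max_map_count=262144
```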

Reissue the “docker-compose up -d” command until all the containers are up.
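Once everything is up, a quick sanity check is to hit the REST root endpoint of the Elasticsearch node. Here is a small Python sketch using only the standard library (the URL assumes the default 9200 port mapping from the compose file above):

```python
# Quick reachability check for the Elasticsearch node started above.
import json
from urllib.request import urlopen
from urllib.error import URLError

def es_is_up(url="http://localhost:9200", timeout=2):
    """Return the cluster info dict if Elasticsearch answers, else None."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return json.loads(resp.read().decode("utf-8"))
    except (URLError, OSError, ValueError):
        return None

info = es_is_up()
if info:
    print("Elasticsearch is up, version", info["version"]["number"])
else:
    print("Elasticsearch is not reachable yet")
```

The same check works from the command line with “curl http://localhost:9200”.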

Logstash

Logstash can be added to the mix by adding the following lines at the bottom of the docker-compose file:

  logstash:
    image: docker.elastic.co/logstash/logstash:6.4.1
    container_name: logstash
    volumes:
      - /home/PATHONYOURLOCALSYSTEM/logstash/config:/usr/share/logstash/config
    command: logstash -f /usr/share/logstash/config/logstash.conf

Don’t forget to create the following files in your logstash folder:

  • logstash.conf (the Logstash pipeline definition)
  • logstash.yml (Can be empty)
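As a starting point, a minimal logstash.conf could look like the sketch below. It reads lines from stdin and indexes them into the esnode1 container (the index name is an arbitrary choice, and it assumes the logstash container can resolve esnode1, e.g. via a links entry like the other services):

```conf
input {
  stdin { }
}

output {
  elasticsearch {
    hosts => ["esnode1:9200"]
    index => "training-%{+YYYY.MM.dd}"
  }
}
```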