We already saw that we can monitor Docker via the Elastic Stack in this previous post. In this post we will update the monitoring script so that it also stores the Docker events in Elasticsearch.

To monitor Docker via Elastic, you need an Elastic Stack running; you can use the following post to get one up and running.

It is easy to get information from Docker via Python. Simply import the docker library and create a new Docker client:

import docker

client = docker.from_env()

The client will try to connect to the Docker socket, which should work as long as your Python process runs on the same machine as Docker.
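If the socket is not at its default location (for example when it is mounted into a container under another path), the client can also be pointed at it explicitly. A minimal sketch, assuming the default socket path:

import docker

# Equivalent to docker.from_env() on a default installation:
# connect to the local Docker daemon through its Unix socket.
client = docker.DockerClient(base_url='unix://var/run/docker.sock')
print(client.version())  # quick sanity check that the daemon answers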

Reading the events is quite easy as well. We can iterate over them with the following loop:

def worker():
    """Thread worker: stream Docker events into the bulk body."""
    global event_bulk_body, lock
    print('Worker thread starting')
    print("*=" * 20)

    # client.events() blocks and yields raw bytes as events occur.
    for event in client.events():
        with lock:
            try:
                decoded = event.decode('utf-8')

                # A chunk may carry several JSON documents, one per line.
                for line in decoded.split("\n"):
                    if "{" in line:
                        # Bulk action header: one monthly index, e.g. docker_events-2018.01.
                        event_bulk_body += '{ "index" : { "_index" : "%s-%s", "_type" : "event"} }\n' \
                            % (DOCKER_EVENTS.replace("*", ""), datetime.now().strftime("%Y.%m"))
                        doc = json.loads(line)
                        doc['@timestamp'] = int(time.time()) * 1000  # epoch milliseconds
                        event_bulk_body += json.dumps(doc) + '\n'
            except Exception as e:
                print("Unable to read events.")
                print(e)

Note that, as the events() function is blocking, the above code must run in a dedicated thread. A lock protects the bulk action string (event_bulk_body), ensuring that the main thread and this worker thread never append to event_bulk_body at the same time; the main thread periodically flushes this buffer to Elasticsearch, as sketched below.

The worker thread is started using the following code:

t = threading.Thread(target=worker)
t.start()

The lock is created using the following line:

lock = threading.Lock()
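
For completeness, here is a minimal sketch of what the flushing side in the main thread can look like. The exact loop lives in the full script; the Elasticsearch address and polling interval below are assumptions:

import time
from elasticsearch import Elasticsearch

es = Elasticsearch(["esnode1:9200"])  # address is an assumption
POLLING_SPEED = 30                    # seconds between bulk flushes

while True:
    time.sleep(POLLING_SPEED)
    with lock:                        # same lock as the worker thread
        if event_bulk_body:
            es.bulk(body=event_bulk_body)  # one bulk request for all buffered events
            event_bulk_body = ""           # reset the buffer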

The full code can be found here.

Templating

To store the events, we need an index template that ensures the data types are correctly understood by Elasticsearch. We can use an index mapping such as:

{
  "event": {
    "properties": {
      "@timestamp": {
        "type": "date"
      },
      "Action": {
        "type": "keyword"
      },
      "Actor": {
        "properties": {
          "Attributes": {
            "properties": {
              "com": {
                "properties": {
                  "docker": {
                    "properties": {
                      "compose": {
                        "properties": {
                          "config-hash": {
                            "type": "keyword"
                          },
                          "container-number": {
                            "type": "keyword"
                          },
                          "oneoff": {
                            "type": "keyword"
                          },
                          "project": {
                            "type": "keyword"
                          },
                          "service": {
                            "type": "keyword"
                          },
                          "version": {
                            "type": "keyword"
                          }
                        }
                      }
                    }
                  }
                }
              },
              "container": {
                "type": "keyword"
              },
              "exitCode": {
                "type": "keyword"
              },
              "image": {
                "type": "keyword"
              },
              "name": {
                "type": "keyword"
              },
              "signal": {
                "type": "keyword"
              },
              "type": {
                "type": "keyword"
              }
            }
          },
          "ID": {
            "type": "keyword"
          }
        }
      },
      "Type": {
        "type": "keyword"
      },
      "from": {
        "type": "keyword"
      },
      "id": {
        "type": "keyword"
      },
      "scope": {
        "type": "keyword"
      },
      "status": {
        "type": "keyword"
      },
      "time": {
        "type": "long"
      },
      "timeNano": {
        "type": "long"
      }
    }
  }
}

Note that the monitordocker.py code will create all the required index templates for you.
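
If you want to register such a template manually, the elasticsearch Python client exposes put_template. A minimal sketch, where the template name and index pattern are assumptions consistent with the index naming above:

from elasticsearch import Elasticsearch

es = Elasticsearch(["esnode1:9200"])  # address is an assumption

# Register the mapping above for every monthly docker_events-* index.
es.indices.put_template(name="docker_events", body={
    "template": "docker_events-*",    # index pattern (pre-6.x template syntax)
    "mappings": {
        "event": {
            "properties": {
                "@timestamp": {"type": "date"},
                "Action": {"type": "keyword"}
                # ... remaining fields exactly as in the mapping above
            }
        }
    }
})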

Docker

It is easy to create a container with the monitoring code using the following Dockerfile:

FROM python:3.6
MAINTAINER snuids

RUN pip install docker
RUN pip install elasticsearch

COPY ./*.py /opt/monitordocker/

WORKDIR /opt/monitordocker

CMD ["python", "monitordocker.py"]
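
The image is then built with docker build -t monitordocker . from the directory containing the Dockerfile and the Python sources, and must be run with the Docker socket mounted, as in the docker-compose example that follows.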

You can also use a prepackaged image by adding the following lines to your docker-compose file:

  monitordocker:
    image: snuids/monitordocker:v0.4.1
    container_name: monitordocker
    links:
      - esnode1
    environment:
      - ELASTIC_ADDRESS=esnode1:9200
      - POLLING_SPEED=30
      - PYTHONUNBUFFERED=0
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    restart: always
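
On the script side, these variables would typically be read with os.environ. A minimal sketch; the defaults are assumptions and the actual script may differ:

import os

# Hypothetical configuration parsing; defaults are assumptions.
ELASTIC_ADDRESS = os.environ.get("ELASTIC_ADDRESS", "localhost:9200")
POLLING_SPEED = int(os.environ.get("POLLING_SPEED", "30"))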

This container will automatically create the appropriate index templates for the two document collections:

  • docker_events*
  • docker_stats*