Swimlane Releases elk-tls-docker to Simplify the Testing and Deployment of Elastic Stack

At Swimlane, we love to automate, but we also love building and sharing open-source software (OSS) that helps security teams. We are proud to announce a new open-source project, elk-tls-docker, which makes it easier for you to test and deploy the Elastic Stack by automating the setup of several Elastic open-source components.

Elk-tls-docker assists with setting up and creating an Elastic Stack secured with either self-signed certificates or Let’s Encrypt certificates (via SWAG). This project was built so that you can test and use built-in Elastic Security features like detections, signals, cases, Elastic Endpoint, and more.

This docker-compose project will create the following Elastic containers based on version 7.9.2:

  • Elasticsearch
  • Logstash
  • Kibana
  • Packetbeat
  • Filebeat
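The compose file that defines these services lives in the repository; as a rough illustration of its layout, the service definitions look something like the sketch below (image names and the 7.9.2 tag follow Elastic's official registry; the real file also wires up TLS certificates, volumes, and networking):

```yaml
# Illustrative sketch only -- the repository's docker-compose.yml is the
# authoritative version and additionally configures TLS and networking.
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.2
  logstash:
    image: docker.elastic.co/logstash/logstash:7.9.2
  kibana:
    image: docker.elastic.co/kibana/kibana:7.9.2
  packetbeat:
    image: docker.elastic.co/beats/packetbeat:7.9.2
  filebeat:
    image: docker.elastic.co/beats/filebeat:7.9.2
```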

There are two different modes you can run elk-tls-docker in. The first, and most common, is the standard development mode. The second is a pseudo-production environment. The main difference between these two modes is that the development mode will generate self-signed certificates and the “production” mode will generate certificates using Let’s Encrypt.

Development Environment

By default, elk-tls-docker will assist with setting up a development environment to test out some of the amazing features of Elastic Security. The first thing you need to do is clone or copy the repository to your local system:

git clone https://github.com/swimlane/elk-tls-docker.git elk-tls-docker
cd elk-tls-docker

You will need the entire repository, as all of its files are required to use elk-tls-docker.

Along with the repository, make sure you have Docker and Docker Compose installed as well. You can install them using apt-get by following these directions.

You will also need to copy the provided .env-example file to a new .env file (cp .env-example .env) and adjust the values as needed.

Once you have the .env file configured (or have accepted the defaults), run the following command to generate self-signed certificates:

 docker-compose -f docker-compose.setup.yml run --rm certs

Next, run the following to set up your development environment:

 docker-compose up -d

After docker-compose downloads all the required images and builds the containers, you can visit Kibana in your browser (by default at https://localhost:5601).

Note: it may take a few minutes, but once Kibana starts up you will be directed to the login page, where you can enter the username and password provided in your .env file.

That’s it, you now have a development environment for Elastic Security!

Let’s Encrypt and Pseudo-Production Environment

You can also utilize Let’s Encrypt certificates for a production-like environment and access Kibana, Elasticsearch, or other services externally, but there are a few more requirements:

  • A registered domain name
  • DNS Records set up correctly and pointing to your host IP
    • Nameservers pointing to your hosting environment
    • A record pointing to your system’s IP
    • CNAME record created for subdomain configuration
  • An Ubuntu or other host system that can run Docker and Docker Compose

The essentials are the same, but there are additional steps that must be taken. Luckily, I have documented these processes at a high level here and provided an example walkthrough here. The basic steps are:

  1. Set up DNS Records to point to your host IP
    1. Nameservers pointing to your hosting environment
    2. A record pointing to your system’s IP
    3. CNAME record created for subdomain configuration
  2. Modify your .env by:
    1. Specifying your DOMAIN name
    2. Specifying a SUBDOMAIN or SUBFOLDER value
    3. Setting STAGING to false
  3. Run the following commands in order:
docker-compose -f docker-compose.setup.yml run --rm certs
docker-compose -f docker-compose.production.yml up
# press ctrl + c (or run docker-compose down) once the stack is up
docker-compose -f docker-compose.setup.yml run --rm certs # yes, again
docker-compose -f docker-compose.production.yml up -d # yes, again
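Putting the .env changes from step 2 together, the relevant entries might look like the following (all values are placeholders; check the repository's .env-example for the exact variable names):

```shell
# Hypothetical values -- replace with your own domain details.
DOMAIN=example.com
SUBDOMAIN=kibana
STAGING=false
```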

That’s it! You should now be able to access your Kibana instance externally at https://{subdomain}.{domain}

Additional Setup

Once you have an environment set up and running, you may want to do some optional additional configuration to fully utilize some of the great Elastic Security features:

  • Creating a New Superuser: By default, elk-tls-docker uses the built-in user elastic and the password you set in your .env file. If you would like to create an additional or different superuser account, you can use the provided Python script to do so. Make sure that the _HOST and _PASSWORD values are set correctly and that the additional information in the _BODY is set to your requirements.
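The provided script isn't reproduced here, but the underlying call is Elasticsearch's security API. A minimal Python stand-in (hypothetical host, usernames, and passwords; the script's _HOST, _PASSWORD, and _BODY variables play the same roles, and authentication/TLS handling is omitted for brevity) might look like:

```python
import json
import urllib.request

def build_superuser_request(host, new_user, new_password):
    """Build a request to Elasticsearch's security API that creates a
    user with the built-in superuser role.

    host, new_user and new_password are placeholders -- substitute your
    own values, mirroring the _HOST / _PASSWORD / _BODY settings in the
    provided script.
    """
    body = {
        "password": new_password,
        "roles": ["superuser"],  # grants the built-in superuser role
        "full_name": new_user,
    }
    return urllib.request.Request(
        url=f"{host}/_security/user/{new_user}",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request requires a running, authenticated cluster, e.g.:
# urllib.request.urlopen(build_superuser_request(
#     "https://localhost:9200", "analyst", "s3cret"))
```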

  • Modify Index Mapping: When utilizing Elastic Security within Kibana, the built-in reports are mapped to Elastic Common Schema (ECS) data fields, but you may need to modify the built-in detections index key mappings. Based on our testing and research, you can modify the index's mapped fields using the provided script.

  • Send Test Data to Elasticsearch: If you want to immediately start testing elk-tls-docker, you can use the provided script to send data to Elasticsearch utilizing another open-source project of ours called soc-faker. You can stream as many fake Elastic Common Schema documents as you want; just modify the script with the number of documents to send as well as your Elasticsearch credentials.
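soc-faker itself does the heavy lifting of generating realistic documents; as a rough illustration of the shape of the data only, a stand-in generator for minimal ECS-style documents might look like this (field names follow the Elastic Common Schema; this is not soc-faker's actual API):

```python
import json
import random
from datetime import datetime, timezone

def fake_ecs_document():
    """Return a minimal ECS-style event document.

    Only a handful of common ECS fields are shown here; soc-faker
    produces far richer documents.
    """
    return {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "event": {
            "kind": "event",
            "category": ["network"],
            "action": random.choice(["connection_attempt", "dns_query"]),
        },
        "source": {"ip": f"10.0.0.{random.randint(1, 254)}"},
        "destination": {"ip": f"192.168.1.{random.randint(1, 254)}"},
    }

# Each document could then be indexed into Elasticsearch, e.g. via a
# POST to https://localhost:9200/<index>/_doc (credentials omitted).
if __name__ == "__main__":
    print(json.dumps(fake_ecs_document(), indent=2))
```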

  • Send Test Data to Filebeat: Lastly, you can also verify that Filebeat is set up correctly by sending text data (or a file) over a socket straight to Filebeat, which will be processed by Logstash and eventually land in Elasticsearch. Check out the provided script here.
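Under the hood this is just a TCP send to Filebeat's listening port. A minimal Python sketch of that send (the host and port are placeholders; use whatever address the provided configuration exposes):

```python
import socket

def send_line(host, port, text):
    """Send one newline-terminated line of text over TCP.

    A TCP input treats each newline-delimited line as a separate event,
    so appending "\n" marks the end of the message.
    """
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(text.encode("utf-8") + b"\n")

# Example (assumes Filebeat is listening on localhost:9000 -- adjust to
# match your setup):
# send_line("localhost", 9000, "hello filebeat")
```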

Thanks for checking out elk-tls-docker! We hope it helps you set up and begin testing Elastic Stack in no time!
