A Journey of a Thousand Miles Begins with a Single Step

Accessing Mustache Array Elements

The QA (Quality Assurance) team uses simulators like Astrex to check and test the respective changes and features. I was asked whether I could bring the simulator logs into our Elasticsearch for real-time analysis. Tailing log files is still difficult, unless you can use bash.
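A bash approach could look like the following sketch. The log path, Elasticsearch URL and the index name `astrex-logs` are assumptions for illustration, not taken from the post:

```shell
#!/usr/bin/env bash
# Sketch: ship new lines of a simulator log to Elasticsearch.
# LOGFILE, ES_URL and the index name are assumed values.
LOGFILE=${LOGFILE:-/var/log/astrex/simulator.log}
ES_URL=${ES_URL:-http://localhost:9200}

# Wrap a raw log line into a minimal JSON document.
# Naive: a real shipper must also escape quotes and control characters.
to_doc() {
  printf '{"message":"%s"}' "$1"
}

# tail -F survives log rotation; guarded so the loop only starts on demand.
if [ -n "${RUN_SHIPPER:-}" ]; then
  tail -F "$LOGFILE" | while IFS= read -r line; do
    curl -s -X POST "$ES_URL/astrex-logs/_doc" \
         -H 'Content-Type: application/json' \
         -d "$(to_doc "$line")" > /dev/null
  done
fi
```

Set `RUN_SHIPPER=1` to actually start following the file.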

Read more

Parse XML content with Logstash

A customer of mine requires XML data as separate field data for further investigation. The data itself is part of a log message that is processed by Logstash. Logstash provides the powerful XML filter plugin for this kind of parsing.
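A minimal filter sketch (the field names `message` and `parsed` are assumptions; the actual post may use different ones):

```
filter {
  xml {
    # the field holding the raw XML snippet -- assumed name
    source => "message"
    # store the parsed result under this field
    target => "parsed"
    # keep scalar values as scalars instead of forcing arrays
    force_array => false
  }
}
```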

Read more

Ship Docker Container Logs to Elasticsearch with Fluentd

By default, Docker captures the standard output (and standard error) of all your containers and writes them to files in JSON format. It is advised to set a maximum size, otherwise you will run out of disk space. Having unified logging with Elasticsearch allows you to investigate logs from a single point of view. Sending the logs to Elasticsearch from the Docker containers is quite easy. Fluentd is a data collector, which a Docker container can use by passing the option --log-driver=fluentd.
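In practice this boils down to two fragments, sketched below with assumed names (`my-app`, hosts and the tag pattern); port 24224 is the Fluentd default, and the Elasticsearch side uses the fluent-plugin-elasticsearch output:

```
# run the container with the Fluentd logging driver
docker run --log-driver=fluentd \
           --log-opt fluentd-address=localhost:24224 \
           --log-opt tag="docker.{{.Name}}" \
           my-app

# fluent.conf: forward everything tagged docker.* to Elasticsearch
<match docker.**>
  @type elasticsearch
  host localhost
  port 9200
  logstash_format true
</match>
```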

Read more

Add Geo Points with Logstash Translate Filter

Storing data in Elasticsearch with city names offers the capability to display the distribution of the data on a geographical map in Kibana. To use that feature, you have to declare a geo_point type in your index mapping; I named the field location. To translate the city names to their respective geo points I use the Logstash translate filter. With a small dictionary, Logstash will just look up the value for your input city. You could also use zip codes, but this would require a more detailed data source. For the demonstration of the translate plugin, city names are sufficient.
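A sketch of such a dictionary lookup; the field names, cities and coordinates here are made-up examples, not the post's actual data:

```
filter {
  translate {
    # city name read from this field -- assumed name
    field       => "city"
    # write the "lat,lon" string here; mapped as geo_point in the index
    destination => "location"
    dictionary  => {
      "Berlin"  => "52.5200,13.4050"
      "Hamburg" => "53.5511,9.9937"
    }
  }
}
```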

Read more

Reindex data from remote cluster

At work I still run the Elasticsearch cluster in version 5.6.4. While I’m eager to upgrade and keep up the pace, I don’t always have the chance to do so immediately. A customer of mine needed a small set of data in Excel. Elasticsearch 6, or rather Kibana 6, offers CSV export in the X-Pack extensions. To use that functionality, I needed to export a fragment of the desired data from my production cluster. Since the Reindex API allows us to read data from a remote cluster and write it locally, I simply spun up my private cluster in v6.1.1 with Docker and started the reindexing.
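The remote reindex call looks roughly like this; the hosts, index name and query are placeholders, not the actual production values:

```
curl -X POST 'http://localhost:9200/_reindex' \
     -H 'Content-Type: application/json' -d '
{
  "source": {
    "remote": { "host": "http://prod-cluster:9200" },
    "index": "transactions",
    "query": { "term": { "customer": "acme" } }
  },
  "dest": { "index": "transactions" }
}'
```

Note that the remote host has to be whitelisted on the local cluster via reindex.remote.whitelist in elasticsearch.yml.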

Read more

No keep alive in Nginx

Providing an HTTP health check service with Nginx is straightforward. If you do, ensure that Nginx closes the HTTP connection instead of keeping it alive. The basic option for that is:
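Disabling keep-alive comes down to a zero keepalive timeout; a minimal sketch, assuming a dedicated health-check location:

```
location /health {
  # a timeout of 0 makes Nginx close the connection after each response
  keepalive_timeout 0;
  return 200 "OK";
}
```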

Read more

Pretty print duration

Performing a reindex job in Elasticsearch gives you the time the job took.
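The reported "took" value is in milliseconds; a small bash helper (my own sketch, not from the post) makes it readable:

```shell
#!/usr/bin/env bash
# Convert an Elasticsearch "took" value (milliseconds) into h/m/s.
pretty_duration() {
  local ms=$1 s m h
  s=$(( ms / 1000 ))
  h=$(( s / 3600 )); s=$(( s % 3600 ))
  m=$(( s / 60 ));   s=$(( s % 60 ))
  printf '%dh %dm %ds\n' "$h" "$m" "$s"
}

pretty_duration 4321000   # -> 1h 12m 1s
```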

Read more

Delete all messages of a chat room

Find ObjectId of the chat room

> db.rooms.find({slug:"elk"}).pretty()
{
        "_id" : ObjectId("59a666cfa9886c002c30b404"),
        "owner" : ObjectId("59a547c2aed276003facf84f"),
        "name" : "Elasticschrott",
        "slug" : "elk",
        "description" : "Everything about the Elasticsearch Universe, including Logstash, Beats",
        "private" : false,
        "lastActive" : ISODate("2017-08-31T08:46:33.690Z"),
        "created" : ISODate("2017-08-30T07:18:39.885Z"),
        "messages" : [ ],
        "participants" : [ ],
        "archived" : true,
        "__v" : 0
}
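With the room's ObjectId at hand, and assuming letschat keeps messages in a separate messages collection referencing the room via a room field (an assumption about the schema), the deletion would look like:

```
// assumed: a "messages" collection with a "room" reference field
> db.messages.remove({ room: ObjectId("59a666cfa9886c002c30b404") })
```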

Read more

Unarchive letschat chat room

A chat room in letschat was archived. To revive it, we can alter the document in the MongoDB instance.
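A minimal sketch of that update, matching the room by its slug (the slug value is taken from the example above; the archived flag is the field shown in the room document):

```
> db.rooms.update({ slug: "elk" }, { $set: { archived: false } })
```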

Read more