
Fix timestamp parse failure in Elasticsearch

:heavy_exclamation_mark: This post is older than a year. Consider some information might not be accurate anymore. :heavy_exclamation_mark:

If you use Logstash to send logs to Elasticsearch in JSON format, you may run into a _timestampparsefailure for dates in the format [24/Jan/2017:09:04:07 +0100]. This format is commonly found in access.log files, e.g. from JBoss EAP. To fix it, add an additional date format pattern to the index template.
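If you control the Logstash pipeline, the same bracketed timestamp can also be parsed there with a date filter before it ever reaches Elasticsearch. A minimal sketch, assuming the raw value sits in a field called timestamp (the field name is an assumption here):

```
filter {
  date {
    # Joda-Time pattern; the square brackets are literal characters
    match  => [ "timestamp", "[dd/MMM/yyyy:HH:mm:ss Z]" ]
    target => "@timestamp"
  }
}
```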

The date [24/Jan/2017:09:04:07 +0100] has the format [dd/MMM/yyyy:HH:mm:ss Z]. The pattern syntax comes from Joda-Time, which Elasticsearch uses for date formats. We can add the new pattern to the index template that defines the data mapping.
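The mapping between the raw string and the pattern can be checked outside Elasticsearch. As a rough sketch using Python's strptime directives (an equivalent of the Joda-Time letters, not Joda-Time itself; the square brackets are literal characters in both notations):

```python
from datetime import datetime

# [dd/MMM/yyyy:HH:mm:ss Z] in Joda-Time maps roughly to the
# strptime directives below; the brackets are matched literally.
raw = "[24/Jan/2017:09:04:07 +0100]"
parsed = datetime.strptime(raw, "[%d/%b/%Y:%H:%M:%S %z]")
print(parsed.isoformat())  # 2017-01-24T09:04:07+01:00
```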

A demonstration: Create index for test.

PUT testdata
{
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "_default_" :{
      "properties": {
        "@timestamp": {
          "type":   "date",
          "format": "\[dd/MMM/yyyy:HH:mm:ss Z\]"
        }
      }
    }
  }
}

Now we create a test entry and see whether Elasticsearch complains.

POST testdata/logs
{
  "@timestamp": "[24/Jan/2017:09:04:07 +0100]",
  "message" : "Salut Phillipe"
}

Now we query the created log entry.

GET testdata/logs/_search
{
  "query": { "match_all": {}}
}

And the result: the query returns the indexed document with the @timestamp field intact, confirming that Elasticsearch accepted the bracketed timestamp without a mapping error.