This post is more than a year old; some of the information may no longer be accurate.
Used: logstash v1.5.1
To pull data from a RESTful web service, Logstash provides the exec input plugin. It runs a command at a defined interval and turns the command's output into events.
Using this logstash.conf produced a JSON parse failure in Elasticsearch:
input {
  exec {
    command => "C:\Daten\tools\logstash-1.5.1\bin\metrics.bat"
    interval => 10
    codec => json
  }
}

output {
  elasticsearch {
    node_name => "test"
    host => "localhost"
  }
}
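The metrics.bat script itself is not part of this post; judging from the output below, it is little more than a wrapper around a Jolokia read request, roughly like this (a sketch, not the actual file):

REM metrics.bat - query a Jolokia endpoint for a single metric (hypothetical sketch)
REM Without @echo off, cmd.exe echoes the command line itself, so the captured
REM stdout contains both the command and the JSON response.
C:\bin\curl\curl.exe http://localhost:8080/jolokia/read/metrics:name=trx.process.approved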
Elasticsearch receives this message:
C:\bin\curl\curl.exe http://localhost:8080/jolokia/read/metrics:name=trx.process.approved
{"request":"mbean":"metrics:name=trx.process.approved","type":"read"},"value":{"Count":14},"timestamp":1434641808,"status":200}
Well, that’s not JSON! The echoed command line and the Jolokia response arrive together in a single event, because, as the docs say:
The @message of this event will be the entire stdout of the command as one event.
My solution is to use the split, drop, and json filter plugins.
input {
  exec {
    command => "C:\Daten\tools\logstash-1.5.1\bin\metrics.bat"
    interval => 10
    codec => plain
  }
}

filter {
  # the exec output arrives as one multi-line event; split it into one event per line
  split {
  }
  if [message] =~ "^{" {
    # convert lines that look like JSON into structured fields
    json {
      source => "message"
    }
  } else {
    # drop all lines that are not json
    drop {}
  }
}

output {
  elasticsearch {
    node_name => "test"
    host => "localhost"
  }
}
- split takes the multi-line message and turns it into three separate events, one per line of stdout
- a regex checks whether a line begins with the JSON delimiter "{"
- if the line is the JSON part, the json filter parses it so Elasticsearch can index it properly (see the resulting document below)
- otherwise the line is dropped
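With this in place, the JSON line ends up in Elasticsearch as a structured document along the lines of the following (illustrative only; the fields are taken from the Jolokia response above, and Logstash adds its usual fields such as @timestamp and @version on top):

{
  "request": {
    "mbean": "metrics:name=trx.process.approved",
    "type": "read"
  },
  "value": {
    "Count": 14
  },
  "timestamp": 1434641808,
  "status": 200
}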