hi,
i have been asking around everywhere, and it seems i have fallen victim to a very basic misunderstanding somewhere, and i can't figure out where on my own.
i have a rancher cluster with several environments and heaps of services running, and i want to collect log data from them. all of this runs on top of EC2, so i built a small filebeat container configured like this:
filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true
      templates:
        - condition:
            not.contains:
              docker.container.labels.log_enabled: "false"
          config:
            - type: docker
              containers.ids:
                - "${data.docker.container.id}"
              json.message_key: log
              json.keys_under_root: true
              json.add_error_key: false

processors:
  - add_cloud_metadata: ~
  - add_docker_metadata: ~

output.logstash:
  hosts: ["${LOGSTASH_HOST}:${LOGSTASH_PORT}"]
  bulk_max_size: 1024
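roughly speaking, the container is just the official image with that filebeat.yml baked in, i.e. a dockerfile along these lines (not the exact one, just the idea):

FROM docker.elastic.co/filebeat/filebeat:6.5.1
# copy the config above into the standard location inside the official image
COPY filebeat.yml /usr/share/filebeat/filebeat.yml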
as a temporary test in between, i swapped the logstash output for a file output, and it writes nice json lines with all the metadata i want.
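the file output for that test was just the standard one, along these lines (the exact path doesn't matter):

output.file:
  # any writable path inside the container works for this test
  path: "/tmp/filebeat"
  filename: "filebeat.json"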
in graylog, i created a Beats input and set the LOGSTASH_HOST/LOGSTASH_PORT variables above to point at it. using tcpdump, i verified that filebeat is sending more or less identical json objects to this input.
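the tcpdump check was nothing more than printing the payload headed for the beats input, roughly:

# substitute whatever port the Beats input listens on
tcpdump -nn -A -i any port "$LOGSTASH_PORT"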
i have no other configuration (yet), so i can see the incoming messages in “All Messages” when logged in as admin. the thing is, those messages do not have any of the metadata in them. all the extra fields are gone.
i am using:
docker.elastic.co/filebeat/filebeat:6.5.1 (also tried 5.6.13)
and
graylog/graylog:2.4.6-1
i also looked at the source code for the beats plugin, and it very much looks like
BeatsCodec.addFlattened()
should add all fields from the JSON object to the message as dot-separated flattened fields. but nothing, nada, zilch. i am hitting a wall here and would be extremely grateful if anyone could point out what i'm not getting.
regards
ruben