Ingesting logs from remote Kafka server

Hi all!

I’m currently trying to integrate Kafka with Graylog. My ingestion pipeline looks like this:

Custom Python producer --> Kafka --> Filebeat --> Graylog Beats Input
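For reference, the relevant parts of my Filebeat configuration look roughly like this (broker address, topic name, and the Graylog host/port are placeholders for my actual values):

```yaml
filebeat.inputs:
  - type: kafka
    hosts: ["kafka-broker:9092"]   # placeholder: remote Kafka broker
    topics: ["my-topic"]           # placeholder: topic the producer writes to
    group_id: "filebeat"

output.logstash:
  # Graylog's Beats input speaks the Beats/Lumberjack protocol,
  # which Filebeat sends via its logstash output
  hosts: ["graylog-host:5044"]     # placeholder: Graylog Beats input address
```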

Filebeat seems to be picking up the messages correctly from Kafka and sending them to my Graylog Beats input, but no events are displayed in searches. When I start Filebeat, there is observable activity on the input, and Filebeat’s output messages look OK when printed to the console.
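In case the payload shape matters, the producer builds its messages roughly like this (a stdlib-only sketch; the field names are illustrative of my own schema, and the actual publish goes through a Kafka client library, which I’ve left out here):

```python
import json
import time

def build_event(text):
    """Build the JSON string the producer publishes to Kafka.

    Field names are illustrative; 'message' carries the main log text,
    which is what ends up in the Beats event that Filebeat forwards.
    """
    return json.dumps({
        "message": text,
        "timestamp": time.time(),  # seconds since epoch
        "source": "custom-python-producer",
    })

# This string is what Filebeat's kafka input reads from the topic
# and ships to the Graylog Beats input:
event = build_event("hello graylog")
print(event)
```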

Does anyone have experience with this? Any help on how to troubleshoot this issue would be highly appreciated.

Filebeat stats summary:
2020-01-14T07:40:26.445Z INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":50,"time":{"ms":51}},"total":{"ticks":170,"time":{"ms":179},"value":170},"user":{"ticks":120,"time":{"ms":128}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":8},"info":{"ephemeral_id":"42630310-1005-4c15-99e9-d5dc85fdafae","uptime":{"ms":30025}},"memstats":{"gc_next":11024624,"memory_alloc":8742888,"memory_total":34880968,"rss":42164224},"runtime":{"goroutines":39}},"filebeat":{"events":{"added":117,"done":117},"harvester":{"open_files":0,"running":0},"inputs":{"kafka":{"bytes_read":83909,"bytes_write":13276}}},"libbeat":{"config":{"module":{"running":0},"reloads":1},"output":{"events":{"acked":117,"batches":14,"total":117},"read":{"bytes":2176},"type":"logstash","write":{"bytes":45133}},"pipeline":{"clients":1,"events":{"active":0,"published":117,"retry":80,"total":117},"queue":{"acked":117}}},"registrar":{"states":{"current":0}},"system":{"cpu":{"cores":4},"load":{"1":0,"15":0.05,"5":0.03,"norm":{"1":0,"15":0.0125,"5":0.0075}}}}}}