Description of your problem
I have graylog-server installed with a Kafka input. Messages arrive from the client machines, but the events received from Kafka never show up in the search results. It’s as if the Kafka input’s events are never sent to Elasticsearch.
Attached is a screenshot of the input metrics, where you can see events coming in.
Throughput / Metrics
1 minute average rate: 3 msg/s
Network IO: 0B 0B (total: 17.1MiB 0B )
Empty messages discarded: 0
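To rule out an indexing problem rather than a decoding one, a quick check like the sketch below shows whether any documents are reaching Elasticsearch at all (the host, port, and the default graylog_* index prefix are assumptions about my setup; adjust if yours differ):

```python
# Illustrative check: ask Elasticsearch directly how many documents the
# Graylog indices hold, to see whether anything is being written at all.
# "localhost:9200" and the "graylog_*" index prefix are the defaults.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:9200/graylog_*/_count") as resp:
    body = json.load(resp)

print("documents in graylog_* indices:", body["count"])
```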
This is the error that appears in the graylog-server logs:
2021-09-13T15:57:27.876+02:00 ERROR [DecodingProcessor] Unable to decode raw message RawMessage{id=87df9a20-149a-11ec-85b8-066b2386eaea, messageQueueId=31997, codec=gelf, payloadSize=1884, timestamp=2021-09-13T13:57:27.874Z} on input <61376bcb3ac6f7199070e417>.
2021-09-13T15:57:27.876+02:00 ERROR [DecodingProcessor] Error processing message RawMessage{id=87df9a20-149a-11ec-85b8-066b2386eaea, messageQueueId=31997, codec=gelf, payloadSize=1884, timestamp=2021-09-13T13:57:27.874Z}
java.lang.IllegalArgumentException: GELF message <87df9a20-149a-11ec-85b8-066b2386eaea> has invalid "host":
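For reference, the GELF codec expects each Kafka record to be a JSON document with "version", a non-empty string "host", and "short_message"; the error above means the "host" field is missing or is not a plain string. A minimal message that should decode cleanly would look roughly like the sketch below (this assumes the kafka-python client, and the broker address and topic name are just placeholders, not my actual producer):

```python
# Illustrative producer: a GELF payload needs "version", a non-empty string
# "host", and "short_message", otherwise Graylog's GELF codec rejects it with
# the IllegalArgumentException shown above.
import json
import time

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(bootstrap_servers="localhost:9092")

gelf_message = {
    "version": "1.1",
    "host": "client-01.example.org",   # must be a non-empty string
    "short_message": "test message from kafka",
    "timestamp": time.time(),
    "level": 6,
}

producer.send("graylog-gelf", json.dumps(gelf_message).encode("utf-8"))
producer.flush()
```

If the producer wraps the GELF document in another envelope, or sends "host" as something other than a plain string, the codec rejects the message with exactly this error.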
Description of steps you’ve taken to attempt to solve the issue
Operating system information
- Ubuntu 20.04
- Debian
Package versions
- Graylog - 4.1.3-1
- MongoDB - 4.4.8
- Elasticsearch - 6.8.18
- Kafka - 2.13-2.8.0