Docker logging and empty mandatory "short_message" field

We currently have our Docker containers sending logs to our Graylog server using the GELF driver. We have a UDP input set up on the Graylog server and, largely speaking, things seem to be working OK. We do, however, get a LOT of error messages on the Graylog log console that look like the example below:

2019-03-12 15:39:19,896 ERROR: org.graylog2.shared.buffers.processors.DecodingProcessor - Unable to decode raw message RawMessage{id=0088fe71-44dd-11e9-a116-0242ac120004, journalOffset=1499391876, codec=gelf, payloadSize=372, timestamp=2019-03-12T15:39:19.895Z, remoteAddress=/172.18.0.1:49958} on input <5b61e7592ab79c00019f5868>.
2019-03-12 15:39:19,896 ERROR: org.graylog2.shared.buffers.processors.DecodingProcessor - Error processing message RawMessage{id=0088fe71-44dd-11e9-a116-0242ac120004, journalOffset=1499391876, codec=gelf, payloadSize=372, timestamp=2019-03-12T15:39:19.895Z, remoteAddress=/172.18.0.1:49958}
java.lang.IllegalArgumentException: GELF message <0088fe71-44dd-11e9-a116-0242ac120004> (received from <172.18.0.1:49958>) has empty mandatory "short_message" field.
        at org.graylog2.inputs.codecs.GelfCodec.validateGELFMessage(GelfCodec.java:252) ~[graylog.jar:?]
        at org.graylog2.inputs.codecs.GelfCodec.decode(GelfCodec.java:134) ~[graylog.jar:?]
        at org.graylog2.shared.buffers.processors.DecodingProcessor.processMessage(DecodingProcessor.java:150) ~[graylog.jar:?]
        at org.graylog2.shared.buffers.processors.DecodingProcessor.onEvent(DecodingProcessor.java:91) [graylog.jar:?]
        at org.graylog2.shared.buffers.processors.ProcessBufferProcessor.onEvent(ProcessBufferProcessor.java:74) [graylog.jar:?]
        at org.graylog2.shared.buffers.processors.ProcessBufferProcessor.onEvent(ProcessBufferProcessor.java:42) [graylog.jar:?]
        at com.lmax.disruptor.WorkProcessor.run(WorkProcessor.java:143) [graylog.jar:?]
        at com.codahale.metrics.InstrumentedThreadFactory$InstrumentedRunnable.run(InstrumentedThreadFactory.java:66) [graylog.jar:?]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]

Is this just something we should ignore, or is there something we are missing in our setup that we need to address? Everything in our setup is pretty basic: just a few lines of YAML in the Docker Compose file that point each container at the Graylog server, and a basic GELF UDP input to receive the logs. We do get logs on the server and they are searchable, but the noise from these messages on the Graylog log console is way overboard. We would never be able to troubleshoot anything with Graylog due to the volume of these entries.
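For reference, the relevant part of our Compose file is just a logging stanza on each service, something like the following (the server address here is a placeholder):

```yaml
services:
  app:
    image: our-app:latest
    logging:
      driver: gelf
      options:
        # Address of the Graylog GELF UDP input (placeholder)
        gelf-address: "udp://graylog.example.com:12201"
```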

If needed, I can provide more details about our setup. We have seen this while running Graylog 2.4, 2.5, and 3.0, and with Elasticsearch 5.6 and 6.5; the results are the same before and after upgrading to the latest version. Graylog itself is running in Docker.

The error message already tells you what is wrong: the GELF messages you send have an empty mandatory "short_message" field. As a consequence, the received message is dropped.

How to fix it? Check the sender, update the sender, or file a bug report with the sender.
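For reference, GELF 1.1 requires a non-empty "short_message" field in every payload. A minimal sketch of building and sending a valid message over UDP in Python (the loopback address and port 12201 are assumptions for illustration):

```python
import json
import socket

def build_gelf_message(host: str, short_message: str) -> bytes:
    """Build a minimal GELF 1.1 payload as UTF-8 encoded JSON.

    Graylog rejects messages whose "short_message" field is empty,
    which is exactly the error seen in the server log above.
    """
    if not short_message:
        raise ValueError('GELF requires a non-empty "short_message" field')
    payload = {
        "version": "1.1",
        "host": host,
        "short_message": short_message,
    }
    return json.dumps(payload).encode("utf-8")

# Send one message to a Graylog GELF UDP input (address is an assumption).
message = build_gelf_message("my-container", "application started")
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(message, ("127.0.0.1", 12201))
sock.close()
```

Any sender that emits a payload without that field, or with it empty, will trigger the DecodingProcessor errors shown above.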

Thanks. I’ll work with our developers to see if we can isolate which container is generating this, and then we’ll go from there to see if someone can look into it.

Any news?

It is a known bug in Docker: https://github.com/docker/for-linux/issues/354

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.