Messages go to the journal but are never processed

I am running Graylog, Logstash, Elasticsearch and MongoDB with a docker-compose file that I found and adapted to use Logstash as well.
The original is here;
I have just added Logstash and connected it.
I managed to ship my logs from a remote machine and everything seems to work, but the Graylog journal reports the following:
-92,807 unprocessed messages are currently in the journal, in 1 segments.
17 messages have been appended in the last second, 0 messages have been read in the last second.

It does not process the messages.
Any hints?


When your journal shows a negative (-xxx) message count, the journal is corrupt: you should remove the journal and check whether the disk it lives on has enough free space available.
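A minimal sketch of clearing the journal, assuming the Graylog container is stopped first and the journal is bind-mounted at `./graylog/data/journal` (the path is an assumption; adjust it to your compose file):

```shell
# Assumes Graylog is stopped (e.g. `docker-compose stop graylog`) and the
# journal is bind-mounted at ./graylog/data/journal - adjust to your setup.
JOURNAL_DIR="${JOURNAL_DIR:-./graylog/data/journal}"
mkdir -p "$JOURNAL_DIR"              # make sure the path exists
rm -rf "${JOURNAL_DIR:?}"/*          # drop the corrupt journal segments
df -h "$JOURNAL_DIR"                 # verify the backing disk has free space
echo "journal entries remaining: $(ls -A "$JOURNAL_DIR" | wc -l)"
```

After restarting Graylog it will create a fresh, empty journal in the same place.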

Thank you, I removed the journal and cleared some space, and I no longer see a negative count in the messages, although now it is full of:

2019-10-29 11:24:11,817 ERROR: org.graylog2.shared.buffers.processors.DecodingProcessor - Error processing message RawMessage{id=a19ae4b0-fa3e-11e9-90f8-0242ac150005, journalOffset=60953, codec=gelf, payloadSize=804, timestamp=2019-10-29T11:24:11.771Z, remoteAddress=/}
graylog_1 | java.lang.IllegalArgumentException: GELF message (received from has invalid “host”:

Read the error to the end: has invalid “host”:

You are sending GELF messages that are not valid.
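To illustrate what Graylog expects: per the GELF spec, a message must carry `version`, `host` and `short_message` fields, and `host` must be a non-empty string. A sketch of a minimally valid payload (the field values are hypothetical):

```python
import json

# Minimal GELF 1.1 payload: "version", "host" and "short_message" are
# required; an empty or missing "host" triggers exactly the error above.
msg = {
    "version": "1.1",
    "host": "soa-server-01",              # hypothetical source hostname
    "short_message": "diagnostic log line",
    "level": 6,                           # informational
}
payload = json.dumps(msg)
print(payload)
```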

Thank you once more.

Here is my logstash.conf, which is pretty straightforward:

input {
  beats {
    port => 5044
  }
}

output {
  udp {
    host => ""
    port => "12201"
  }
#  stdout {
#    codec => rubydebug
#  }
}
and through Filebeat I am sending a diagnostic.log of a WebLogic SOA domain.
Is this valid?

Thank you once more.

Let me ask two questions:

  • Why do you not send the Beats data directly to Graylog?
  • Why do you think your output to Graylog is GELF with this configuration?
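For context: the plain `udp` output only forwards raw bytes. If Logstash were to stay in the path, emitting GELF would need the dedicated `gelf` output plugin; a sketch, assuming the plugin is installed and Graylog runs a GELF UDP input, with a placeholder hostname:

```
output {
  gelf {
    host => "graylog.example.com"   # placeholder; your Graylog host
    port => 12201
  }
}
```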

If my output goes to Elasticsearch, then how do I configure Graylog to listen to it? What do I put in the Graylog interface?
Ideally I would like to have Filebeat send the diagnostic.log of the WebLogic domain to a Logstash in the stack, and from there somehow have it indexed and visible in Graylog.

Hey @ncostis

OK, I think I got it.

In the Beats configuration, the `logstash` output can also be used to send to a Graylog Beats input. The naming in the Beats configuration might be confusing, but that is something we can’t change, simply because it is not our product.

So configure a Beats input in Graylog and point the `logstash` output to this configured input on Graylog.

You do not need a Logstash in between for that communication. In addition, the ingest into Elasticsearch and the processing are done by Graylog. That is also the only way you can use Graylog to visualize the data: the data needs to be ingested via Graylog to be shown and searched in Graylog, because that is how the required metadata is added.
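As a sketch, the corresponding filebeat.yml could look like this (host, port and log path are placeholders, assuming a Graylog Beats input listening on 5044):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /path/to/soa-domain/servers/*/logs/*diagnostic.log   # placeholder path

# Despite the name, this output speaks the Beats protocol and can point
# straight at a Graylog Beats input - no Logstash needed in between.
output.logstash:
  hosts: ["graylog.example.com:5044"]   # placeholder; match your Beats input
```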

Is that more clear now?

Yes, it makes more sense now.
Thank you,
I will try that!


Weirdly, my initial setup worked when I installed everything piece by piece, but with containers it does not!

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.