Reliably send structured logs from a Java app without grok filters

Hi all,

I would like to reliably and securely send the logs from a Java application to Graylog without relying too much on error-prone conversions, grok filters, pipelines, and so on.

My wishes:

  • Minimal delay between generating a log event and seeing it in Graylog
  • All custom fields (MDC, PID, Java version, region, …) should end up as separate fields in Graylog (and hence in Elasticsearch)
  • Minimise the number of log events that get lost when Graylog is temporarily down or unreachable
  • Avoid excessive string parsing (grok filters, complex pipelines, overly complex parsing by Filebeat, …)
  • Avoid having to change the Filebeat and/or Graylog config every time we add a new field to our log events

Status so far:

The app uses log4j2, but switching to logback would be OK if that would meet all the requirements.
I added a GELF input (TLS) in Graylog and configured this appender http://logging.paluch.biz/examples/log4j-2.x.html in the app's log4j2 config.
This is working fine: all custom fields arrive in Graylog without any complex config.
=> So the first two wishes are met :slight_smile:
But this appender drops log events when it cannot reach Graylog.
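For reference, the GELF part of my log4j2 config looks roughly like the example on that page (hostname, port and the custom Field entries below are placeholders; as far as I understand the docs, includeFullMdc is what sends every MDC entry as its own field):

```xml
<!-- Sketch of the logstash-gelf appender, based on the example linked above.
     Hostname, port and the custom Field entries are placeholders. -->
<Gelf name="gelf" host="ssl:graylog.example.com" port="12201" version="1.1"
      extractStackTrace="true" filterStackTrace="true"
      includeFullMdc="true" originHost="%host{fqdn}">
    <!-- static custom fields arrive as separate fields in Graylog -->
    <Field name="region" pattern="eu-west-1" />
    <Field name="simpleClassName" pattern="%C{1}" />
    <!-- every MDC entry becomes its own field too -->
    <DynamicMdcFields regex=".*" />
</Gelf>
```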

So I wrapped the Gelf appender in a FailoverAppender.
https://logging.apache.org/log4j/2.x/manual/appenders.html#FailoverAppender

My idea was to buffer events locally when the Gelf appender fails to talk to Graylog:

  • The FailoverAppender would temporarily log to a file
  • Something like Filebeat would forward the events once Graylog (or the network) is back
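
Roughly, the appender part would look like this (appender names, paths and the rollover policy are just illustrative; the log4j2 manual says the primary appender must set ignoreExceptions="false" for failover to kick in, and I am assuming the Gelf appender supports that standard flag):

```xml
<Appenders>
  <!-- primary: the Gelf appender from above; ignoreExceptions="false" so a
       delivery failure propagates to the FailoverAppender instead of being swallowed -->
  <Gelf name="gelf" host="ssl:graylog.example.com" port="12201" version="1.1"
        includeFullMdc="true" ignoreExceptions="false" />

  <!-- local buffer: one JSON document per line; properties="true" keeps the MDC -->
  <RollingFile name="buffer" fileName="logs/buffer.json"
               filePattern="logs/buffer-%i.json">
    <JsonLayout compact="true" eventEol="true" properties="true" />
    <Policies>
      <SizeBasedTriggeringPolicy size="50 MB" />
    </Policies>
  </RollingFile>

  <!-- try Graylog first, fall back to the local file;
       the Root logger then references "failover" instead of "gelf" -->
  <Failover name="failover" primary="gelf">
    <Failovers>
      <AppenderRef ref="buffer" />
    </Failovers>
  </Failover>
</Appenders>
```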

But I am having trouble getting all the custom fields into Graylog.

I tried these layouts:

=> I only see these fields in Graylog: facility, file, input_type, name, offset, source, tags, type, message and timestamp

Filebeat's console output does print the other fields (threadId, level, loggerName, …), but they don't make it to Graylog.

When I disable JSON parsing in Filebeat, the “message” field contains a JSON object with all my custom fields.
I tried writing some pipeline rules to extract the fields, but I ran into a few issues (I will start a new topic about that), and I think there must be a more elegant solution.

Do you have any suggestions for a solution that meets the requirements above?
I am new to Filebeat, so maybe I am missing some features that could help?
Or maybe Filebeat is not the right tool for this job?
Ideally I am looking for something that can tail structured (JSON, GELF, …) files and send the events to the GELF input.

I noticed log4j2 ships a GELF layout, but it is probably only meant for sending events directly to a GELF input, not for buffering to a local file?
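
If it can in fact be combined with a plain file appender (I have not tried this; the attribute names are taken from the log4j2 manual as I read it, and includeNewLineDelimiter may only exist in newer 2.x releases), the buffer file would already contain one GELF document per line, ready to be forwarded verbatim:

```xml
<!-- Untested sketch: write GELF JSON straight to the local buffer file -->
<RollingFile name="gelfBuffer" fileName="logs/buffer.gelf"
             filePattern="logs/buffer-%i.gelf">
  <!-- OFF = no compression, so the shipper can read plain JSON lines;
       includeThreadContext should put the MDC entries into _-prefixed additional fields -->
  <GelfLayout compressionType="OFF" includeThreadContext="true"
              includeNewLineDelimiter="true" />
  <Policies>
    <SizeBasedTriggeringPolicy size="50 MB" />
  </Policies>
</RollingFile>
```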

I have full control of the logging app and could switch logging frameworks or use a custom appender / layout.

It feels wrong to treat the log events as “just strings” and have to parse every field back out of them individually when the data we start with (on the client side) is already structured.

Any advice would be greatly appreciated.
Maarten
