Pipeline grok syntax

I’m completely new to Graylog, so forgive my ignorance. I’m trying to import logs from various applications using Filebeat. I set up the very first log and, as expected, it needed to be parsed. I created a Grok extractor for the Filebeat input and that worked like a charm. But… I have different formats I’m going to be sending to Graylog.

I think I’m supposed to use pipelines to handle this situation. So I added a document_type field in Filebeat to indicate the type of log file coming in, and Graylog sees that field just fine. I created a pipeline, looked up how to use that field, and tried to create this rule:

rule "parse log4j"
when has_field("document_type") && to_string($message.document_type) == "log4j"
then
set_fields(grok("%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:loglevel}\s+%{NOTSPACE:classname}%{GREEDYDATA:message}", $message))
end

But that gets a syntax error. How should I be doing this? And, one other caveat: the extractor didn’t overwrite the timestamp that Graylog uses, so hopefully someone has a pointer for that as well.

Thanks

Not sure about using pipelines; I haven’t made much use of those yet. Where possible, it is generally easier and less stressful on Graylog if the logs are already structured before shipment. You can do this with Filebeat or NXLog. Filebeat works great for this in that you can have all of the structuring done before shipping to Graylog, with multiple log types/inputs and multiple outputs in whatever format you want. I recommend JSON, since a JSON extractor on the Graylog side makes short work of ingesting logs.

GELF is even better, as there is no extractor needed: just a GELF input and boom, everything comes in already parsed and formatted into the fields dictated by your NXLog/Filebeat config. I’m not sure if Filebeat supports GELF output, but JSON is definitely doable. NXLog does support GELF output; we use it almost everywhere we can. I’m sure there are other possible methods that work, but this works exceedingly well in our implementation.
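For anyone wanting a starting point, a minimal NXLog GELF setup looks roughly like this. This is only a sketch: the file path, host, and port are placeholders, and the exact OutputType value can vary by NXLog version.

<Extension gelf>
    Module  xm_gelf
</Extension>

<Input myapp>
    Module  im_file
    File    "/var/log/myapp/app.log"   # placeholder path
</Input>

<Output graylog>
    Module      om_tcp
    Host        graylog.example.com    # placeholder Graylog host
    Port        12201                  # must match your GELF TCP input
    OutputType  GELF_TCP
</Output>

<Route myapp_to_graylog>
    Path myapp => graylog
</Route>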

Filebeat doesn’t support grok or parsing of log files. At least not that I can see.

Got it. The answer to both my questions. In Filebeat, add a field ‘document_type’ and set it to the type of log file you want parsed. In the example below I’m parsing the type ‘log4j’. The main fix over my first attempt is passing the message text to grok as a string instead of the whole $message object. In order to get the timestamp recognized as the “valid” one, you have to parse it as a date. In addition, I’m saving the full original log line as “original_message”.
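For reference, the Filebeat side can look something like this. A sketch only: the path is a placeholder, and depending on your Filebeat/Graylog versions the field may arrive under a slightly different name (newer Filebeat versions use filebeat.inputs instead of filebeat.prospectors).

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/myapp/*.log     # placeholder path
  fields:
    document_type: log4j       # the value the pipeline rule matches on
  fields_under_root: true      # put document_type at the top level of the message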

Create the rule and add it to the ‘all messages’ stream.

rule "parse log4j"
when
has_field(“document_type”) && to_string($message.document_type) == "log4j"
then
let message_field = to_string($message.message);
let parsed_fields = grok(pattern: “%{TIMESTAMP_ISO8601:tx_timestamp}\s+%{LOGLEVEL:loglevel}\s+%{NOTSPACE:classname}%{GREEDYDATA:message}”, value: message_field);
set_fields(parsed_fields);
let date = parse_date(to_string(parsed_fields.tx_timestamp), “YYYY-MM-dd HH:mm:ss,SSS”, “EST”);
set_field(“timestamp”, date);
set_field(“original_message”, message_field);
end
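As a quick sanity check, a made-up log4j line like

2017-03-15 10:22:01,123 INFO com.example.MyClass Something happened

would come out with tx_timestamp = 2017-03-15 10:22:01,123, loglevel = INFO, classname = com.example.MyClass, and the rest in message, with Graylog’s message timestamp then set from the parsed date. Note that the date pattern and the “EST” timezone in parse_date are specific to my logs; adjust them to match yours.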
