Would another option be to use something like NXLog's im_file module to read the file and then send it to a GELF input? Would this work, since the JSON should already contain the fields?
Alternatively, would it be better to use a Grok filter or pipeline rules to parse out the fields?
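To make the first option concrete, here is a minimal NXLog sketch of what I have in mind — the file path, Graylog host, and port are placeholders, not our actual setup, and the `OutputType GELF` form may differ between NXLog versions:

```
# Load the GELF and JSON extensions
<Extension gelf>
    Module  xm_gelf
</Extension>

<Extension json>
    Module  xm_json
</Extension>

# Tail the go-audit log; each line is already a JSON document
<Input goaudit>
    Module  im_file
    File    "/var/log/go-audit/audit.log"    # placeholder path
    Exec    parse_json();                    # lift the JSON keys into NXLog fields
</Input>

# Ship the parsed fields as GELF over UDP to a Graylog GELF input
<Output graylog>
    Module      om_udp
    Host        graylog.example.org          # placeholder host
    Port        12201
    OutputType  GELF
</Output>

<Route goaudit_to_graylog>
    Path    goaudit => graylog
</Route>
```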
Test and set up go-audit
Either log to syslog or to a file, say /var/log/go-audit/audit.log (see the config sketch after this list)
Convert the existing auditd rules to go-audit format at a later stage
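For the file-logging step, a sketch of the output section based on the go-audit example config — the path matches the plan above, but check your go-audit version's example file for the exact keys:

```yaml
# go-audit.yaml (sketch): write each audit event as one JSON line to a file
events:
  min: 1300    # range of audit message types to capture
  max: 1399

output:
  file:
    enabled: true
    attempts: 2
    path: /var/log/go-audit/audit.log   # assumed path from the plan above
    mode: 0600
    user: root
    group: root
```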
Can you tell me the level of JSON support in syslog messages within Graylog? I would expect that mixing normal syslog-style and JSON-formatted syslog messages will break things.
So I will probably try the file approach with NXLog into a GELF input, which should resolve the issue.
What do you think? Or are key-value pairs the way to go, in your opinion?
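If the pipeline route turns out to be workable instead, I imagine a rule along these lines — the guard condition is just my assumption for telling JSON payloads apart from plain syslog ones:

```
rule "parse go-audit JSON payloads"
when
    // only touch messages whose body looks like a JSON object
    starts_with(to_string($message.message), "{")
then
    let parsed = parse_json(to_string($message.message));
    // copy every top-level JSON key onto the message as a field
    set_fields(to_map(parsed));
end
```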
Adding NXLog to the game would add a new moving part that can fail …
I would only do that if it is really needed, for example because messages are being truncated (or similar). Is your goal to group all messages that belong to a specific action into one message?
1 MB is a pretty large message size; most messages I have seen are 1.5-2 KB at most, with the largest ones coming from Cisco firewalls.
I will try the method outlined in the link above, but we are using a custom auditd rules configuration, so it may need some tweaking. I will let you know.
One last thing has just come to mind: our auditd logs are coming in over syslog, via rsyslog, to a syslog UDP input in Graylog.
Using a pipeline rule to match them in the initial stage, as per the blog post, would it be better to tag them (as in the post) or to route them to a different stream?
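For what it's worth, the routing variant could look roughly like this — the stream name and the match condition are placeholders, not what the blog post uses:

```
rule "route auditd messages to their own stream"
when
    // crude match: auditd lines carry "type=" key/value pairs
    contains(to_string($message.message), "type=")
then
    set_field("from_auditd", true);      // tag, as in the post
    route_to_stream(name: "auditd");     // ...or route to a dedicated stream
end
```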
Then we ran Filebeat in interactive mode (`/usr/share/filebeat/bin/filebeat -e -c <path to config>`) while debugging parts of the full configuration file. Within our config we set the Logstash output and, in the prospector, `document_type: auditd`.
This then makes the rules work as per the blog post; the relevant parts of the config are sketched below.
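For anyone following along, the relevant parts of our Filebeat config looked roughly like this (Filebeat 5.x syntax — `document_type` was removed in 6.0; host and path are placeholders):

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/audit/audit.log        # placeholder path
    document_type: auditd               # makes the blog post's rules match

output.logstash:
  hosts: ["graylog.example.org:5044"]   # placeholder; points at Graylog's Beats input
```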