Centralized log collection with Graylog

My Graylog server is busy collecting logs from Linux (rsyslog), Mac (syslog) and Windows (nxlog using syslog) clients. Being a government agency, we need to archive logs for a certain time. We have an ArcSight log collector that I would like to send all incoming Graylog messages to; it's running syslog on udp/514 just to keep this discussion simple. I configured the syslog output on the Graylog server and aimed it at a Linux machine running rsyslog with a listener for incoming udp/514 messages. The messages show up on the rsyslog server, but there are a few extra fields at the beginning of each log entry. I've set the output to the plain, structured, ref and full message types, and I still don't get the same log that the Graylog server received. I would have thought I could just bounce the messages from the incoming stream straight to the remote syslog server, but it's not working.
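
For reference, the listener on that rsyslog box is only a couple of lines; a minimal sketch using rsyslog's newer config syntax (the legacy $ModLoad/$UDPServerRun directives work as well):

    # load the UDP input module and listen for syslog on udp/514
    module(load="imudp")
    input(type="imudp" port="514")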

  1. Is it possible to use the rsyslog service to accept incoming messages on Graylog and have those as an input? That way I could easily add a “*.* @machinename” line to rsyslog.conf and send the messages on to the remote server (see the sketch just below).
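
For what it's worth, that forwarding rule in rsyslog.conf is a one-liner; a sketch with a placeholder hostname (a single @ forwards over UDP, @@ over TCP):

    # send everything over UDP to port 514 on the archive box (hostname is a placeholder)
    *.* @archive.example.gov
    # or over TCP instead:
    # *.* @@archive.example.gov:514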

If not, I could set up the archive server first and then use the same idea to send the messages on to Graylog. I'm not sure whether the ArcSight collector will do that, but I think it will. I would just have to change firewall rules and some IP addresses to get traffic where it needs to go.

Any thoughts?

Here is a sample of the original message (dimmitt is the machine name):
Mar 29 15:45:44 dimmitt pcscd: commands.c:959:CmdGetSlotStatus Card absent or mute

Here is the log after Graylog munges it somehow:
Mar 29 15:45:44 dimmitt user-level - dimmitt pcscd: commands.c:959:CmdGetSlotStatus Card absent or mute

If you need archiving, maybe the archiving plugin is something you'd like to try.

But to answer your question, you can always chain as many syslog servers as you like that simply forward messages to one or more targets without problems.
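
As a rough illustration (hostnames are placeholders), a relay's rsyslog.conf can fan the same messages out to several destinations at once:

    # forward everything to the Graylog syslog input over UDP
    *.* @graylog.example.gov:514
    # and to the ArcSight collector over TCP
    *.* @@arcsight.example.gov:514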

You could clone the stream you are using to process the logs. Basically, use one stream for processing (assuming you are using a pipeline for processing) and writing to an Elasticsearch index, and use a second stream (a clone of the first one) to send the logs to a specific output. I use the syslog output to send the logs to the local syslog daemon, which is picked up by a local nxlog daemon that then writes them to a file that rotates every x MB or after x amount of time. It should be possible to do this remotely instead of locally.
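
A minimal nxlog sketch of that archiving leg could look like this (here nxlog listens on a local UDP port directly rather than sitting behind the local syslog daemon; port, file path, rotation size and generation count are placeholders, and depending on your nxlog version the om_file output may also need to reopen the file after it has been cycled):

    <Extension _syslog>
        Module  xm_syslog
    </Extension>

    <Extension _fileop>
        Module  xm_fileop
        # hourly check: rotate the archive once it grows past 100 MB, keep 10 generations
        <Schedule>
            Every   1 hour
            Exec    if (file_exists('/var/log/archive/graylog.log') and \
                        file_size('/var/log/archive/graylog.log') >= 100M) \
                        file_cycle('/var/log/archive/graylog.log', 10);
        </Schedule>
    </Extension>

    <Input from_graylog>
        Module  im_udp
        Host    127.0.0.1
        Port    5514
        Exec    parse_syslog();
    </Input>

    <Output archive_file>
        Module  om_file
        File    "/var/log/archive/graylog.log"
    </Output>

    <Route archive>
        Path    from_graylog => archive_file
    </Route>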

There are some caveats (mostly with multiple Graylog instances, which I'm still figuring out), but some advantages are that you can actually import the files again, you can manage the nxlog instance from Graylog (set up an archiving tag), and retention is possible based on time and size of files. I've created an input directory where I can drop the unpacked files; if a file is dropped there, it is read again by the local nxlog instance and sent to the Graylog server. If a structured format such as RFC 5424 is used, logs can be restored using the nxlog module for that log format. It should be possible to do all of this securely, and it's even possible to encrypt the archived log files if required.

Note that if you are using pipelines, check what is triggered first in your Graylog instance, stream handling or the pipeline; you want to put streams first if the logs should be sent unprocessed.
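
For illustration, the re-import leg could be sketched like this (directory, hostname and port are placeholders; it assumes the archived files are syslog-formatted and that a matching syslog TCP input exists on the Graylog side):

    <Extension _syslog>
        Module  xm_syslog
    </Extension>

    <Input restore_dir>
        Module  im_file
        # any file dropped into the restore directory gets read again
        File    "/var/log/archive/restore/*.log"
        Exec    parse_syslog();
    </Input>

    <Output to_graylog>
        Module  om_tcp
        Host    graylog.example.gov
        Port    514
        # re-emit as RFC 5424 so the structure survives the round trip
        Exec    to_syslog_ietf();
    </Output>

    <Route restore>
        Path    restore_dir => to_graylog
    </Route>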