I have a general question about getting a custom log source into Graylog.
Say my log source is a file which I am sending into Graylog using NxLog. One option would be to use the JSON module of NxLog to create JSON logs and send them into a GELF input.
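Roughly, I imagine the shipping side of that option looking something like this (just a sketch; the file path, host, and port are placeholders, I am assuming the xm_gelf extension handles the GELF formatting, and directive names may differ between NxLog versions):

```
# Load the GELF extension so the UDP output can emit GELF instead of plain text
<Extension _gelf>
    Module xm_gelf
</Extension>

# Tail the custom log file (placeholder path)
<Input custom_file>
    Module im_file
    File "/var/log/myapp/app.log"
</Input>

# Ship to the Graylog GELF UDP input (placeholder host/port)
<Output graylog>
    Module om_udp
    Host 192.0.2.10
    Port 12201
    OutputType GELF
</Output>

<Route file_to_graylog>
    Path custom_file => graylog
</Route>
```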
If I took this option, would I avoid having to write an extractor to pull out the fields I want, since JSON is supported?
If I choose not to use the JSON module of NxLog but instead send the raw logs into a GELF input, I am assuming I would need an extractor to create the fields. Am I correct?
You would need, whichever way you choose, to parse the log files, or rather the individual lines you submit. This can be done on the sender side, so that you transfer the messages as key-value pairs and only the key-value extraction is needed in Graylog. Or you can throw the messages into a raw input and extract the information you want on the Graylog side.
How you work depends on where you have more resources available and which approach you know best.
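As a rough illustration of the key-value route: if the sender emits lines like `user=alice action=login status=ok`, a short pipeline rule using Graylog's key_value() function is all the extraction that is needed (the rule name and the sample line are just examples):

```
rule "extract key-value pairs"
when
    has_field("message")
then
    // turn "user=alice action=login status=ok" into user, action and status fields
    set_fields(key_value(to_string($message.message)));
end
```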
If you use a raw text input then you will need to parse the logs yourself. You can set up some kind of extractor on the raw plaintext input, or use the pipeline processor to parse the logs.
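Something along these lines works as a pipeline rule (the grok pattern and field names here are placeholders for whatever your log format actually looks like):

```
rule "parse raw application log"
when
    has_field("message")
then
    // apply a grok pattern to the raw line and copy the named captures into fields
    let parsed = grok(
        pattern: "%{IPORHOST:client_ip} %{WORD:http_method} %{URIPATH:request_path} %{NUMBER:response_code}",
        value: to_string($message.message),
        only_named_captures: true
    );
    set_fields(parsed);
end
```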
I personally choose to use grok pattern extractors on the input about 90% of the time, due to ease of use and versatility.
Yeah, just look at each field and find a grok pattern that will match all possible inputs. You can also use regex to create new grok patterns if none of the standard ones work.
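For instance, given a hypothetical line like `2024-05-01T10:15:00Z WARN ORD-1234 payment gateway timeout`, you could combine the stock patterns with one custom pattern defined from a plain regex (added under System -> Grok Patterns; all names here are made up):

```
# custom pattern built from a regex
ORDER_ID [A-Z]{3}-\d{4}

# pattern used in the extractor, mixing stock and custom patterns
%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{ORDER_ID:order_id} %{GREEDYDATA:detail}
```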