Custom Log Source


(Jake Smith) #1

Dear All,

I have a general question about getting a custom log source into Graylog.

Say my log source is a file which I am sending into Graylog using NxLog. One option would be to use the JSON module of NxLog to create JSON logs and send them into a GELF input.

If I took this option, would I avoid having to write an extractor to get the fields I want out, since JSON is supported?

If I choose not to use the JSON module of NxLog but instead send the raw logs into a GELF input, I am assuming I would need an extractor to create the fields. Am I correct?
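For reference, this is roughly what I have in mind on the NxLog side for the GELF option (the file path, hostname, and port here are just placeholders for my setup):

```
# Load the GELF extension so outputs can speak the GELF format
<Extension gelf>
    Module      xm_gelf
</Extension>

# Tail the custom log file
<Input custom_file>
    Module      im_file
    File        "/var/log/myapp/app.log"
</Input>

# Ship to Graylog's GELF TCP input (default port 12201)
<Output graylog>
    Module      om_tcp
    Host        "graylog.example.com"
    Port        12201
    OutputType  GELF_TCP
</Output>

<Route to_graylog>
    Path        custom_file => graylog
</Route>
```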

Cheers
Jake


(Jan Doberstein) #2

Hey Jake,

whichever way you choose, you would need to parse the log files, or more precisely the lines you submit. This could be done on the sender side, transferring the messages as key-value pairs so that only key-value extraction is needed in Graylog. Or you could throw the messages into a raw input and extract the information you want in Graylog.
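A minimal sketch of the Graylog side of the key-value approach, as a pipeline rule (this assumes the sender has already formatted each line as `key=value` pairs in the message field):

```
rule "extract key-value pairs"
when
    has_field("message")
then
    // key_value() turns "user=jake status=200" into a map of fields
    set_fields(key_value(value: to_string($message.message)));
end
```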

Which way you work depends on where you have more resources available and what you know best.


(Jake Smith) #3

Hi Jan,

If I go for throwing the messages into a raw input, will I need to write an extractor to extract the fields I want?

Is one option to use a pipeline rule with regexes applied to the input to create fields?

Cheers

Jake


(GT) #4

If you use a raw text input, then you will need to parse the logs yourself. You can set up some kind of extractor on the raw plain-text input, or use a pipeline processor to parse the logs.
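As an example of the pipeline route, a rule along these lines would work (the field names and the regex are made up for illustration; adjust them to your actual log format):

```
rule "parse raw log line"
when
    has_field("message")
then
    // capture three whitespace-separated tokens, e.g. "host123 GET /index.html"
    let parts = regex("^(\\S+)\\s+(\\S+)\\s+(\\S+)", to_string($message.message));
    set_field("client_host", parts["0"]);
    set_field("http_method", parts["1"]);
    set_field("request_path", parts["2"]);
end
```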

I personally choose to use grok pattern extractors on the input about 90% of the time, due to their ease of use and versatility.

G


(Jake Smith) #5

G,

Cool, so I can use a grok pattern extractor to create fields.

Cheers

Jake


(GT) #6

Yeah, just look at each field and find a grok pattern that will match all possible inputs. You can also use regex to create new grok patterns if none of the standard ones works.
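For instance, for a web-access-style line you could start from the standard patterns like this (the field names are just examples, pick whatever suits your logs):

```
%{IPORHOST:client_ip} %{WORD:http_method} %{URIPATH:request_path} %{NUMBER:status_code}
```

And if no built-in pattern fits a field, define a custom one from a regex under System -> Grok Patterns, e.g. a hypothetical eight-character hex ID:

```
MYAPP_ID [A-F0-9]{8}
```

which you can then use in extractors as `%{MYAPP_ID:transaction_id}`.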

G


(system) #7

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.