I have deployed Graylog 3 and everything is working fine for UDP syslog messages.
I wanted to enhance the functionality by forwarding FreeIPA logs using the sidecar/Beats approach.
From the documentation it seemed like a straightforward configuration; however, I haven't been able to send any logs to Graylog.
In addition, when I run `graylog-sidecar -debug` I get the following output:
```
[ConfigFile] YAML config parsing failed on /etc/graylog/sidecar/sidecar.yml: yaml: line 100: did not find expected key. Exiting.
```
See the stickied post about asking good questions. But not to be too mean: can you paste your sidecar config (minus any passwords, tokens, etc.) on Pastebin and link it here? Or paste it here, but be sure to format it properly with the </> button in the message editor.
Right now there isn't enough info to answer anything.
I found the issue after working on it for 14 hours… it was the indentation…
Fixed it, and now the IPA logs are forwarded to Graylog correctly.
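For anyone hitting the same error: YAML is whitespace-sensitive, so a single mis-indented key in sidecar.yml is enough to break parsing. A minimal sketch of what the relevant part of the file might look like (the server URL, token, and path here are placeholders, and the exact keys can vary by sidecar version):

```yaml
# /etc/graylog/sidecar/sidecar.yml (sketch; values are placeholders)
server_url: "http://graylog.example.com:9000/api/"
server_api_token: "<sidecar-token>"
node_name: "ipa-server"

# Nested keys and list items must be indented consistently with spaces
# (never tabs); one wrong level yields "did not find expected key".
list_log_files:
  - "/var/log/dirsrv"
```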
Next step: parsing the events. From what I can see there is no ready-made pipeline.
I have successfully forwarded my OpenLDAP logs to Graylog with filebeat.
In order to split them and get more visibility I have tried the pipeline approach.
For the pipeline, a dedicated stream called "ipaAccessLogStream" has been configured.
Messages are routed to it when the word "slapd" appears in the file path of the collected log. This works fine.
Then the pipeline rules are created.
For rule 0, a simple match is required on the field "message" included in the filebeat input.
In the Graylog search I can see the field; however, for some weird reason I cannot get it to match.
After that, a grok filter is applied, which seems fine in simple grok testing.
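For reference, the stage-0 rule looks roughly like this (a sketch reconstructed from the description above; the rule name and grok pattern are placeholders):

```
rule "ipa access log match"
when
    // matches only if the field exists on the message
    has_field("message")
then
    // apply the grok pattern to the raw message and store the captures as fields
    set_fields(grok(pattern: "%{GREEDYDATA:raw}", value: to_string($message.message)));
end
```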
How can I troubleshoot this?
Paste a few sample log entries, along with your pipeline rules (don't forget to format with the </> button), and we can take a look and see what's going on. Or not going on. Or maybe it'll just fix itself.
Go to the System > Grok Patterns page, enter the entire pattern you use there, and name it, so you can just call `grok("%{MYSHINYPATTERN}", ...)` for ease of use.
Also try setting the (if I recall correctly) `only_named_captures` parameter to `true` to avoid picking up random things.
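Put together, the rule would look something like this (a sketch, assuming the pattern has been saved under the name MYSHINYPATTERN; the rule name is illustrative):

```
rule "parse ipa access log"
when
    has_field("message")
then
    set_fields(
        grok(
            pattern: "%{MYSHINYPATTERN}",
            value: to_string($message.message),
            // keep only named captures like %{INT:pid}, drop anonymous ones
            only_named_captures: true
        )
    );
end
```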
For the grok pattern, I do believe certain things need to be escaped if used in a function; unsure whether that is also needed when it is stored as a grok pattern (via System > Grok Patterns), but… when in doubt, escape all the things anyway.
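As an illustration of the escaping (the pattern here is a made-up example): inside a pipeline rule string, a backslash has to be doubled, so a grok pattern like `\[%{INT:pid}\]` would be written as:

```
// "\\[" in the rule string reaches grok as "\[", i.e. a literal bracket
set_fields(grok(pattern: "\\[%{INT:pid}\\]", value: to_string($message.message)));
```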
You can also try re-creating the pattern through the grok pattern editor (System > Grok Patterns, the big "Create Pattern" button at the top right) and see if that lets you get a working one assembled.
We did the grok pattern approach but read that it is better to go with pipelines, so as to allow per-stream processing.
Just for your info: when I simulate the pipeline with a raw log and select the Beats input and the stream, the processing is returned as OK.
The only issue I have found is that, most likely, the condition `has_field("message")` is the problem.
Anyway, I'll try playing around with more match conditions and see if that fixes things; if not, I will most likely revert to simple grok parsing instead of pipelines.
No, no, what I meant is that instead of defining the grok pattern in a pipeline rule, you create it in the grok editor and save it, so you can refer to it more easily.
Pipelines are wonderful things; we use about 70 of them in our setup.
If you want to debug the pipeline rule, you can always just use `true` as the condition instead of `has_field()`; then it will trigger on everything entering that particular stream. Also make sure the pipeline is in fact connected to the stream you expect these events to come in on.
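A minimal debug version of the rule could look like this (a sketch; `debug()` writes its argument to the Graylog server log, so you can confirm the rule is firing at all):

```
rule "debug ipa access log"
when
    // fire on every message that enters the connected stream
    true
then
    debug(to_string($message.message));
end
```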