Hello,
I have set up Graylog to process different logs from the same machine using Filebeat (via the sidecar) and Graylog pipelines. However, the pipelines don’t seem to extract anything with the grok patterns I set up in them; the full, unparsed messages come through instead.
My configs / rules look as follows:
Sidecar-Filebeat:
# Needed for Graylog
fields_under_root: true
fields.collector_node_id: ${sidecar.nodeName}
fields.gl2_source_collector: ${sidecar.nodeId}

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - "PATH/TO/GLASSFISH/LOG"
    fields.log_type: "glassfish"
  - type: log
    enabled: true
    paths:
      - "PATH/TO/JETTY/LOG"
    fields.log_type: "jetty"

output.logstash:
  hosts: ["localhost:5044"]

path:
  data: /var/lib/graylog-sidecar/collectors/filebeat/data
  logs: /var/lib/graylog-sidecar/collectors/filebeat/log
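The intent of fields.log_type above is that each pipeline rule should only act on messages from its own log. A sketch of a when-clause that does this (the rule name is made up; has_field and to_string are built-in pipeline functions):

```
rule "only jetty messages"
when
  // log_type is set by the Filebeat config above (fields_under_root: true)
  has_field("log_type") && to_string($message.log_type) == "jetty"
then
  // mark the message so routing can be verified in the search UI
  set_field("is_jetty", true);
end
```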
And my Jetty pipeline rule (the structure is the same for all pipelines, obviously with different keywords and patterns):
rule "Divide and jetty"
when
  true
then
  let message_field = to_string($message.message);
  let parsed_fields = grok(pattern: "%{TIMESTAMP_ISO8601}|%{LOGLEVEL} ?|%{GREEDYDATA:Handler}| - %{TIMESTAMP_ISO8601:time} %{LOGLEVEL:level} ?# %{NUMBER:msg_nr}: (?<thread>[%{NUMBER}] (%{NUMBER})) %{GREEDYDATA:msg}", value: message_field, only_named_captures: true);
  set_fields(parsed_fields);
end
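To narrow down whether the grok call matches at all, I could imagine a stripped-down debug variant of the rule like the following (debug() is a built-in pipeline function that writes to the Graylog server log; the deliberately minimal pattern is just for testing, not my real one):

```
rule "Debug jetty grok"
when
  true
then
  let message_field = to_string($message.message);
  // show the raw message as the pipeline sees it
  debug(message_field);
  // try a trivially simple pattern first to confirm grok extraction works at all
  let parsed = grok(pattern: "%{TIMESTAMP_ISO8601:time}", value: message_field, only_named_captures: true);
  debug(parsed);
  set_fields(parsed);
end
```

Would something like this be the right way to debug it, or is there a better approach?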
This pipeline is connected to my “All messages” stream, and according to Graylog there is throughput on it, but no fields are extracted.