Just getting started migrating our on-prem Graylog (Open) instance to the cloud and have discovered that I cannot use all my lovingly-crafted extractors. Apparently I now have to figure out how to manage the same task with a pipeline.
So let’s start with issue #1: I previously used a split-and-index extractor to identify the correct source for the logs from our ASAs, because this is what I currently get:
I would try using a GROK pattern within the pipeline rule. Here’s an example I use for messages from a BIND9 instance:
rule "Bind Split Message for Query Type"
when
  has_field("QueryData") and contains(to_string($message.QueryData), " IN ")
then
  set_fields(
    fields:
      grok(
        pattern: "query: %{DATA:Query2} %{DATA:QueryDirection} %{DATA:QueryType} %{DATA:ResponseCode} \\(%{IPV4:NatDstIP}\\)",
        value: to_string($message.QueryData)
      )
  );
end
In your case you could look for “Rhodes_ASA” (or maybe just “_ASA”) and if found, parse the message with a GROK to meet your needs.
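As a rough sketch (untested — the "_ASA" check and the GROK pattern are placeholders you would adapt to your actual message format), it might look something like:

rule "ASA Identify Source"
when
  has_field("message") AND contains(to_string($message.message), "_ASA")
then
  // Pull named captures out of the raw message; %{WORD:source} grabs the
  // hostname token that follows the Cisco-style timestamp.
  let matches = grok(
    pattern: "%{CISCOTIMESTAMP} %{WORD:source}",
    value: to_string($message.message),
    only_named_captures: true
  );
  set_fields(matches);
end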
This is what I ended up using (seems to do the job):
rule "ASA Pipeline Extract Source"
when
  has_field("facility")
  AND (contains(to_string($message.facility), "local3") || contains(to_string($message.facility), "local1"))
then
  let asa_src = grok(
    pattern: "%{CISCOTIMESTAMP} %{WORD:source}",
    value: to_string($message.message),
    only_named_captures: true
  );
  set_field("source", asa_src.source);
end
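If you want to sanity-check the capture before loading the rule, the GROK pattern can be approximated as a plain regex and tested offline. A minimal sketch in Python (the CISCOTIMESTAMP regex here is a simplified approximation of the stock grok pattern, and the sample message is a hypothetical ASA syslog line):

```python
import re

# Rough regex equivalent of "%{CISCOTIMESTAMP} %{WORD:source}":
# month abbreviation, day, optional year, HH:MM:SS, then the source word.
CISCO_TS = r"[A-Z][a-z]{2} +\d{1,2}(?: \d{4})? \d{2}:\d{2}:\d{2}"
PATTERN = re.compile(rf"^(?:{CISCO_TS}) (?P<source>\w+)")

def extract_source(message: str):
    """Return the 'source' capture, or None if the message doesn't match."""
    m = PATTERN.match(message)
    return m.group("source") if m else None

# Hypothetical ASA-style payload with the hostname after the timestamp:
print(extract_source("Jul 14 13:07:21 Rhodes_ASA %ASA-6-302013: Built outbound TCP connection"))
```

This only checks the shape of the match; Graylog's actual CISCOTIMESTAMP pattern is more permissive, so treat a pass here as a hint, not proof.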