Cisco Meraki Logging field conversion

I’m trying to refine the incoming syslogs from 30 Meraki devices. Key=Value extraction works for most of the message, but I’m trying to convert some data into columns where the value changes depending on which Meraki is sending the data.
Snip:

<134>1 1634220216.200448655 MERAKI1 flows src=0.0.0.0 dst=224.0.0.251 protocol=udp sport=420 dport=420 pattern: allow all

For the above, I can K=V the SRC, DST, protocol, sport, and dport. But I can’t seem to find a way to turn “MERAKI1”, “flows”, and “pattern: allow all” into fields that would contain these values (all of which can change).
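For reference, a GROK pattern along these lines could pick up those positional pieces from the sample line (the field names are just illustrative, and the trailing `pattern:` part assumes a flows-style line like the one above):

```
<%{NONNEGINT:syslog_pri}>%{NONNEGINT:syslog_version} %{NUMBER:meraki_epoch} %{HOSTNAME:meraki_device} %{WORD:meraki_log_type} %{GREEDYDATA:meraki_kv_data} pattern: %{GREEDYDATA:meraki_pattern}
```

`meraki_kv_data` would then hold the `src=… dport=…` portion, which can be fed to a K=V extractor separately.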

One solution might be to use a pipeline rule to convert all spaces into commas and then turn each csv into a generic column, but I have no idea how to do most of that.

Ref: running Graylog 4.2 / Elasticsearch 7.10.2 on the latest Ubuntu; the input is Raw UDP.

A pipeline rule would work - you could snip out the non K=V parts with GROK/regex and then K=V the leftovers.

I’ve tried using a Regex Replacement extractor to convert \s into , and then a CSV-to-Column converter in the hope that it would break all of these items into their own fields, but so far it doesn’t work. I like the idea of using pipeline rules to format the message data, but I don’t know how, as I can’t find any solutions that tackle the pesky whitespace. I wish Meraki were a bit more diligent with their K=V fields (or at least provided options for syslog output format: CEF, JSON, etc.). But alas.

GROK handles spaces pretty easily. Use an online GROK debugger to test/create an extractor, or use GROK in a pipeline. Use GREEDYDATA to skip over the K=V part and handle it separately (before/after). There is also a solution in the Marketplace from this guy … but it is older, so you would likely have to pull it apart to get anything useful from it.
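A rough sketch of that approach as a pipeline rule, using Graylog's built-in grok() function (the pattern and field names here are my own, not from the Marketplace package):

```
rule "meraki grok"
when
  has_field("message")
then
  // Capture device name and log type; leave the K=V tail
  // in meraki_rest for a separate key_value() stage.
  set_fields(
    grok(
      pattern: "<%{NONNEGINT:pri}>%{NONNEGINT:ver} %{NUMBER:ts} %{HOSTNAME:meraki_device} %{WORD:meraki_log_type} %{GREEDYDATA:meraki_rest}",
      value: to_string($message.message),
      only_named_captures: true
    )
  );
end
```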

I like working in pipelines, so I can’t help much with extractors. With a pipeline I can use the initial stage to figure out what type of log is coming in, flag it, then use subsequent stages/rules that take actions based off of how the message was flagged.
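A minimal sketch of that flag-then-act idea (the stage layout and the " flows " check are assumptions on my part):

```
// Stage 0: classify the incoming message
rule "flag meraki flows"
when
  has_field("message") AND contains(to_string($message.message), " flows ")
then
  set_field("meraki_log_type", "flows");
end

// Stage 1: act only on messages flagged in the previous stage
rule "parse meraki flows"
when
  to_string($message.meraki_log_type) == "flows"
then
  // run grok()/key_value() here for flows-format messages
  set_field("meraki_parsed", true);
end
```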


Thank you for pointing me in the right direction. If I work out a solution from your links, I’ll share it here to help others with a similar issue.

I was not able to adapt any of the existing methods, so I created grok patterns which get the job done. I have no idea how to integrate these into pipeline rules, but using them as Extractors seems to work better than anything else (certainly better than waiting for Meraki to actually adhere to RFC 5424).

  • MERAKI_BLOCKED_CATEGORIES (blocked_categories)=([ 0-9_A-Za-z-]*)
  • MERAKI_CATEGORY (category0)='([ A-Za-z-]*)'
  • MERAKI_DECISION decision=\w*
  • MERAKI_EVENT events %{MERAKI_NAME}
  • MERAKI_NAME (?:[a-zA-Z]+[a-zA-Z0-9]*)
  • Meraki_Result (?:[a-zA-Z]+: [0-9a-zA-Z].*)
  • MERAKI_SECURITY_MESSAGE message: [a-zA-Z0-9 -]*
  • MERAKI_SPI [a-zA-Z0-9():]+

I also used this generic K=V parser pipeline rule.

///////////////

rule "key_value_parser"
when
  has_field("message")
then
  set_fields(
    fields:
      key_value(
        value: to_string($message.message),
        trim_value_chars: "'/.",
        trim_key_chars: "",
        delimiters: " ",
        kv_delimiters: "="
      )
  );
end

////////////////
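For the sample flows message above, this rule would set roughly the following fields (a sketch; exact values depend on the trim_* characters, and tokens without an "=" such as "pattern:", "allow", and "all" are simply skipped by key_value()):

```
src      = 0.0.0.0
dst      = 224.0.0.251
protocol = udp
sport    = 420
dport    = 420
```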
