Pipeline with multiple rule configuration

I have a log file that can contain 40 different types of entries. A few of those types account for most of the volume. A single log line will never contain more than one type. The selection criteria is only part of the line, so the match needs to check whether the line contains the criteria rather than equals it exactly.

What would be the best approach to use in this situation? Is there something like a CASE statement?

Can you give some examples? 40 different types in a single file will make it pretty complex.

What’s the expected outcome? Do you want these messages routed to different streams, or a single stream (and therefore a single index)?

How are you planning on picking up the files? Filebeat I presume?

I personally would use the processing pipeline and build rules for all the possible combinations - if you can easily identify what kind of parsing you need.
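
As far as I know there is no CASE/switch construct in the pipeline rule language, but each rule carries its own `when` condition, so a set of rules behaves like one. A minimal sketch of one such rule, assuming the raw line ends up in the `message` field (the phrase and the `alert_type` field name are just placeholders):

```
rule "example error condition"
when
    // substring match - the line only needs to contain the phrase, not equal it
    contains(to_string($message.message), "some error phrase")
then
    set_field("alert_type", "example_condition");
end
```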

Here are some of the conditions:

  • Error while trying to get directory listing
  • Broken pipe
  • remote side probably shut down
  • Connection reset by peer
  • read returned error 34
  • No routes defined
  • Closing connection
  • unknown segment

File pickup is done by Filebeat.

The outcome is that different conditions will trigger different alerting steps.

That will be a complex set of processing pipelines - but it is possible.
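
Once each condition rule sets its own value (like the `alert_type` field in the sketch above), a rule in a later stage can route on that value so every condition lands in its own stream with its own alert definition. A rough sketch, assuming streams named after each alert class already exist (the `alerts_` prefix is just an example):

```
rule "route classified errors"
when
    has_field("alert_type")
then
    // e.g. alert_type "broken_pipe" -> stream "alerts_broken_pipe" (hypothetical naming)
    route_to_stream(name: concat("alerts_", to_string($message.alert_type)));
end
```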

If you can parse (grok/regex) the condition word(s) out into fields, you could then compare the result against a lookup table that returns a new field naming the alert/stream you want to route to… That way you'd have fewer pipeline rules and would instead be hitting a cached lookup table. It might be more efficient to build it that way, but I'm not sure whether it is more efficient for Graylog's processing…
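
A rough sketch of that approach, with one extraction rule instead of one rule per condition (only a few of the listed conditions are shown in the pattern). The table name `error_conditions`, the field names, and the fallback value are all placeholders - the lookup table itself (data adapter + cache) would still have to be set up in Graylog and filled with condition-phrase to stream-name mappings:

```
rule "classify error via lookup"
when
    // one pattern capturing whichever condition phrase the line contains
    regex("(Broken pipe|Connection reset by peer|No routes defined|unknown segment)", to_string($message.message)).matches == true
then
    let m = regex("(Broken pipe|Connection reset by peer|No routes defined|unknown segment)", to_string($message.message));
    // "error_conditions" maps each phrase to the stream/alert name it should go to
    let target = lookup_value("error_conditions", m["0"], "unmatched_errors");
    set_field("error_condition", m["0"]);
    set_field("alert_stream", target);
    route_to_stream(name: to_string(target));
end
```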
