Rules for routing messages to different indexes

1. Describe your incident:

Hello everyone.
I am getting started with Graylog and would like some advice on how to correctly route incoming messages.

I have several Sophos XG and UTM firewalls. As their logs are different, I have two Inputs, one for each type of firewall, and in each of them I have configured the extractors I need.
I have also configured several Streams that move messages from the “Default index” to a specific index.

Each Stream:
Title: Stream1
Index Set: Index1
Remove matches from ‘Default Stream’: true

Stream rules:
A message must match all of the following rules:
source must match exactly X.X.X.X.X

Some of these Streams have outputs configured.

So far everything seems to be fine.

Now I would need the messages received from a source to be stored in a different index depending on the value of the type_event field extracted in the extractors.

The summary would be

If source = X.X.X.X.X and the value of the Type_Event field is one of a list (type1, type2, …), then store the message in Index2.
The rest of the messages with source = X.X.X.X.X should be stored in Index1.

The problem is that I don’t know how to apply the conditions “Type_Event is one of the list (type1, type2, …)” and “source must match exactly X.X.X.X.X” in the same stream.

I think I should use pipelines but I don’t know how to do it.
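For context, both conditions can live in a single pipeline rule. A minimal sketch, where the stream name, the field name, and the type values are assumptions taken from the description above:

```
rule "Route special type_event values to Index2"
when
  // Source IP redacted; list membership expressed as a regex alternation
  has_field("source") && to_string($message.source) == "X.X.X.X.X" &&
  has_field("type_event") &&
  regex("^(type1|type2)$", to_string($message.type_event)).matches
then
  // "Stream2" is assumed to have Index2 as its index set
  route_to_stream(name: "Stream2", remove_from_default: true);
end
```

Messages from the same source that do not match the type list simply stay in the stream that writes to Index1.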

I have a single node running Graylog Open 5.2.3.

I would appreciate any help or advice.

2. Describe your environment:

  • OS Information: Ubuntu 22.04.3 LTS

  • Package Version: 5.2.3+9aee303

3. What steps have you already taken to try and solve the problem?
I have tried various combinations of rules in the streams and using pipelines but they have not worked well.

4. How can the community help?
I hope to receive some indication or example with which I can implement the rule I need.

How many options will type_event have, and are you wanting a stream for each one, or how do the values get mapped to the streams?

Hello everyone.

Thank you Joel for your interest.

I will try to provide more details.

I have 20 different streams in total. All of them, except for the third one shown below, look like these:

Stream_01:
    Index Set: Index_LT_01
    Remove matches from ‘Default Stream’: True
    Source must match exactly

Stream_02:
    Index Set: Index_LT_02
    Remove matches from ‘Default Stream’: True
    Source must match exactly

Stream_03: (it does not have any associated rule, nor any output)
    Index Set: Index_ST
    Remove matches from ‘Default Stream’: True

I have a pipeline associated with each stream. As an example, here are the ones associated with Stream_01/02.

Pipeline Stream_01
  Pipeline connections: Stream_01
  Stage 0: 
    Continue processing on next stage when: All rules on this stage match the message.
  Rules: (some rules like this:)
	rule "Rule 01.00"
	when
	  has_field("source") && to_string($message.source) == "" &&
	  has_field("action") && to_string($message.action) == "Drop"
	then
	  route_to_stream(name: "Stream_03", remove_from_default: true);
	  remove_from_stream(name: "Stream_01");
	end

Pipeline Stream_02
  Pipeline connections: Stream_02
  Stage 0: 
    Continue processing on next stage when: All rules on this stage match the message.
  Rules: (some rules like this:)
	rule "Rule 02.01"
	when
	  has_field("source") && to_string($message.source) == "" &&
	  has_field("log_type") &&
	  regex("^(Anti-Spam|Content Filtering|SD-WAN|System Health)$", to_string($message.log_type)).matches
	then
	  route_to_stream(name: "Stream_03", remove_from_default: true);
	  remove_from_stream(name: "Stream_02");
	end

I have 4 or 5 pipeline rules similar to those shown in each pipeline.
Everything works correctly, except that the messages I remove from Stream_01/02 and add to Stream_03 no longer reach the Output_01/02 destinations.

I need the messages that enter Stream_01 and Stream_02 (via the rules associated with each stream) to reach the destinations set in the outputs associated with each stream, and only after that to be removed from the Index_LT_01 or Index_LT_02 indexes and moved to the Index_ST index.
It would be even better if the messages could be saved directly in Index_ST without ever being written to Index_LT_01 or Index_LT_02.
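For what it’s worth, one way to keep the outputs firing is to drop the `remove_from_stream` call: the message then remains in Stream_01 (so Output_01 still receives it) while also being routed to Stream_03. Since each stream writes to its own index set, the trade-off is duplicate storage in both Index_LT_01 and Index_ST. A sketch under that assumption:

```
rule "Rule 01.00 (keep in Stream_01 for outputs)"
when
  has_field("source") && to_string($message.source) == "" &&
  has_field("action") && to_string($message.action) == "Drop"
then
  // No remove_from_stream: the message stays in Stream_01, so
  // Output_01 still receives it; it is also written to Index_ST
  // via Stream_03 (at the cost of duplicate storage).
  route_to_stream(name: "Stream_03", remove_from_default: true);
end
```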

I hope I have been more clear and concrete with what I need to solve.
Thank you very much in advance for your interest.

Are you using extractors to parse out the fields?

It looks like you also have stream rules in the streams’ properties to route data into them.

And then you also have pipeline rules you posted.

Is that correct?

Hi, Joel

Yes, I’m using extractors to parse out the fields necessary in the pipeline rules. The extractors are in the inputs.

In the streams there is a rule that filters by source (for example: “Source must match exactly”) in order to route the events to the appropriate outputs.

Each rule can have multiple sources, but the same IP does not appear in rules of different streams.

The pipelines contain the rules that store messages in Index_ST via Stream_03.

I have no problem doing it another way, as long as the messages can be routed to the appropriate outputs according to their source IP and, depending on the values of the message fields, stored in one index or another.

Messages coming from one IP should be routed to Output_01 and stored in the index of Stream_01 or Stream_03 according to their type.

Messages originating from another IP should be routed to Output_01 and Output_02 and stored in the index of Stream_02 or Stream_03 according to their type.
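If restructuring is acceptable, one alternative is a single pipeline connected to the default stream that branches on both source and type, so each message is routed exactly once and never has to be removed from an index afterwards. A sketch, with the IP, type values, and stream names all assumed:

```
rule "Source A, special types -> Stream_03"
when
  to_string($message.source) == "X.X.X.X.X" &&
  has_field("type_event") &&
  regex("^(type1|type2)$", to_string($message.type_event)).matches
then
  route_to_stream(name: "Stream_03", remove_from_default: true);
end

rule "Source A, remaining types -> Stream_01"
when
  to_string($message.source) == "X.X.X.X.X" &&
  ! (has_field("type_event") &&
     regex("^(type1|type2)$", to_string($message.type_event)).matches)
then
  route_to_stream(name: "Stream_01", remove_from_default: true);
end
```

One caveat: since outputs are attached to streams, messages routed straight to Stream_03 would only reach Output_01/02 if those outputs are also attached to Stream_03.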



Is it not possible to manage the indexes in which the messages are stored, and the outputs to which they are sent, without using streams?


No; streams are built for this exact purpose and are the only option. Every stream is attached to an index set, so the stream you choose determines the index. Outputs are also attached to streams and will output all messages sent to that stream.

If a message is routed to two different streams for any reason, is a copy stored in each of the indexes associated with the streams?

It can be, yes, and if you run a search without limiting it to a specific stream you will get two sets of messages returned in the results (i.e., duplicates).

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.