Pipeline processing continues after messages are dropped in a previous stage of the pipeline

Hi All,

We are using Graylog 3.1.1, Elasticsearch 6.8.1, and MongoDB 3.6.13 in our environment. I have multiple stages in my Graylog pipeline rules.

First stage

In stage 1, I get an id from a lookup table and assign the value to the field "lookupValue":

rule "lookupValue"
when
    to_string($message.logType) == "filebeat"
then
    let host = to_string($message.hostname);
    let id = lookup_value("lookupfilebeat", host);
    set_field("lookupValue", id);
end

Second stage

In stage 2, if my lookupValue equals the Value from the current message, I drop the entire message:

rule "drop_lookup_Value"
when
    to_string($message.Value) == to_string($message.lookup_Value)
then
    drop_message();
end

Third stage

In stage 3, if my Value field and lookup_Value are not the same, I create a new message and route it to a different stream. The problem is that although messages are dropped in stage 2, the pipeline still processes those dropped messages in stage 3, and new messages get pushed to the testbeat stream. Ideally, a message should not pass through the next stage if it was dropped in a previous stage.

Please correct me if I am doing anything wrong in the pipeline rules, and let me know your thoughts.

rule "route-to-stream"
when
    to_string($message.Value) != to_string($message.lookup_Value)
then
    let msg = create_message("", "");
    let hostname = to_string($message.hostname);
    let LABEL = "HOST";
    set_field("LABEL", LABEL, "", "", msg);
    set_field("id", hostname, "", "", msg);
    route_to_stream("testbeat", "", msg);
end

Ganeshbabu R

Just for reference:

This is a known bug with a GitHub issue: https://github.com/Graylog2/graylog2-server/issues/4855
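Until that issue is resolved, a common workaround is to not rely on drop_message() alone: set a marker field before dropping, and exclude marked messages in the conditions of later stages. This is only a sketch; the marker field name "to_drop" is an assumption, not something from the original rules:

rule "drop_lookup_Value"
when
    to_string($message.Value) == to_string($message.lookup_Value)
then
    // mark the message so later stages can recognize it ("to_drop" is an assumed name)
    set_field("to_drop", true);
    drop_message();
end

rule "route-to-stream"
when
    to_string($message.Value) != to_string($message.lookup_Value)
    AND NOT has_field("to_drop")
then
    let msg = create_message("", "");
    let hostname = to_string($message.hostname);
    set_field("LABEL", "HOST", "", "", msg);
    set_field("id", hostname, "", "", msg);
    route_to_stream("testbeat", "", msg);
end

With the extra has_field guard, a message marked in stage 2 can no longer satisfy the stage-3 condition even if the pipeline continues to evaluate it.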
