Pipeline processing continues after messages are dropped in a previous pipeline stage

Hi All,

We are using Graylog 3.1.1, Elasticsearch 6.8.1, and MongoDB 3.6.13 in our environment. I have multiple stages in my Graylog pipeline:

First stage

In stage 1 I look up an id from a lookup table and assign the value to the field "lookupValue":

rule "lookupValue"
when
    to_string($message.logType) == "filebeat"
then
  let host = to_string($message.hostname);
  let id = lookup_value("lookupfilebeat", host);
  set_field("lookupValue", id);
end
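For context, the rule above assumes a lookup table named "lookupfilebeat" keyed by hostname. A hypothetical CSV data adapter backing it might look like this (the hostnames and ids below are made up purely for illustration):

```csv
"hostname","id"
"web-01","1001"
"app-02","1002"
```

With such a table, `lookup_value("lookupfilebeat", "web-01")` would return "1001", and the rule would set lookupValue to that id.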

Second stage

In stage 2, if lookupValue is equal to the Value field of the current message, I drop the entire message:

rule "drop_lookup_Value"
when
    to_string($message.Value) == to_string($message.lookupValue)
then
    drop_message();
end

Third stage

In stage 3, if the Value field and lookupValue are not the same, I create a new message and route it to a different stream. The problem: although messages are dropped in stage 2, the pipeline still processes those dropped messages in stage 3, and new messages get pushed to the testbeat stream. Ideally, a message dropped in a previous stage should not pass through the subsequent stages.

Please correct me if I am doing anything wrong in the pipeline rules, and let me know your thoughts.

rule "route-to-stream"
when
    to_string($message.Value) != to_string($message.lookupValue)
then
    let msg = create_message("", "");
    let hostname = to_string($message.hostname);
    let LABEL = "HOST";
    set_field("LABEL", LABEL, "", "", msg);
    set_field("id", hostname, "", "", msg);
    route_to_stream("testbeat", "", msg);
end

Thanks,
Ganeshbabu R