Could someone tell me whether it is expected behaviour for a message that has been routed to a different stream and removed from the current one to still be processed by later stages of the pipeline attached to the original stream?
I was trying to reduce noise by having an early stage match certain keywords and remove those messages, then having later stages use regex to break the remaining messages down into fields. It's clear, though, that the later stage is still running on messages that were removed in the first stage.
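For reference, the two stages are roughly this shape (a simplified sketch; the rule names, the "debug" stream name, the keyword, and the regex are placeholders, not my real rules):

```
rule "stage 0: route noise to debug"
when
  // placeholder keyword match
  contains(to_string($message.message), "NOISY_KEYWORD")
then
  // send to a short-retention debug stream and remove from the current one
  route_to_stream(name: "debug");
  remove_from_stream();
end

rule "stage 1: extract fields"
when
  has_field("message")
then
  // placeholder regex: break the message into fields
  let m = regex("^(\\S+)\\s+(\\S+)", to_string($message.message));
  set_field("host", m["0"]);
  set_field("service", m["1"]);
end
```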
I'm not currently dropping the messages outright; instead I'm routing them to a 'debug' stream/index with a one-hour retention period, just so I can watch for messages being removed in error. If a message were dropped, presumably it wouldn't be processed any further, but is it normal for it to keep running through all stages of the current pipeline in this scenario?
If so, I might turn my pipelines around: do field extraction first, then pattern match on a shorter string to look for keywords to drop…
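In other words, something like this reversed arrangement (again just a sketch with placeholder names, assuming the keyword reliably lands in one extracted field):

```
rule "stage 0: extract fields first"
when
  has_field("message")
then
  // placeholder regex: pull out the short field to match on later
  let m = regex("^(\\S+)\\s+(\\S+)", to_string($message.message));
  set_field("host", m["0"]);
  set_field("service", m["1"]);
end

rule "stage 1: match keyword on the shorter extracted field"
when
  contains(to_string($message.service), "NOISY_KEYWORD")
then
  route_to_stream(name: "debug");
  remove_from_stream();
end
```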