Question about creation and testing of pipeline rules?


Describe your incident:
We have several GELF inputs that receive log messages from various servers in our network. Those GELF inputs also have extractors attached to them (visible via the "Manage extractors" button), and in the message stream we can see the extractors doing their job by parsing the applicable information out of the various fields. I want to create pipeline rules to replace our extractors, but I don't want my testing to disrupt our current processing setup. Is it possible to test by doing the following:

  1. Creating a new GELF input and a new stream, excluding the other GELF inputs from this stream, and connecting the new stream and GELF input to the pipeline I will create, so that only the new stream is processed by the pipeline rules?
  2. Installing an IDS tool like Suricata on a VM, then installing Fluentd and configuring it to send log messages to the new GELF input?
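For reference, a pipeline rule that replicates a typical regex extractor might look like the sketch below. The field name (`alert_severity`) and the pattern are hypothetical placeholders; a real rule would use whatever your extractor currently parses. As long as the pipeline is connected only to the new stream, it should not touch messages arriving on the existing inputs.

```
rule "extract severity from message text"
when
    has_field("message")
then
    // hypothetical pattern; substitute the regex your extractor uses today
    let matched = regex("severity=(\\d+)", to_string($message.message));
    set_field("alert_severity", matched["0"]);
end
```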

Describe your environment:

Graylog server OS: CentOS Linux 7 (Core)
Graylog version: v4.2.6
Kernel: Linux 3.10.0-1160.53.1.el7.x86_64

Sounds reasonable. If your pipeline takes a lot of processing power, that could affect the throughput of other streams.
The test messages will end up in OS along with the real messages, though. For that reason I'd prefer doing it on a separate test system.