In attempting to write pipeline rules to properly break up and label all these values, the issue I’m running into is that there doesn’t seem to be any functionality that lets me loop through all of the ingested lines to apply these rules. Using a lookup table to temporarily store and process the data won’t work for me either, as the CSV I’m ingesting uses no quote characters and the CSV file data adapter doesn’t seem to accept that.
Is there any way to loop through pipeline rules in the current Graylog version that I’m not seeing? Or am I approaching this the wrong way entirely?
You would send each line of the CSV file to Graylog as its own message (for example with Filebeat), and then process each message with a Grok pattern that matches the columns of your CSV file.
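As a minimal sketch of that approach: suppose each CSV line looks like `2023-05-01T12:00:00,webserver01,200,request ok` (the column layout here is an assumption for illustration, not your actual file). A pipeline rule could then apply a Grok pattern with one capture per column, using Graylog's built-in `grok()` and `set_fields()` functions:

```
rule "parse csv line"
when
  has_field("message")
then
  // Hypothetical columns: timestamp, host, status, detail.
  // Adjust the pattern pieces to match your real CSV layout.
  set_fields(
    grok(
      pattern: "%{TIMESTAMP_ISO8601:timestamp},%{HOSTNAME:host},%{NUMBER:status;int},%{GREEDYDATA:detail}",
      value: to_string($message.message)
    )
  );
end
```

Since the CSV has no quote characters, a plain comma-delimited pattern like this works as long as none of the field values themselves contain commas; if they can, the last `GREEDYDATA` capture should be the only field allowed to contain them.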
My apologies for my ignorance, and hopefully it’s okay that I’m responding a week after the thread was opened. I can’t seem to get a Grok pattern figured out for the CSV I’m ingesting, and I can’t find any samples or anyone else with the same need on the forum, either. I feel like I must be looking at this the wrong way.
I wrote a small Grok pattern as a test to see whether it would parse out the first field of my data, and it does. However, when I try to save the pattern, Graylog kicks back this error: