I have built a GROK extractor for a certain type of log message. It works in the extractor preview, but the extractor still shows 0 hits.
Example:
<166>2024-07-25T06:50:56Zasa : %ASA-6-113010: AAA challenge received for user USERNAME from server SERVER_IP.
GROK_PATTERN:
<%{GREEDYDATA:ID}>%{GREEDYDATA:DATE}T%{GREEDYDATA:TIME}Z%{GREEDYDATA:HOST} : %{GREEDYDATA:ASA_Number}: AAA challenge received for user %{GREEDYDATA:USERNAME} from server SERVER_IP.
The GROK pattern seems fine. I tested it in a pipeline rule and it works with your example.
Are you seeing anything in the Processing and Indexing Failures stream (if you have an enterprise license)?
Do incoming messages differ from your example in any way?
There are of course incoming logs that differ from one another, but I assumed I could simply create multiple extractors for the different log types and that the incoming log would determine which extractor gets applied.
I can’t explain it either, based on the log data and grok pattern you shared.
One way to get to the bottom of it would be to create a pipeline rule with grok-matching instead. Then you can insert debug statements, output intermediate match values as message fields, etc. to figure out where things are going sideways.
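For example, a debugging rule along these lines (an untested sketch; the rule name is arbitrary and the pattern is just copied from your post) writes the raw message and the match result to the Graylog server log and also sets the captures as message fields:

rule "debug ASA AAA grok"
when
  contains(to_string($message.message), "AAA challenge received for user")
then
  // log the raw message so it can be compared with the preview sample
  debug(concat("raw message: ", to_string($message.message)));
  // run the same pattern the extractor uses
  let matches = grok(
    pattern: "<%{GREEDYDATA:ID}>%{GREEDYDATA:DATE}T%{GREEDYDATA:TIME}Z%{GREEDYDATA:HOST} : %{GREEDYDATA:ASA_Number}: AAA challenge received for user %{GREEDYDATA:USERNAME} from server SERVER_IP.",
    value: to_string($message.message)
  );
  // log the match result and expose the intermediate values as message fields
  debug(matches);
  set_fields(matches);
end

The debug() output goes to the Graylog server log, so tail that while messages are coming in.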
OK, I had previously attached the extractor to the input as usual, but that caused problems.
I have now created a pipeline rule instead, and it works fine.
When
if string value in ‘Message’ contains ‘AAA challenge received for user’
Then
Match grok pattern ‘<%{GREEDYDATA:ID}>%{GREEDYDATA:DATE}T%{GREEDYDATA:TIME}Z%{GREEDYDATA:HOST} : %{GREEDYDATA:ASA_Number}: AAA challenge received for user %{GREEDYDATA:USERNAME} from server SERVER_IP.’ to field ‘message’ and set fields for matches
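For reference, the source form of that rule should be roughly equivalent to the following (untested as written here; the rule name is arbitrary, and only_named_captures keeps just the named captures in the result):

rule "ASA AAA challenge received"
when
  contains(to_string($message.message), "AAA challenge received for user")
then
  // run the grok pattern against the message and set the captures as fields
  set_fields(
    grok(
      pattern: "<%{GREEDYDATA:ID}>%{GREEDYDATA:DATE}T%{GREEDYDATA:TIME}Z%{GREEDYDATA:HOST} : %{GREEDYDATA:ASA_Number}: AAA challenge received for user %{GREEDYDATA:USERNAME} from server SERVER_IP.",
      value: to_string($message.message),
      only_named_captures: true
    )
  );
end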