GROK Pattern works in the extractor preview, but the logs are not processed

Hello,

I have built a GROK extractor for a certain type of logs, which also works in the extractor preview, but the extractor still has 0 hits.

Example:

<166>2024-07-25T06:50:56Zasa : %ASA-6-113010: AAA challenge received for user USERNAME from server SERVER_IP.

GROK_PATTERN:

<%{GREEDYDATA:ID}>%{GREEDYDATA:DATE}T%{GREEDYDATA:TIME}Z%{GREEDYDATA:HOST} : %{GREEDYDATA:ASA_Number}: AAA challenge received for user %{GREEDYDATA:USERNAME} from server SERVER_IP.

extractor preview:

"ID": 166,
"DATE": "2024-07-25",
"TIME": "06:50:56",
"HOST": "asa",
"ASA_Number": "%ASA-6-113010",
"USERNAME": "USERNAME"

This is probably not the best example, but the point is that it works.
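As an aside, a pattern built entirely from `GREEDYDATA` can behave unpredictably, since every field matches "anything". A stricter sketch using more specific Grok primitives might look like this (field names are kept from the post where possible; `timestamp` replaces the separate `DATE`/`TIME` fields, and the literal `SERVER_IP` in the sample is assumed to be an anonymized hostname or IP):

```
<%{INT:ID}>%{TIMESTAMP_ISO8601:timestamp}%{DATA:HOST} : %{DATA:ASA_Number}: AAA challenge received for user %{USERNAME:USERNAME} from server %{GREEDYDATA:SERVER_IP}\.
```

Note the escaped trailing `\.` — in the original pattern the unescaped `.` is a regex wildcard, which still matches but is easy to misread.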

To test, I even searched for that exact log in the search bar, and it was not processed.

Then I waited to see if the next logs that came in would be processed and unfortunately that wasn’t the case either.

I thought that maybe my GROK pattern was not working, but every time I check different logs with the extractor, they work in the extractor preview.

Maybe someone already had the same problem and could help me.

We are using a single Graylog node (6.0.4) on an Ubuntu 22.04.4 LTS system.

Best regards

The GROK pattern seems fine. I tested it in a pipeline rule and it works with your example.

Are you seeing anything in the Processing and Indexing Failures stream (if you have an enterprise license)?
Do incoming messages differ from your example in any way?

Hello Patrick,

Thank you for your reply.

We do not have an enterprise license.

There are of course incoming logs that differ from each other, but I assumed I could simply create multiple extractors for the different log types, and the log itself would determine which extractor is applied.

Or am I misunderstanding this?

Best regards

The extractor is associated with an input. It tries to process all messages that are received by that input.

Wouldn't each message on the input have to be checked by each extractor, to see which extractor it matches?

If there are multiple extractors for an input, then each will be run against every message received on that input.

Is it a problem?

If so, how can I solve it?

Not a problem. Just mentioned that for clarification.

Thank you very much.

However, I still do not understand why the logs are not output in the correct format.

I don’t either, based on the log data and grok pattern you shared.
One way to get to the bottom of it would be to create a pipeline rule with grok-matching instead. Then you can insert debug statements, output intermediate match values as message fields, etc. to figure out where things are going sideways.
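Such a debugging rule could look roughly like this (a sketch, not a definitive implementation: the rule name is made up, the pattern is copied verbatim from the post above, and `debug()` writes its output to the Graylog server log):

```
rule "debug ASA AAA challenge grok"
when
  contains(to_string($message.message), "AAA challenge received for user")
then
  let matches = grok(
    pattern: "<%{GREEDYDATA:ID}>%{GREEDYDATA:DATE}T%{GREEDYDATA:TIME}Z%{GREEDYDATA:HOST} : %{GREEDYDATA:ASA_Number}: AAA challenge received for user %{GREEDYDATA:USERNAME} from server SERVER_IP.",
    value: to_string($message.message),
    only_named_captures: true
  );
  // Inspect the raw match result in the server log (e.g. /var/log/graylog-server/server.log)
  debug(concat("grok matches: ", to_string(matches)));
  set_fields(matches);
end
```

Tail the server log while messages come in to see whether the grok match produces the fields you expect.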

I can't explain it: I really didn't change anything, but now all the extractors I created are working.

However, the problem persists when I create new extractors.

I could of course stick with this tactic and not change anything on the extractors for another 14 days, and maybe it will work, but that's kind of weird 🙂

Ok, I previously attached the extractor to the input as usual, but that caused problems.

I have now created a pipeline rule instead, and it works fine.

When
if string value in 'message' contains 'AAA challenge received for user'

Then
Match grok pattern '<%{GREEDYDATA:ID}>%{GREEDYDATA:DATE}T%{GREEDYDATA:TIME}Z%{GREEDYDATA:HOST} : %{GREEDYDATA:ASA_Number}: AAA challenge received for user %{GREEDYDATA:USERNAME} from server SERVER_IP.' to field 'message' and set fields for matches
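For reference, those rule-builder steps correspond roughly to this pipeline rule source (a sketch; the rule name is assumed, and the pattern is the one from the thread):

```
rule "extract ASA AAA challenge fields"
when
  contains(to_string($message.message), "AAA challenge received for user")
then
  set_fields(
    grok(
      pattern: "<%{GREEDYDATA:ID}>%{GREEDYDATA:DATE}T%{GREEDYDATA:TIME}Z%{GREEDYDATA:HOST} : %{GREEDYDATA:ASA_Number}: AAA challenge received for user %{GREEDYDATA:USERNAME} from server SERVER_IP.",
      value: to_string($message.message),
      only_named_captures: true
    )
  );
end
```

Unlike an extractor, this rule only runs on messages routed through its pipeline, and the `when` clause keeps it from running the grok match against unrelated logs.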

