Hi @Ponet,
It’s not exactly using multiple regex patterns, no. It’s a Java date converter you can use to specify the date/time format so the incoming timestamp field is parsed correctly on initial ingestion, and you don’t have to run additional pipelines or extractors afterward to correct date/time issues. You can see examples of folks using it here and here. It’s similar to how you might coerce a string into an integer, e.g. %{NUMBER:time:int} or (?:%{NUMBER:bytes:int}|-).
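For reference, here’s a sketch of what that converter looks like in a Graylog grok extractor. The field name `app_timestamp` is just an assumption for illustration; the `;date;<format>` suffix is the part that does the conversion:

```
%{TIMESTAMP_ISO8601:app_timestamp;date;yyyy-MM-dd HH:mm:ss,SSS}
```

With that in place the value should land as a proper date on ingestion rather than a plain string.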
As for how the grok parser works, I think it’s easier not to think of it as a normal regex. Putting the pattern in quotes or escaping the periods doesn’t resolve this particular issue. Now, if we were in a pipeline doing something like this with parse_date:
rule "Extract and convert timestamp"
when
  has_field("my_application")
then
  let message_field = to_string($message.message);
  let parsed_fields = grok(pattern: "^%{MY_MESSAGE}", value: message_field, only_named_captures: true);
  set_fields(parsed_fields);
  let new_date = parse_date(to_string($message.app_timestamp), "yyyy-MM-dd HH:mm:ss,SSS", "en-US", "America/New_York");
  set_field("timestamp", new_date);
end
You would be absolutely correct.
But the grok patterns themselves don’t require that. I’m using that exact same format with multiple timestamp formats, but they all end in ss.SSS when using milliseconds. I think this is the first one I’ve dealt with that uses ss,SSS instead, which is what brought me here.
If there are no good solutions, I may just end up running it through an additional pipeline rule to see if parse_date will do what I want, but I’m trying to avoid introducing an extra processing step.
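For what it’s worth, the ss,SSS pattern itself should be fine: in Java-style date patterns the comma is not a reserved letter, so it’s matched literally. Here’s a quick standalone check with java.time (Graylog’s parse_date uses Joda-Time, but the pattern semantics are the same for this case; the sample timestamp is made up):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class CommaMillisDemo {
    public static void main(String[] args) {
        // Comma is treated as a literal character in the pattern,
        // so ss,SSS parses comma-separated milliseconds directly.
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss,SSS");
        LocalDateTime ts = LocalDateTime.parse("2024-01-15 09:30:00,123", fmt);
        System.out.println(ts); // 2024-01-15T09:30:00.123
    }
}
```

So if the grok side can be made to hand over the raw string intact, parse_date with that exact pattern should handle the comma without any preprocessing.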
Cheers!