Issues with date type converter format string

Hi there,

Description of your problem

I’m receiving nginx error log messages on a syslog input and am trying to read the timestamp from the message.
I created an extractor that extracts the timestamp from the message:


Extractor works^^

Then I added a date-time converter and configured it like this:

Unfortunately this always results in this error message:

Value <2021/10/21 09:35:16> caused exception: Invalid format: "2021/10/21 09:35:16" is malformed at "/10/21 09:35:16".

Description of steps you’ve taken to attempt to solve the issue

After reading the docs, I tried a bunch of other format strings like:
yyyy//MM//dd HH:mm:ss
yyyy'/MM'/dd HH:mm:ss
but they always result in the same error message.

From all the examples given in the docs and the placeholder in the Graylog web UI, I’d say that my first format string from the screenshot above should be correct. Am I missing something?
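For reference, the converter’s format string is a Joda-Time pattern, and as far as I can tell slashes are plain literal characters there, so the equivalent pipeline call should accept it. A minimal sketch (rule and field names made up for illustration):

```
rule "check nginx timestamp pattern"
when
  true
then
  // "yyyy/MM/dd HH:mm:ss" should match a value like "2021/10/21 09:35:16";
  // the slashes are literal characters in a Joda-Time pattern
  let ts = parse_date("2021/10/21 09:35:16", "yyyy/MM/dd HH:mm:ss");
  set_field("parsed_ts", ts);
end
```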

Environmental information

Operating system information

cat /etc/lsb-release 
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=20.04
DISTRIB_CODENAME=focal
DISTRIB_DESCRIPTION="Ubuntu 20.04.3 LTS"

Package versions

dpkg -l | grep -E ".*(elasticsearch|graylog|mongo).*"
ii  elasticsearch-oss                    7.10.2                                amd64        Distributed RESTful search engine built for the cloud
ii  graylog-4.2-repository               1-4                                   all          Package to install Graylog 4.2 GPG key and repository
ii  graylog-server                       4.2.0-3                               all          Graylog server
ii  mongodb-org                          4.0.27                                amd64        MongoDB open source document-oriented database system (metapackage)
ii  mongodb-org-mongos                   4.0.27                                amd64        MongoDB sharded cluster query router
ii  mongodb-org-server                   4.0.27                                amd64        MongoDB database server
ii  mongodb-org-shell                    4.0.27                                amd64        MongoDB shell client
ii  mongodb-org-tools                    4.0.27                                amd64        MongoDB tools

The logs don’t show anything useful, just messages that I’m updating the extractor settings :slight_smile:

Thanks in advance!

Hello there!
So I got the exact same problem after I updated to Graylog 4.2.0.
It seems to be a problem with the update.

Hello

I noticed your Date Format string is yyyy/MM/dd HH:mm:ss.

Not 100% sure, but I think it might be the forward slashes. Maybe try this configuration:
yyyy-MM-dd HH:mm:ss.SSS

Below worked in my lab.

Hope that helps

Hi,

I followed your suggestion and used: yyyy-MM-dd HH:mm:ss
Unfortunately with the same result.

I mean, don’t I need to use the slashes? My input string is 2021/10/22 08:42:47, for example, and I can’t change it. That’s also why I need to use the Date converter.

Sadly this didn’t work out for me either.
I also tried to change the date that’s delivered to Graylog by my PowerShell script, but nothing worked.
My config is pretty much the same as the one from @aft2d.

OK, so when I try to copy the needed timestamp from the message and convert it into another field, i.e. timestamp2, everything works just fine.

grafik

You can see the difference in format between the two timestamps.
Maybe this matters?

Hello,

How did you go about converting it? Pipeline?

If you already have a field called “timestamp” you can convert it via pipeline, but I believe if you are creating a regular expression you can’t use the same field; you have to call it something else. Perhaps that is where the issue is.

Hello,

Can you show, by example, the date/time from the nginx message you’re trying to extract?

On another note, if you created a new field for your regex, then instead of using a converter maybe a pipeline would help.

https://docs.graylog.org/docs/functions#parse-date

Maybe something like this.

rule "parse date"
when
  has_field("some_field")
then
  let new_date = parse_date(to_string($message.some_field), "yyyy-MM-dd'T'HH:mm:ssZZ");
  set_field("install_datetime", new_date);
end

Hope that helps
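For the nginx error-log timestamp from your post, the pattern would need slashes instead of the ISO form. A variant of the rule above might look like this (new_timestamp is just my guess at your extracted field name):

```
rule "parse nginx error date"
when
  has_field("new_timestamp")
then
  // "yyyy/MM/dd HH:mm:ss" matches values like "2021/10/22 08:42:47"
  let new_date = parse_date(to_string($message.new_timestamp), "yyyy/MM/dd HH:mm:ss");
  set_field("nginx_datetime", new_date);
end
```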

EDIT:

So I’ve been testing this in my lab; it just so happens I have nginx installed on a server. Using your regex configuration from the picture you posted:

^.*:\s(\d\d\d\d/\d\d/\d\d\s\d\d:\d\d:\d\d)\s.*$

It seems not to be working,

but this works

^.*(\d\d\d\d/\d\d/\d\d\s\d\d:\d\d:\d\d)\s.*$

EDIT:
I see your issue now.

I’ve been testing this in my lab for a while and what I did was create a Syslog UDP input for my Nginx logs. Then I configured a regex extractor for testing both of the following configuration which gave me the same results.

^.*:\s(\d\d\d\d/\d\d/\d\d\s\d\d:\d\d:\d\d)\s.*$
^.*(\d\d\d\d[/.-]\d\d[/.-]\d\d\s\d\d:\d\d:\d\d)\s.*$

I started sending logs again from a remote nginx server to Graylog and, as you can see, it created the field new_timestamp and added another field called gl2_processing_error: Invalid format: "2021/10/22 22:53:31" is malformed at "/10/22 22:53:31". I believe that’s what you were stating, correct?

My next test was to send the nginx log from the input to a stream and use the pipeline above, but unfortunately that also did not work correctly. My pipeline knowledge is not good enough to convert the date/time yyyy/MM/dd into something like yyyy-MM-dd.

What you want could be done through a pipeline: it should be able to convert the date/time and create a new field for you, but I just don’t know how yet. There are other community members here who are very knowledgeable about this type of issue. I’ll keep trying and see what I can come up with unless someone here jumps in.

I can reproduce this. If you choose any name other than timestamp, it works.
If you use a flex date converter instead, you even get the same format as in the default timestamp field. But the issue persists.

A guess:
Maybe it’s related to the message processing order, i.e. extractor → timestamp parser → converter.
That would mean that at the step where Graylog tries to parse the timestamp, it is still in the 2021/10/22 08:42:47 format and can’t be read?

I found another post with the same issue. Might want to check it out.

Not sure if this helps, but I came across this post earlier today with a similar issue with Nginx

Hello guys!
Sorry, I was very busy the last few days and had no time to test things out.
But I finally got it working again using a mix of an extractor and a pipeline.

Here’s how I did it:

  1. Create an extractor to copy the timestamp from the message into a second timestamp field.

  2. Create a pipeline on the stream that receives the messages whose timestamps you want to change.

grafik

Edit: You can also use has_field("msg_timestamp") instead of true if you want to.

  3. Change the message processor order under System > Configurations to the following:

It’s important that the Message Filter Chain comes before the Pipeline Processor!

Of course, you could do everything with the pipeline instead of using a mix, but at this point I was just too lazy. I hope I didn’t forget anything; if so, just ask me and I will try to help you out.
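For reference, the pipeline rule from step 2 looks roughly like this (the exact rule is only in the screenshot, so this is a sketch from memory):

```
rule "set timestamp from msg_timestamp"
when
  has_field("msg_timestamp")
then
  // parse the copied nginx time and overwrite the message timestamp
  let ts = parse_date(to_string($message.msg_timestamp), "yyyy/MM/dd HH:mm:ss");
  set_field("timestamp", ts);
end
```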
Thanks for the help guys!

1 Like

Good Job :+1:

Can you show the results?

Hello there!
Here are the requested results:

So what I’m doing is sending a syslog message with PowerShell to the Graylog server:

In this case the message gets routed into a test stream. You can see that the timestamp has been updated:

I also changed the pipeline to delete the “msg_timestamp” field after the right timestamp has been set:
grafik
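For anyone who can’t see the screenshot, the amended rule with the cleanup step looks roughly like this (a sketch, not copied verbatim):

```
rule "set timestamp and clean up"
when
  has_field("msg_timestamp")
then
  let ts = parse_date(to_string($message.msg_timestamp), "yyyy/MM/dd HH:mm:ss");
  set_field("timestamp", ts);
  // remove the helper field once the real timestamp is set
  remove_field("msg_timestamp");
end
```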

If you need anything more, just tell me!

Great to hear :+1:

But is this the intended way to do this?
In my opinion, this is a workaround, not a solution to the actual issue.

Well, as I said, this problem occurred after the update to version 4.2.0.
Before the update I only used the extractor to update the timestamp.
I’m not really sure if there is an intended way to do this.
For me this is just a workaround, and I’m still hoping that with the next patch I can use the extractor to update the timestamps again.

Hello

I believe you are correct. Due to the nginx error.log date/time format, you have to either convert the message before sending it or use a pipeline.
I’m not sure why you would need to convert the date/time from the message field, though; I noticed that in my lab it is already correct. Here is my nginx error log message as it arrives on the input, with no extractors or pipelines:

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.