Hello,
Ah, I see. So here is the scoop on that: the Elasticsearch index template already has a field called “timestamp”, and the problem is manipulating that “timestamp” field directly; as you can see, you will get a gl2_processing_error. So if you need to correct the timestamp, you end up with two fields for it (as shown above).
The only other workarounds I have seen in the forum are:
- Create a new index mapping template
- Use a different extractor that is capable of using a DATE converter
- Send the raw message to Graylog and convert the needed fields in a pipeline (see the sketch after this list)
- Leave the two timestamp fields as they are.
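The last two options can also be combined: send the raw message and let a pipeline rule convert the date without touching the mapped “timestamp” field at all. This is only a minimal sketch, assuming the rsyslog date string arrives in a field called timestamp_temp and that you are willing to keep the parsed result in a separate field (event_time is just a placeholder name, and the pattern and time zone are guesses you would adjust):

rule "parse temp timestamp into a separate field"
when
  has_field("timestamp_temp")
then
  // parse the rsyslog-supplied string; pattern and zone are assumptions
  let parsed = parse_date(value: to_string($message.timestamp_temp), pattern: "yyyy-MM-dd'T'HH:mm:ss.SSS", timezone: "UTC");
  // leave the mapped timestamp alone and store the converted value separately
  set_field("event_time", parsed);
end

That sidesteps the index mapping conflict, at the cost of carrying two time fields per message.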
Since the timestamp_temp field originates from rsyslog, this could come from a couple of different issues:
- Daylight-saving time
- Computer clocks that are not synchronized (NTP)
- Elasticsearch storing timestamps in UTC
- The time zone set in the user's profile
- The Graylog server's date/time configuration, etc.
I would make sure all devices have the correct date/time, including the time zone of the user logged into the Web UI.
My apologies, I haven't used JSON log formatting, but I am familiar with the errors and troubleshooting. As I stated earlier, you cannot add DATE converters to JSON extractors. Is that what you tried to do? I'm also not sure which time zone is correct here (UTC, GMT+1, GMT-6?).
What I was suggesting was something simpler: send the logs to Graylog, then use a pipeline. If I'm correct, it seems that Elasticsearch has already created a timestamp field.
Note: the time zone in this example is for Central America (the region stretching between Mexico and Colombia):
rule "replace timestamp"
when
true
then
let result = regex("([0-9-T.:]+)", to_string($message.message));
let new_date = parse_date(to_string(result["0"]), "yyyy-MM-dd'T'HH:mm:ss.SSS","CDT");
set_field("timestamp", timestamp_temp);
end
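If your logs are JSON, the same pipeline approach works by parsing the payload first. This is only a minimal sketch, assuming the JSON body sits in the message field and carries a key literally named "time"; the key name, date pattern, and time zone are all assumptions you would adjust to your format:

rule "parse json time field"
when
  has_field("message")
then
  // parse the JSON payload and pick out the assumed "time" key
  let json = parse_json(to_string($message.message));
  let picked = select_jsonpath(json: json, paths: { json_time: "$.time" });
  // convert it to a real date; pattern and zone are guesses
  let ts = parse_date(value: to_string(picked["json_time"]), pattern: "yyyy-MM-dd'T'HH:mm:ss.SSS", timezone: "UTC");
  set_field("timestamp", ts);
end

Either way, the rule needs to sit in a pipeline stage connected to the stream receiving these logs, otherwise it will never run.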
The “Replaced invalid timestamp value in message” log entry you are seeing just means Graylog could not parse the timestamp it received and substituted its own time.
You may be able to find your answer here.