For example, in this screenshot, "Authorization was successful" should be placed before "Request finished in 56.999 ms 200 application/json; charset=utf-8". Unfortunately, this is not the case with the Timestamp field.
Where am I wrong?
Thanks a lot.
Be patient, this is community support; no one gets a salary for it.
I suggest checking your Elasticsearch and the field type. Maybe it doesn't handle the milliseconds.
Or try to use the default timestamp field.
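A minimal sketch of that second tip, not from the thread itself: rely on Graylog's own, already-typed timestamp field instead of parsing the application's string. The rule name, field name, and input ID placeholder are all made up for illustration.

rule "UseDefaultTimestamp"
when
  from_input("<your-input-id>")
then
  // $message.timestamp is Graylog's own date-typed field;
  // copying it avoids parsing the application timestamp string at all
  set_field("display_time", $message.timestamp);
end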
Hi again.
I'm using a pipeline to get the Timestamp from the message body and save it to a new field (New_Time).
My pipeline:
rule "CreateNewFormatTimestamp"
when
from_input("5e8c0edb50401d5943c76fcf")
then
//"Timestamp":"2020-04-14T17:26:47.6146998+04:30"
let GetTime = ($message.Timestamp);
//2020-04-14T16:39:01.214
let GetTime = substring(to_string(GetTime), 0, 23);
set_field("New_Time", parse_date(value:to_string(GetTime), pattern:"yyyy-MM-dd'T'HH:mm:ss.SSS"));
end
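One observation about this approach, not raised in the thread: trimming to 23 characters drops the +04:30 offset, so the parsed date is interpreted in the server's default zone. parse_date accepts an optional timezone parameter; a hedged sketch of the same rule with the offset supplied explicitly (the "+04:30" string is an assumption taken from the example value above):

rule "CreateNewFormatTimestampWithZone"
when
  from_input("5e8c0edb50401d5943c76fcf")
then
  let GetTime = to_string($message.Timestamp);
  let GetTimeShort = substring(GetTime, 0, 23);
  // timezone: "+04:30" restores the offset lost by the substring above
  set_field("New_Time", parse_date(value: GetTimeShort, pattern: "yyyy-MM-dd'T'HH:mm:ss.SSS", timezone: "+04:30"));
end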
Check the function with the original timestamp field (see the sketch after these tips).
After that, check the timestamp field mapping in Elasticsearch.
Check the date format in the source field and in yours.
(Just random tips.)
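A sketch of that first tip, not from the thread: run the same conversion against Graylog's own timestamp field and write the result to a throwaway field (Parse_Test is a made-up name), to confirm that parse_date itself behaves:

rule "TestParseOnOriginalTimestamp"
when
  has_field("timestamp")
then
  // Graylog's own timestamp renders like 2020-04-17T14:20:51.400Z;
  // its first 23 characters match the yyyy-MM-dd'T'HH:mm:ss.SSS pattern
  let raw = substring(to_string($message.timestamp), 0, 23);
  set_field("Parse_Test", parse_date(value: raw, pattern: "yyyy-MM-dd'T'HH:mm:ss.SSS"));
end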
Another case: when I create a new field with this pipeline:
rule "CreateNewFormatTimestamp"
when
from_input("5e8c0edb50401d5943c76fcf")
then
let GetTimeApp = ($message.Timestamp);
set_field("AOrginTimeStampApp", GetTimeApp);
end
In Elasticsearch the field type is date, but in Graylog (version 3.2.4) the type is Unknown.
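One thing that might be worth trying here, sketched under the assumption that the Unknown type comes from the field being written as a plain string rather than a date (this is not confirmed in the thread): parse the value before setting it, reusing the same substring-and-parse approach from the earlier rule, so the field is written as an explicit date value.

rule "CopyTimestampAsDate"
when
  from_input("5e8c0edb50401d5943c76fcf")
then
  // parse the application timestamp so the field is stored as a date,
  // not as a plain string; whether this changes the sidebar type is a guess
  let trimmed = substring(to_string($message.Timestamp), 0, 23);
  set_field("AOrginTimeStampApp", parse_date(value: trimmed, pattern: "yyyy-MM-dd'T'HH:mm:ss.SSS"));
end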
Two things:
Please check the mapping of the original timestamp field.
Check it with lowercase letters; in my system the timestamp field is written with lowercase letters (I'm not sure whether that can cause a problem).
+1 You can check the pipeline with debug functions.
+1 And I see you showed an index's mapping. Are you sure the message is in this index?
+1 Did you do it on a wildcarded index name, and did you rotate the indices after changing the mapping?
rule "CreateNewFormatTimestamp"
when
from_input("5e8c0edb50401d5943c76fcf")
then
let GetTime = ($message.timestamp);
let debug_message = concat("orginal timestamp is: ", to_string($message.timestamp));
debug(debug_message);
end
Debug output for the original timestamp:
INFO [Function] PIPELINE DEBUG: original timestamp is: 2020-04-17T14:20:51.400Z
"Are you sure the message is in this index?" Yes, @macko003, I'm sure.
But… you are using $message.Timestamp and debugging against $message.timestamp. If capitalization matters, and I think it does, you need to debug against the capitalized Timestamp… or change everything to the lowercase timestamp.
Also, since you are working in a search that relates to fields in the message, stick with fields in the message rather than the capitalized Timestamp, which looks like a record timestamp.
I might be wrong, but that's what it looks like to me.
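A quick sketch of that comparison (the rule name is made up; the input ID is the one used earlier in the thread): debug both spellings side by side to see which field actually carries the application value.

rule "DebugBothTimestamps"
when
  from_input("5e8c0edb50401d5943c76fcf")
then
  // field names are case-sensitive: Timestamp (application) vs timestamp (Graylog)
  debug(concat("application Timestamp: ", to_string($message.Timestamp)));
  debug(concat("graylog timestamp: ", to_string($message.timestamp)));
end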
Hi @tmacgbay. You're right: GetTime in post number 8 is the Timestamp from the application, but after @macko003's suggestion I switched to the original timestamp, so in post number 9 I used let GetTime = ($message.timestamp);
So the last debug shows us the original timestamp.
let GetTime = ($message.timestamp);
let debug_message = concat("original timestamp is: ", to_string($message.timestamp));
debug(debug_message);
Hi.
This problem still exists. Is there no solution? Is this a problem just for me and @encarta?
I want to make sure: is it my configuration, my timestamp, my Elasticsearch, or my application log that is wrong, or something else entirely, or is it a problem with Graylog, for example a bug?
Thanks again.
My best guess right now is that you have a compound value, which means that at some point you ingested the timestamp field with a different type than date (such as string), and after that Elasticsearch can't sort it anymore.
Please have a look into your field list in the sidebar and check the displayed type there:
Thanks, @konrad. I checked: the timestamp field is correct and its type is date; even the GetTime field is correct (the second screenshot). (The GetTime field is extracted using a pipeline.)
This is a known issue and it has been fixed, meaning that in the next 3.2 release (hopefully next week) this will be gone.
The fix is also part of 3.3, which is the reason @konrad did not see this in his dev environment. But as this was "small", it slipped out of sight and I did not remember it.