I am using Graylog 2.5 and Elasticsearch 6.5.4, and right now I am getting an issue with a timestamp field:
{"type":"mapper_parsing_exception","reason":"failed to parse field [apiTimestamp] of type [date]","caused_by":{"type":"illegal_argument_exception","reason":"Invalid format: \"12/Jun/2019:07:45:59.055\" is malformed at \"/Jun/2019:07:45:59.055\""}}
Can someone please suggest how to fix this error, and how to check which logs are causing it? Multiple applications are sending logs to my Graylog.
I am using Logstash to send logs to Graylog. Is this what you are looking for?
input {
  file {
    path => "/opt/lsy/log/test/test.log"
    type => "testlog"
  }
}
filter {
  if [type] == "testlog" {
    multiline {
      negate => true
      pattern => "^\[%{TIMESTAMP_ISO8601}\]"
      what => "previous"
    }
    grok {
      match => ["message", "^\[%{TIMESTAMP_ISO8601:logdate}\]%{SPACE}\[%{LOGLEVEL:severity}\]%{SPACE}\[%{DATA:thread}\]%{SPACE}\[%{DATA:class}\]%{SPACE}%{NUMBER:lineNumber}%{SPACE}\|%{SPACE}\[%{DATA:componentName}\]%{SPACE}\|%{SPACE}\[%{DATA:componentVersion}\]%{SPACE}\|%{SPACE}\[%{DATA:applicationId}\]%{SPACE}\|%{SPACE}\[%{DATA:clientId}\]%{SPACE}\|%{SPACE}\[%{DATA:customerId}\]%{SPACE}\|%{SPACE}\[%{DATA:busId}\]%{SPACE}\|%{SPACE}\[%{DATA:userId}\]%{SPACE}\|%{SPACE}\[%{DATA:testd}\]%{SPACE}\|%{SPACE}\[%{DATA:apiTimestamp}\]%{SPACE}\|%{SPACE}%{GREEDYDATA}"]
    }
    date {
      match => [ "logdate", "ISO8601" ]
      remove_field => [ "logdate" ]
    }
  }
  if [severity] != "INFO" and [severity] != "WARN" and [severity] != "ERROR" {
    drop {}
  }
}
The grok pattern %{DATA:apiTimestamp} will capture anything. So if you want this field to be a timestamp, use a pattern that matches that timestamp format. Or create a custom mapping in Elasticsearch for the field apiTimestamp with a string type, so that you are able to ingest everything.
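For the first option, a minimal sketch of a matching grok pattern plus a date filter, assuming apiTimestamp always looks like 12/Jun/2019:07:45:59.055 (day/month/year:time with milliseconds, no timezone):

```
filter {
  # Assumed apiTimestamp format: dd/MMM/yyyy:HH:mm:ss.SSS
  grok {
    match => ["message", "\[(?<apiTimestamp>%{MONTHDAY}/%{MONTH}/%{YEAR}:%{TIME})\]"]
  }
  date {
    # Parse the captured string into a real timestamp
    match  => ["apiTimestamp", "dd/MMM/yyyy:HH:mm:ss.SSS"]
    # Overwrite the field with the parsed date instead of @timestamp
    target => "apiTimestamp"
  }
}
```

With the date filter's target set, apiTimestamp is sent to Elasticsearch as a proper date value instead of the raw string that fails to parse.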
I have multiple applications feeding logs into Graylog, so is it possible to check which application or container is sending the wrong date format?
Second question: if I get a mapper_parsing_exception for the date, will that log message be dropped by Graylog? I cannot see the log with that specific timestamp in Graylog.
After creating a custom mapping I can see the culprit services, but I am wondering: in the Logstash configuration, apiTimestamp is extracted as [%{DATA:apiTimestamp}], so why is Elasticsearch treating it as a date? I think it should be treated as a string.
You assume the data type is decided at extraction time, but the data type is decided at field creation time in Elasticsearch.
If the first ingested value is a number, the field will be a number. If it is a date, the field will be a date. Only a custom mapping forces a specific type. It does not matter what and how you extract; it matters what you ingest.
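A sketch of such a custom mapping, written as an Elasticsearch index template for ES 6.x. This assumes the default Graylog index prefix graylog_ and the Graylog mapping type message; adjust both to your setup. The template only takes effect on newly created indices, i.e. after the next rotation:

```
PUT _template/graylog-custom-mapping
{
  "template": "graylog_*",
  "mappings": {
    "message": {
      "properties": {
        "apiTimestamp": {
          "type": "keyword"
        }
      }
    }
  }
}
```

With this in place, apiTimestamp is always stored as a string, no matter what value arrives first, so mixed formats from different applications no longer cause mapper_parsing_exception.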
Also after index rotation the game starts over again!