1. Describe your incident:
I recently migrated my Graylog deployment from our 1.5 TB disk to our 9 TB disk due to space constraints.
Everything seems to be working after a bit of a headache: we’re receiving logs from our sidecars and that side of things looks fine.
However, I’m getting a large number of indexing errors that we weren’t seeing before, all relating to fields being mapped to the wrong type, such as a field mapped as [date] receiving a string value.
An example of the error message is below; the others in our list (13,450 over 24 hours) are very similar, with the [date]-mapped field not containing an actual date.
I’m also seeing no Elasticsearch logs in my data/log/elasticsearch directory, even after pointing the log path at it in the config.
ElasticsearchException[Elasticsearch exception [type=mapper_parsing_exception, reason=failed to parse field [winlogbeat_winlog_event_data_param1] of type [date] in document with id '****'. Preview of field's value: 'Software Protection']]; nested: ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=failed to parse date field [Software Protection] with format [strict_date_optional_time||epoch_millis]]]; nested: ElasticsearchException[Elasticsearch exception [type=date_time_parse_exception, reason=Failed to parse with all enclosed parsers]];
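For what it’s worth, I’m assuming that field really has ended up mapped as [date] in the current write index. I was planning to confirm that with something like the command below (the graylog_* index pattern is just the default prefix, so it may need adjusting for our setup):

```sh
# Check how the offending field is currently mapped in the Graylog indices
# ("graylog_*" assumes the default index prefix – adjust if yours differs)
curl -s 'http://localhost:9200/graylog_*/_mapping/field/winlogbeat_winlog_event_data_param1?pretty'
```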
More generally, I suspect this is because I haven’t moved a directory to the proper location; I read online that I needed to change the path variables in elasticsearch.yml and move the Elasticsearch lib/log directories.
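For reference, this is roughly what I now have in elasticsearch.yml (the exact paths below are approximated rather than copied from the box):

```yaml
# elasticsearch.yml – path settings after the move (paths approximate)
path.data: /data/elasticsearch        # data directory on the new 9TB disk
path.logs: /data/log/elasticsearch    # where I expected the ES logs to appear
```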
Am I missing something?
2. Describe your environment:
- OS Information: Ubuntu 20.04
- Package Version: 4.3.6 (Noir)
- Service logs, configurations, and environment variables:
3. What steps have you already taken to try and solve the problem?
I’ve done some research, and it seems to just be a mapping issue, but I can’t find where to change the mapping, whether that’s in the Graylog GUI or on the Linux side; my current best guess is sketched below.
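From what I’ve read, the fix might be a custom index template in Elasticsearch that forces the field to a keyword type, something like the sketch below. To be clear, the template name, the graylog_* index pattern, the choice of keyword, and the use of the legacy _template endpoint are all my assumptions rather than anything I’ve tested:

```sh
# Sketch of a custom index template forcing the problem field to keyword
# (template name, index pattern, and field type are assumptions on my part)
curl -X PUT -H 'Content-Type: application/json' \
  'http://localhost:9200/_template/graylog-custom-mapping?pretty' -d '
{
  "index_patterns": ["graylog_*"],
  "order": 0,
  "mappings": {
    "properties": {
      "winlogbeat_winlog_event_data_param1": { "type": "keyword" }
    }
  }
}'
```

My understanding is that the active write index would then need rotating (System → Indices in the Graylog UI, I think) before the new mapping takes effect, but I’d appreciate confirmation on whether this is the right approach and whether any of it can be done from the GUI instead.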
Sorry if I’ve worded this poorly; I appreciate any help.