Problem with parsing a timestamp field

1. Describe your incident:

I get a lot of errors like this:

OpenSearchException[OpenSearch exception [type=mapper_parsing_exception, reason=failed to parse field [_message_timestamp] of type [date] in document with id 'c58701c1-a0dd-11ee-a408-0242c0a8b007'. Preview of field's value: '2023-12-22 15:21:24.336']]; nested: OpenSearchException[OpenSearch exception [type=illegal_argument_exception, reason=failed to parse date field [2023-12-22 15:21:24.336] with format [strict_date_optional_time||epoch_millis]]]; nested: OpenSearchException[OpenSearch exception [type=date_time_parse_exception, reason=Failed to parse with all enclosed parsers]];
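For context, the rejected value `2023-12-22 15:21:24.336` separates date and time with a space, while the default mapping format `strict_date_optional_time||epoch_millis` only accepts ISO 8601 with a `T` separator (or epoch milliseconds). One way out, sketched below, is an index template that adds the space-separated pattern to the accepted formats for that field. This is a hedged sketch, not a confirmed fix: the `int__*` pattern is assumed from the daily-index naming described further down, and since Graylog manages its own index templates, a hand-made template can conflict with it (possibly what "made things worse" below).

```json
PUT _index_template/message-timestamp-fix
{
  "index_patterns": ["int__*"],
  "template": {
    "mappings": {
      "properties": {
        "_message_timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss.SSS||strict_date_optional_time||epoch_millis"
        }
      }
    }
  }
}
```

With this in place, both the space-separated value and regular ISO 8601 timestamps would be accepted for `_message_timestamp`, and dynamic mapping would no longer decide the field's type from whichever document arrives first.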

2. Describe your environment:

  • OS Information: Ubuntu, running via docker-compose. I'll of course be happy to provide the YAML if needed. I'm not using any extractors.

  • Package Version: Graylog v5.2, with OpenSearch v2.11.1

Yesterday, when I first started this input after migrating from Elasticsearch (where I had the exact same problem), I didn't see the error, and I thought that the move to OpenSearch did the trick. Today it's back. Perhaps it has something to do with the index mapping being determined dynamically by whichever entry happens to arrive first. I'm using daily indices, so int__0 today, int__1 tomorrow, and so on.
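The underlying format mismatch can be reproduced outside OpenSearch. The default `strict_date_optional_time` format is ISO 8601 with a `T` between date and time, while the shipped value uses a space. A rough Python equivalent of the two parse attempts (the format strings here are illustrative stand-ins for OpenSearch's parsers, not its actual implementation):

```python
from datetime import datetime

# The value from the exception message above.
value = "2023-12-22 15:21:24.336"

def parses(fmt: str) -> bool:
    """Return True if `value` matches the given strptime format."""
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

# ISO-8601-style pattern with a 'T' separator, as strict_date_optional_time expects:
print(parses("%Y-%m-%dT%H:%M:%S.%f"))  # False

# The same pattern with a space separator, matching the shipped value:
print(parses("%Y-%m-%d %H:%M:%S.%f"))  # True
```

So whenever the raw string reaches OpenSearch unconverted, a `date`-mapped field rejects it; and if such a string happens to be the first document in a fresh daily index, dynamic mapping falls back to `keyword` instead, which would explain the keyword-vs-date difference between indices.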

I noticed that yesterday's index had that field mapped as "keyword", whereas in today's index it is "date". However, another index that doesn't have this problem also maps it as "date"…

This problem has been going on for days; it has become a major headache and is ruining my Xmas mood :frowning: . I have tried introducing a template, but it made things worse.

I find it puzzling that a working setup is not readily available for Graylog + Kubernetes. It's not as if I'm trying something that no one has tried before; it's such a common scenario…

My filebeat.yml looks like this:

filebeat.inputs:
- type: container
  paths:
    - /var/log/containers/*.log
processors:
  - add_kubernetes_metadata:
      host: ${NODE_NAME}
      matchers:
      - logs_path:
          logs_path: "/var/log/containers/"
  - decode_json_fields:
      fields: ["message"]
      target: "_message"
      overwrite_keys: true
      add_error_key: true
  - timestamp:
      field: _message_timestamp
      target_field: _message_timestamp
      layouts:
        - '2006-01-02 15:04:05.999'
      test:
        - '2023-12-18 15:16:35.781'
  - add_cloud_metadata:
  - add_host_metadata:
output.logstash:
  hosts: ["10.65.82.185:5045"]
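One thing worth double-checking in the config above: `decode_json_fields` with `target: "_message"` creates a nested object in the Filebeat event, so a `timestamp` key in the JSON payload ends up as `_message.timestamp` there, while the `timestamp` processor references the flattened name `_message_timestamp` (which Graylog produces on its side). If the name doesn't match, the processor silently does nothing and the raw space-separated string gets shipped as-is. A hypothetical adjustment, assuming the JSON payload carries a `timestamp` key:

```yaml
  # Sketch: reference the nested field name that decode_json_fields
  # actually creates, rather than the Graylog-flattened name.
  - timestamp:
      field: _message.timestamp
      target_field: _message.timestamp
      layouts:
        - '2006-01-02 15:04:05.999'
      test:
        - '2023-12-18 15:16:35.781'
```

If the processor then matches, the field is stored as a proper time value and serialized in RFC 3339 form, which the default `strict_date_optional_time` mapping accepts.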

Check whether the thread below helps you.
