Failed to parse date field


Thanks for your help!

My input is GELF TCP: basically Docker stderr forwarded by nxlog from the host. There were some intermittent issues with docker logspout, so we ended up with this arrangement. I don't know whether it would work better with Docker sending logs directly to Graylog via the gelf driver, but our Docker CE doesn't support multiple logging drivers, which is why we currently have this setup.
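For reference, a forwarding chain like that can be sketched in nxlog roughly as below. The file path, host, and port are placeholders (not our real values), and exact module options depend on the nxlog version, so treat this as an illustrative sketch rather than our actual config:

```
<Extension gelf>
    Module      xm_gelf
</Extension>

<Input docker_stderr>
    # im_file tails the container log files on the host;
    # the glob below is a placeholder path
    Module      im_file
    File        "/var/lib/docker/containers/*/*-json.log"
</Input>

<Output graylog>
    Module      om_tcp
    Host        graylog.example.com   # placeholder host
    Port        12201                 # placeholder GELF TCP input port
    OutputType  GELF_TCP
</Output>

<Route docker_to_graylog>
    Path        docker_stderr => graylog
</Route>
```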

I do have one extractor there, which I believe should be considered for the message since it contains the string "alert", but no fields were extracted for that line:

    Will only attempt to run if the message includes the string type='alert'
    list_separator: ,
    kv_separator: =
    key_separator: _
    key_whitespace_replacement: _
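As a rough mental model of what those settings do (this is not Graylog's actual implementation; `parse_kv` and the sample message below are made up for illustration):

```python
def parse_kv(message, list_separator=",", kv_separator="=",
             key_whitespace_replacement="_"):
    """Sketch of a key=value extractor: split the message into
    key/value pairs and normalize whitespace in the keys."""
    fields = {}
    for token in message.split(list_separator):
        if kv_separator not in token:
            continue  # tokens without a separator produce no field
        key, value = token.split(kv_separator, 1)
        key = key.strip().replace(" ", key_whitespace_replacement)
        fields[key] = value.strip().strip("'\"")
    return fields

print(parse_kv("type='alert', src ip=10.0.0.1, action=drop"))
# {'type': 'alert', 'src_ip': '10.0.0.1', 'action': 'drop'}
```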

That was a good point, I hadn't thought of extractors at all.

Now I could try to see what that extractor does with a problematic message, but I can't load one there: no matter what I put into the message ID and index fields, it says "Not found"…

So, a secondary question: given this elasticsearch/graylog.log entry, what should I enter into the message ID and index fields on the Edit extractor page to try the extractor on it?

    [2022-01-24T08:29:17,686][DEBUG][o.e.a.b.TransportShardBulkAction] [2CEfq1M] [graylog_38][2] failed to execute bulk item (index) index {[graylog_deflector][message][f40190d2-7cde-11ec-ae42-0050568815d4],source[{"gl2_accounted_message_size":356,"SourceModuleType":"im_file","level":6,"log":"Debug: request, error, close","gl2_remote_ip":"a.b.c.d","gl2_remote_port":60606,"streams":"000000000000000000000001","5c94dee4f7f28c69cba861e1","5ed618ec5d8362036e809302"],"gl2_message_id":"01FT5CCGJXPW229RDSK4PEBY49","source":"worker01","message":"{\"log\":\"Debug: request, error, close \\n\",\"stream\":\"stderr\",\"time\":\"2022-01-24T06:29:14.462979Z\"}","gl2_source_input":"5bbc795d11973e0a7f631721","EventReceivedTime":"2022-01-24 08:29:15","SourceModuleName":"mgmt-api","stream":"stderr","gl2_source_node":"cb4a9a8c-e692-4e55-947f-8fc89edef6b4","time":"2022-01-24 08:29:14","timestamp":"2022-01-24 06:29:15.000"}]}
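One common cause of this kind of bulk-index failure (I can't tell from this entry alone which field actually tripped it, so this is only a guess) is a date-typed field whose value uses a space instead of the ISO-8601 "T" separator, which Elasticsearch's strict date formats reject. A minimal illustration, with the first date string copied from the entry above:

```python
from datetime import datetime

def parses_strict(value: str) -> bool:
    """Return True if value matches a strict yyyy-MM-dd'T'HH:mm:ss
    pattern (similar in spirit to Elasticsearch's strict ISO formats)."""
    try:
        datetime.strptime(value, "%Y-%m-%dT%H:%M:%S")
        return True
    except ValueError:
        return False

print(parses_strict("2022-01-24 08:29:14"))   # space separator -> False
print(parses_strict("2022-01-24T08:29:14"))   # ISO "T" separator -> True
```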

You are correct, the list of timestamp issues is rather long… :smiley: