I am using nxlog to parse and ship Windows IIS logs to Graylog. I created a simple stream that matches logs with a specific collector ID, but when I route these logs to that stream they never arrive; the stream shows no messages. When I check System >> Overview >> Indexer failures, I see lots of errors like this:
ElasticsearchException[Elasticsearch exception [type=mapper_parsing_exception, reason=failed to parse field [time] of type [date] in document with id 'e1a4a471-28a2-11ed-97c8-00155d202636'. Preview of field's value: '20:32:10']]; nested: ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=failed to parse date field [20:32:10] with format [strict_date_optional_time||epoch_millis]]]; nested: ElasticsearchException[Elasticsearch exception [type=date_time_parse_exception, reason=Failed to parse with all enclosed parsers]];
When I pause the stream, the messages begin to appear in the ‘All Messages’ stream and the indexer failures stop. When I start the stream again, the logs fail to index, nothing shows up in the stream, and the indexer failures resume.
So something seems to be wrong with parsing the date/timestamp. From what I gather, ES expects dates in the format ‘strict_date_optional_time’, with an epoch-milliseconds timestamp as an optional alternative:
date_optional_time or strict_date_optional_time: A generic ISO datetime parser, where the date must include the year at a minimum, and the time (separated by T) is optional. Examples: yyyy-MM-dd'T'HH:mm:ss.SSSZ or yyyy-MM-dd.
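To illustrate the mismatch, here is a quick Python sketch of what I think the conversion needs to do: IIS W3C logs keep the date and the time in two separate fields, so the two have to be combined into one ISO-8601 value before ES will accept it (the field values below are just examples):

```python
from datetime import datetime, timezone

def iis_to_iso8601(date_str: str, time_str: str) -> str:
    """Combine IIS W3C 'date' and 'time' field values (UTC) into a
    strict_date_optional_time-compatible ISO-8601 timestamp."""
    dt = datetime.strptime(f"{date_str} {time_str}", "%Y-%m-%d %H:%M:%S")
    return dt.replace(tzinfo=timezone.utc).isoformat().replace("+00:00", "Z")

print(iis_to_iso8601("2022-08-30", "20:32:10"))  # 2022-08-30T20:32:10Z
```

A bare ‘20:32:10’ has no year at all, which matches the date_time_parse_exception in the error above.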
but IIS W3C logs split the date and the time into two separate fields, so the ‘time’ field holds only a time of day, e.g. 20:32:10 (the value shown in the error above), with no year for ES to parse.
How can I manipulate the date/timestamp on the client side, before shipping to Graylog, so that ES can parse and index it and the logs show up in my stream? Please let me know if you need to see portions of my configs etc.
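For reference, this is the kind of nxlog Exec logic I'm imagining, but I haven't gotten it working — an untested sketch, where the $date and $time fields are my assumption of what the W3C parsing (e.g. via xm_csv) produces, and the file path is just an example:

```
# Untested sketch: merge the separate IIS W3C 'date' and 'time'
# fields into one timestamp before shipping to Graylog.
<Input iis>
    Module  im_file
    File    'C:\inetpub\logs\LogFiles\W3SVC1\u_ex*.log'
    Exec    if $raw_event =~ /^#/ drop();   # skip W3C comment/header lines
    # ... W3C field parsing (e.g. xm_csv) assumed to happen here ...
    <Exec>
        if defined($date) and defined($time)
        {
            # parsedate() should accept 'YYYY-MM-DD HH:MM:SS'
            $EventTime = parsedate($date + " " + $time);
            # drop the bare time-of-day field that ES chokes on
            delete($time);
        }
    </Exec>
</Input>
```

Is this roughly the right approach, or is there a better place (nxlog, Graylog pipeline rules, or an ES index mapping) to fix this?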