JSON Extractor Problem 2


(Habib The Sweet) #1

Hello. I have the same problem as described in Problem with JSON extractor.

I have JSON messages:

{"timestamp":"2017-11-13T17:09:32.878Z","level":"INFO","message":"Exchange [creators/createAllExchangersForUser]: Currencies to gen: ","meta":""}

After enabling a JSON extractor on the message field, I get errors in my server log:

2017-11-13T16:58:56.475Z ERROR [BlockingBatchedESOutput] Unable to flush message buffer
java.lang.ClassCastException: Cannot cast java.lang.String to org.joda.time.DateTime
	at java.lang.Class.cast(Class.java:3369) ~[?:1.8.0_151]
	at org.graylog2.plugin.Message.getFieldAs(Message.java:384) ~[graylog.jar:?]
	at org.graylog2.plugin.Message.getTimestamp(Message.java:189) ~[graylog.jar:?]
	at org.graylog2.indexer.messages.Messages.propagateFailure(Messages.java:181) ~[graylog.jar:?]
	at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:145) ~[graylog.jar:?]
	at org.graylog2.outputs.ElasticSearchOutput.writeMessageEntries(ElasticSearchOutput.java:111) ~[graylog.jar:?]
	at org.graylog2.outputs.BlockingBatchedESOutput.flush(BlockingBatchedESOutput.java:129) [graylog.jar:?]
	at org.graylog2.outputs.BlockingBatchedESOutput.writeMessageEntry(BlockingBatchedESOutput.java:110) [graylog.jar:?]
	at org.graylog2.outputs.BlockingBatchedESOutput.write(BlockingBatchedESOutput.java:92) [graylog.jar:?]
	at org.graylog2.buffers.processors.OutputBufferProcessor$1.run(OutputBufferProcessor.java:191) [graylog.jar:?]
	at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176) [graylog.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_151]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_151]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_151]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_151]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_151]

The timestamp field is valid. How can I resolve this issue?

Graylog 2.3.2+3df951e on localhost (Oracle Corporation 1.8.0_151 on Linux 4.4.0-78-generic)


(Jochen) #2

“timestamp” is a special field and has to be an actual date/time, not a string.

You can either use a “Key prefix” in your JSON extractor, so that the fields are extracted under different names (e.g. “timestamp” → “custom_timestamp”, “level” → “custom_level”), or write a pipeline rule which extracts the JSON payload with parse_json() and then converts the “timestamp” field to an actual date/time with parse_date().
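
A rough sketch of such a rule (assuming the field names from the message above; the exact date pattern depends on your timestamp format, and newer Graylog versions may need to_map() around the parse_json() result):

rule "parse JSON message and fix timestamp"
when
  has_field("message")
then
  // promote the JSON keys to message fields
  let json = parse_json(to_string($message.message));
  set_fields(json);
  // convert the ISO 8601 string into a real date/time (Zulu time = UTC)
  let ts = parse_date(
    value: to_string($message.timestamp),
    pattern: "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'",
    timezone: "UTC");
  set_field("timestamp", ts);
end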


(Habib The Sweet) #3

But it is the actual log timestamp, not some kind of custom timestamp. Can this extractor substitute the timestamp from the JSON field?


(Jochen) #4

No, it’s a string which contains something resembling an ISO-8601 timestamp in Zulu time.

You have to use parse_date() to convert it into a “real” date/time instance as described in my previous post.


(Habib The Sweet) #5

It is an absolutely valid ISO 8601 timestamp: https://www.regexpal.com/97766
Why do I have to use pipeline rules or extract the timestamp to another field when Graylog should parse this normally?


(Jochen) #6

A string object is not a date object; that’s why you have to use parse_date().


(Habib The Sweet) #7

Well, I just renamed the timestamp field from the JSON and got the next error:

ERROR [Messages] Failed to index [3] messages. Please check the index error log in your web interface for the reason. Error: One or more of the items in the Bulk request failed, check BulkResult.getItems() for more information.

{"type":"mapper_parsing_exception","reason":"failed to parse [level]","caused_by":{"type":"number_format_exception","reason":"For input string: \"INFO\""}}


(Jochen) #8

The “level” field should be numeric, mirroring the syslog severity levels:

0 = Emergency, 1 = Alert, 2 = Critical, 3 = Error, 4 = Warning, 5 = Notice, 6 = Informational, 7 = Debug
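
If you want to keep that convention, a sketch of one possible workaround (one rule per level name, since the pipeline rule language has no if/else; “INFO” shown here, the other names work the same way):

rule "map level INFO to syslog 6"
when
  has_field("level") && to_string($message.level) == "INFO"
then
  set_field("level", 6);
end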


(Habib The Sweet) #9

Looks like there is a problem with the Elasticsearch indices. I have two inputs, Kafka and nginx (from Filebeat), and my nginx logs have already mapped the fields with different types.

Is it possible to write the Kafka input to another index?
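
For reference, one quick way to confirm such a mapping conflict (a sketch, assuming Elasticsearch on localhost:9200 and the default graylog_* index naming) is to query the index mapping and check how “level” is typed:

curl -s 'http://localhost:9200/graylog_*/_mapping?pretty' | grep -B 1 -A 3 '"level"'

If the other input indexed first, “level” will show up with a numeric type (e.g. "type": "long"), and a later string value such as "INFO" is rejected with exactly this mapper_parsing_exception.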


(Habib The Sweet) #10

It is okay now. I just created a new index set and a new stream with the “Remove matches from ‘All messages’ stream” option, and the JSON is now parsed as expected. Thank you!


(system) #11

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.