Index set must not be null

Hi All

I am continuously receiving the error message below in my server log and cannot identify the cause. Please help me fix it.
It is related to the stream "kappointment api errors".
Graylog version: 2.4.6
Elasticsearch version: 5.6

2018-10-15T17:23:05.844Z ERROR [AlertScanner] Skipping alert check <null/4d4ca28f-634e-4675-9f3c-639f062927c7>: index set must not be null! (stream id=59662556e4b0c2a97f846913 title="kappointment api errors") (IllegalStateException)
2018-10-15T17:24:05.842Z ERROR [AlertScanner] Skipping alert check <null/4d4ca28f-634e-4675-9f3c-639f062927c7>: index set must not be null! (stream id=59662556e4b0c2a97f846913 title="kappointment api errors") (IllegalStateException)
2018-10-15T17:25:05.743Z ERROR [AlertScanner] Skipping alert check <null/4d4ca28f-634e-4675-9f3c-639f062927c7>: index set must not be null! (stream id=59662556e4b0c2a97f846913 title="kappointment api errors") (IllegalStateException)
2018-10-15T17:26:05.750Z ERROR [AlertScanner] Skipping alert check <null/4d4ca28f-634e-4675-9f3c-639f062927c7>: index set must not be null! (stream id=59662556e4b0c2a97f846913 title="kappointment api errors") (IllegalStateException)

I am also getting another repeating log message on a different node: Invalid format "Mon".

2018-10-15T18:17:03.789Z WARN  [Messages] Failed to index message: index=<graylog_644> id=<8306a745-d0a6-11e8-98f2-12f9741d7c02> error=<{"type":"mapper_parsing_exception","reason":"failed to parse [time]","caused_by":{"type":"illegal_argument_exception","reason":"Invalid format: \"Mon\""}}>
2018-10-15T18:17:03.789Z ERROR [BlockingBatchedESOutput] Unable to flush message buffer
java.lang.ClassCastException: Cannot cast java.lang.Long to org.joda.time.DateTime
	at java.lang.Class.cast(Class.java:3369) ~[?:1.8.0_141]
	at org.graylog2.plugin.Message.getFieldAs(Message.java:481) ~[graylog.jar:?]
	at org.graylog2.plugin.Message.getTimestamp(Message.java:226) ~[graylog.jar:?]
	at org.graylog2.indexer.messages.Messages.propagateFailure(Messages.java:197) ~[graylog.jar:?]
	at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:161) ~[graylog.jar:?]
	at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:128) ~[graylog.jar:?]
	at org.graylog2.outputs.ElasticSearchOutput.writeMessageEntries(ElasticSearchOutput.java:111) ~[graylog.jar:?]
	at org.graylog2.outputs.BlockingBatchedESOutput.flush(BlockingBatchedESOutput.java:129) [graylog.jar:?]
	at org.graylog2.outputs.BlockingBatchedESOutput.forceFlushIfTimedout(BlockingBatchedESOutput.java:154) [graylog.jar:?]
	at org.graylog2.periodical.BatchedElasticSearchOutputFlushThread.doRun(BatchedElasticSearchOutputFlushThread.java:82) [graylog.jar:?]
	at org.graylog2.plugin.periodical.Periodical.run(Periodical.java:77) [graylog.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_141]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_141]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_141]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_141]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_141]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_141]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_141]
2018-10-15T18:17:04.877Z WARN  [Messages] Failed to index message: index=<graylog_644> id=<83a928d4-d0a6-11e8-98f2-12f9741d7c02> error=<{"type":"mapper_parsing_exception","reason":"failed to parse [time]","caused_by":{"type":"illegal_argument_exception","reason":"Invalid format: \"Mon\""}}>
2018-10-15T18:17:04.877Z ERROR [BlockingBatchedESOutput] Unable to flush message buffer
java.lang.ClassCastException: Cannot cast java.lang.Long to org.joda.time.DateTime
	at java.lang.Class.cast(Class.java:3369) ~[?:1.8.0_141]
	at org.graylog2.plugin.Message.getFieldAs(Message.java:481) ~[graylog.jar:?]
	at org.graylog2.plugin.Message.getTimestamp(Message.java:226) ~[graylog.jar:?]
	at org.graylog2.indexer.messages.Messages.propagateFailure(Messages.java:197) ~[graylog.jar:?]
	at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:161) ~[graylog.jar:?]
	at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:128) ~[graylog.jar:?]
	at org.graylog2.outputs.ElasticSearchOutput.writeMessageEntries(ElasticSearchOutput.java:111) ~[graylog.jar:?]
	at org.graylog2.outputs.BlockingBatchedESOutput.flush(BlockingBatchedESOutput.java:129) [graylog.jar:?]
	at org.graylog2.outputs.BlockingBatchedESOutput.writeMessageEntry(BlockingBatchedESOutput.java:110) [graylog.jar:?]
	at org.graylog2.outputs.BlockingBatchedESOutput.write(BlockingBatchedESOutput.java:92) [graylog.jar:?]
	at org.graylog2.buffers.processors.OutputBufferProcessor$1.run(OutputBufferProcessor.java:191) [graylog.jar:?]
	at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176) [graylog.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_141]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_141]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_141]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_141]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_141]

Thanks

The two errors are not related.

For the first, check whether the stream has an index set assigned as the location to store its messages.

The second indicates that the messages you receive carry a date in the time field in a format that cannot be parsed. You need to either reformat the date or map the time field as a string so that any value can be stored in it.

@jan

How can we reformat the date or make the field time a string?

@jan Would you please help me with this ^^ or is there any problem if I leave this error and warning as they are?

To reformat the timestamp you can use the parse_date() processing pipeline function, as described in the documentation.
A custom Elasticsearch mapping would give you the option to store the field as a string.
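As a minimal sketch, a pipeline rule could look like the one below. It assumes the time field arrives as something like "Mon Oct 15 18:17:03 2018" (which would explain the "Mon" in the error); the date pattern is an assumption, so adjust it to match your actual data:

```
rule "parse time field"
when
  has_field("time")
then
  // parse_date(value, pattern) returns a DateTime.
  // The pattern "EEE MMM dd HH:mm:ss yyyy" is a guess based on the
  // "Invalid format: \"Mon\"" error - change it to fit your messages.
  let ts = parse_date(to_string($message.time), "EEE MMM dd HH:mm:ss yyyy");
  set_field("time", ts);
end
```

Attach the rule to a pipeline connected to the affected stream so the field is converted before the message reaches the output.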
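For the custom mapping route, a sketch for Elasticsearch 5.6 could look like the template below. It assumes the default graylog_* index prefix and maps time as a keyword so any value is accepted; note that a template only affects indices created after it is installed, so the change takes effect with the next index rotation:

```
PUT _template/graylog-custom-mapping
{
  "template": "graylog_*",
  "mappings": {
    "message": {
      "properties": {
        "time": {
          "type": "keyword"
        }
      }
    }
  }
}
```

Adjust the template name and index prefix if your setup differs from the defaults.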

ok will do

Thanks @jan

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.