Can't fetch Index Sets or Streams: IOException

Description of your problem

Hi everyone! I just ran into some trouble when updating today…

I updated my Graylog TEST platform from 4.1.2+20cd592, codename Noir, to 4.2. Everything worked well, but when I tried to do the same on my production environment, I got this:

I have only updated one of the nodes so far. Is it a version mismatch? I don’t really know, and I don’t want to make things worse.

Description of steps you’ve taken to attempt to solve the issue

I tried to look for some information online, but I’m neither a Java nor a MongoDB expert, so I’m afraid of breaking things even more.

I read “IOException encountered while reading from a byte array input stream” (#9274), but I don’t have the same “caused by”…

Environmental information

192.168.1.20 = Graylog UI Load Balancer - Apache
192.168.1.21 = Graylog Inputs Load Balancer - NGINX
192.168.1.22 = Graylog Node 1
192.168.1.23 = Graylog Node 2
192.168.1.24 = Graylog Node 3
192.168.1.25 = Elasticsearch 1
192.168.1.26 = Elasticsearch 2
192.168.1.27 = Elasticsearch 3

Operating system information

  • Debian GNU/Linux 10 (buster)

Package versions

  • Graylog = 4.2.1+5442e44, codename Noir (Node 1)
  • Graylog = 4.1.2+20cd592, codename Noir (Nodes 2 & 3)
  • MongoDB = v5.0.1 (Nodes 2 & 3)
  • MongoDB = v5.0.3 (Node 1)
  • Elasticsearch = 7.10.2 (build_flavor: oss)

Relevant server.log part:

Caused by: com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type `org.graylog2.indexer.indexset.IndexSetConfig$TemplateType` from String "failures": value not one of declared Enum instance names: [events, messages, gim_v1]
 at [Source: de.undercouch.bson4jackson.io.LittleEndianInputStream@78dae059; pos: 775] (through reference chain: org.graylog2.indexer.indexset.IndexSetConfig["index_template_type]

And here is the full stack trace:

server.log
2021-11-10T16:31:57.574+01:00 ERROR [AnyExceptionClassMapper] Unhandled exception in REST resource
java.lang.RuntimeException: IOException encountered while reading from a byte array input stream
        at org.mongojack.internal.stream.JacksonDBDecoder.decode(JacksonDBDecoder.java:67) ~[graylog.jar:?]
        at com.mongodb.DBDecoderAdapter.decode(DBDecoderAdapter.java:49) ~[graylog.jar:?]
        at com.mongodb.DBDecoderAdapter.decode(DBDecoderAdapter.java:29) ~[graylog.jar:?]
        at com.mongodb.operation.CommandResultArrayCodec.decode(CommandResultArrayCodec.java:52) ~[graylog.jar:?]
        at com.mongodb.operation.CommandResultDocumentCodec.readValue(CommandResultDocumentCodec.java:60) ~[graylog.jar:?]
        at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:84) ~[graylog.jar:?]
        at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:41) ~[graylog.jar:?]
        at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:41) ~[graylog.jar:?]
        at com.mongodb.internal.connection.ReplyMessage.<init>(ReplyMessage.java:51) ~[graylog.jar:?]
        at com.mongodb.internal.connection.InternalStreamConnection.getCommandResult(InternalStreamConnection.java:413) ~[graylog.jar:?]
        at com.mongodb.internal.connection.InternalStreamConnection.receiveCommandMessageResponse(InternalStreamConnection.java:309) ~[graylog.jar:?]
        at com.mongodb.internal.connection.InternalStreamConnection.sendAndReceive(InternalStreamConnection.java:259) ~[graylog.jar:?]
        at com.mongodb.internal.connection.UsageTrackingInternalConnection.sendAndReceive(UsageTrackingInternalConnection.java:99) ~[graylog.jar:?]
        at com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.sendAndReceive(DefaultConnectionPool.java:450) ~[graylog.jar:?]
        at com.mongodb.internal.connection.CommandProtocolImpl.execute(CommandProtocolImpl.java:72) ~[graylog.jar:?]
        at com.mongodb.internal.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:226) ~[graylog.jar:?]
        at com.mongodb.internal.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:269) ~[graylog.jar:?]
        at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:131) ~[graylog.jar:?]
        at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:123) ~[graylog.jar:?]
        at com.mongodb.operation.CommandOperationHelper.executeCommand(CommandOperationHelper.java:343) ~[graylog.jar:?]
        at com.mongodb.operation.CommandOperationHelper.executeCommand(CommandOperationHelper.java:334) ~[graylog.jar:?]
        at com.mongodb.operation.CommandOperationHelper.executeCommandWithConnection(CommandOperationHelper.java:220) ~[graylog.jar:?]
        at com.mongodb.operation.FindOperation$1.call(FindOperation.java:731) ~[graylog.jar:?]
        at com.mongodb.operation.FindOperation$1.call(FindOperation.java:725) ~[graylog.jar:?]
        at com.mongodb.operation.OperationHelper.withReadConnectionSource(OperationHelper.java:463) ~[graylog.jar:?]
        at com.mongodb.operation.FindOperation.execute(FindOperation.java:725) ~[graylog.jar:?]
        at com.mongodb.operation.FindOperation.execute(FindOperation.java:89) ~[graylog.jar:?]
        at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:196) ~[graylog.jar:?]
        at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:177) ~[graylog.jar:?]
        at com.mongodb.DBCursor.initializeCursor(DBCursor.java:989) ~[graylog.jar:?]
        at com.mongodb.DBCursor.hasNext(DBCursor.java:172) ~[graylog.jar:?]
        at org.mongojack.DBCursor.hasNext(DBCursor.java:330) ~[graylog.jar:?]
        at com.google.common.collect.ImmutableList.copyOf(ImmutableList.java:268) ~[graylog.jar:?]
        at org.graylog2.indexer.indexset.MongoIndexSetService.findAll(MongoIndexSetService.java:125) ~[graylog.jar:?]
        at org.graylog2.rest.resources.system.indexer.IndexSetsResource.list(IndexSetsResource.java:143) ~[graylog.jar:?]
        at jdk.internal.reflect.GeneratedMethodAccessor873.invoke(Unknown Source) ~[?:?]
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
        at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
        at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52) ~[graylog.jar:?]
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:124) ~[graylog.jar:?]
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:167) ~[graylog.jar:?]
        at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:219) ~[graylog.jar:?]
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:79) ~[graylog.jar:?]
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:469) ~[graylog.jar:?]
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:391) ~[graylog.jar:?]
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:80) ~[graylog.jar:?]
        at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:255) [graylog.jar:?]
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248) [graylog.jar:?]
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244) [graylog.jar:?]
        at org.glassfish.jersey.internal.Errors.process(Errors.java:292) [graylog.jar:?]
        at org.glassfish.jersey.internal.Errors.process(Errors.java:274) [graylog.jar:?]
        at org.glassfish.jersey.internal.Errors.process(Errors.java:244) [graylog.jar:?]
        at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265) [graylog.jar:?]
        at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:234) [graylog.jar:?]
        at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:680) [graylog.jar:?]
        at org.glassfish.jersey.grizzly2.httpserver.GrizzlyHttpContainer.service(GrizzlyHttpContainer.java:356) [graylog.jar:?]
        at org.glassfish.grizzly.http.server.HttpHandler$1.run(HttpHandler.java:200) [graylog.jar:?]
        at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:180) [graylog.jar:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
        at java.lang.Thread.run(Thread.java:829) [?:?]
Caused by: com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type `org.graylog2.indexer.indexset.IndexSetConfig$TemplateType` from String "failures": value not one of declared Enum instance names: [events, messages, gim_v1]
 at [Source: de.undercouch.bson4jackson.io.LittleEndianInputStream@5e57cdd6; pos: 775] (through reference chain: org.graylog2.indexer.indexset.IndexSetConfig["index_template_type]
        at com.fasterxml.jackson.databind.exc.InvalidFormatException.from(InvalidFormatException.java:67) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.DeserializationContext.weirdStringException(DeserializationContext.java:1549) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.DeserializationContext.handleWeirdStringValue(DeserializationContext.java:911) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.deser.std.EnumDeserializer._deserializeAltString(EnumDeserializer.java:255) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.deser.std.EnumDeserializer.deserialize(EnumDeserializer.java:179) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:530) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeWithErrorWrapping(BeanDeserializer.java:528) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:417) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1287) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:326) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:159) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:3985) ~[graylog.jar:?]
        at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2343) ~[graylog.jar:?]
        at org.mongojack.internal.stream.JacksonDBDecoder.decode(JacksonDBDecoder.java:80) ~[graylog.jar:?]
        at org.mongojack.internal.stream.JacksonDBDecoder.decode(JacksonDBDecoder.java:64) ~[graylog.jar:?]
        ... 64 more
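
If it helps with the diagnosis, the value Jackson complains about lives in the index set documents in MongoDB, so it can be listed directly from a mongo shell. A minimal check, assuming the default graylog database and the index_sets collection (which is where Graylog keeps these documents, as far as I know):

    # run against one of the MongoDB nodes; use mongosh or the legacy mongo shell, whichever is installed
    mongosh graylog --eval 'db.index_sets.find({}, { title: 1, index_template_type: 1 }).forEach(printjson)'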

Thanks everyone!!!

UPDATE: I restarted the non-updated nodes. They no longer show up in the web UI under “System/Nodes”… It looks like the cluster is now made of a single node… Is it because of a Graylog version mismatch?
I have more and more unprocessed logs :cold_sweat:

The server.log of the updated node tells me this :

[35]: index [firewall_graylog__2], type [_doc], id [4cfdb462-4243-11ec-944b-00155da310d6], message [ElasticsearchException[Elasticsearch exception [type=mapper_parsing_exception, reason=failed to parse field [event_received_time] of type [date] in document with id '4cfdb462-4243-11ec-944b-00155da310d6'. Preview of field's value: '2021-11-10T17:28:55.000Z']]; nested: ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=failed to parse date field [2021-11-10T17:28:55.000Z] with format [yyyy/MM/dd HH:mm:ss||yyyy/MM/dd||epoch_millis]]]; nested: ElasticsearchException[Elasticsearch exception [type=date_time_parse_exception, reason=date_time_parse_exception: Failed to parse with all enclosed parsers]];]
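
If it helps, the mapping Elasticsearch currently applies to that field can be checked directly against the index named in the log line; something like this should show it (the host is just one of my Elasticsearch nodes, adjust as needed):

    curl -s 'http://192.168.1.25:9200/firewall_graylog__2/_mapping/field/event_received_time?pretty'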

My unofficial thought is that it is a versioning error, since you are now seeing a date/time error and I have seen other posts where updating to 4.1.2 fixed a time zone issue someone was having (this post).

It’s hard for me to judge the production environment; my preference in this instance would be to finish the upgrade sequence and work on issues once complete. You could go down a rabbit hole that isn’t worth it if finishing the upgrade could fix it, and the alternative of rolling back production a third of the way through may be just as painful, if not more.

Just chiming in: by chance, have you tried manually rotating your index called firewall_graylog__2 to see if it clears things up?
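
If you’d rather trigger the rotation from the command line than from the UI (System → Indices → select the index set → Maintenance → Rotate active write index), I believe the REST API exposes it roughly like this, where <index-set-id> is the ID shown in the index set’s URL in the web interface (a sketch, not something I have tested on 4.2):

    curl -u admin -X POST \
         -H 'X-Requested-By: cli' \
         'http://192.168.1.22:9000/api/system/deflector/<index-set-id>/cycle'
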
Have you seen this post when updating a cluster?

Also, your MongoDB is version 5.0. I’m not 100% sure, but I think the supported version should be no higher than MongoDB 4.4.
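
If you want to double-check what the cluster is actually running, something like this should print both the server version and the feature compatibility version (using mongosh, or the legacy mongo shell if that is what is installed):

    mongosh --eval 'db.version()'
    mongosh --eval 'db.adminCommand({ getParameter: 1, featureCompatibilityVersion: 1 })'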


Hello everyone, and thanks for taking some time to answer me.

I managed to do a quick fix by completing the update of all three nodes and performing an index rotation. Phew.

I still have a problem: it seems that Graylog can’t index some messages…

[image: screenshot of the message counts graphic]

However, I don’t see any gaps or missing data in the indices, and the routing to the streams is done correctly…

Looking at the log files, I don’t see anything in particular, either from Graylog or Elasticsearch.
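
In case it is useful, the per-node journal state can also be queried over the API to see whether messages are really piling up; something along these lines should work (host and credentials are placeholders from my setup):

    curl -u admin 'http://192.168.1.22:9000/api/system/journal'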

Also, your MongoDB is version 5.0. I’m not 100% sure, but I think the supported version should be no higher than MongoDB 4.4.

I think I read something about that, but things had been set up this way and have been working fine so far. I considered it, but I feel that downgrading MongoDB would do more harm than good…

Have a great day!


The utilization in your graphic is real-time… which likely means you are consistently getting 16 messages each time it checks… If it were unable to process messages, you would see that somewhere else, such as the server logs…

Glad that you were able to resolve the main issue! Mark an answer so anyone who searches for something similar can find it!

Good job :smiley: and thanks for posting what you did.

Rule #51: If it isn’t broke, don’t fix it. :slight_smile:


Thanks a lot to both of you.

I am following the situation closely as it evolves.

I’ll write back at the end of the week to let you know what’s going on with the unprocessed messages, if I learn more!

Thanks again and have a great day!

