Graylog Index Failures

I am continuously getting index failure messages. The error message is:

{"type":"illegal_argument_exception","reason":"Document contains at least one immense term in field=\"class_name\" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[82, 101, 115, 112, 111, 110, 115, 101, 32, 114, 101, 99, 101, 105, 118, 101, 100, 32, 102, 111, 114, 32, 103, 101, 116, 32, 100, 101, 97, 108]...', original message: bytes can be at most 32766 in length; got 35437","caused_by":{"type":"max_bytes_length_exceeded_exception","reason":"max_bytes_length_exceeded_exception: bytes can be at most 32766 in length; got 35437"}}

Graylog version that I am using: 3.3.2
Elasticsearch version that I am using: 6.8

I tried this solution as well, but it didn't work:

I ran the curl command from that solution on the ES master and ES data nodes.

Can anyone please let me know how we can resolve this error?


You need to create a custom mapping for this field that sets `ignore_above`, like the one described in the comment, but specific to YOUR field that exceeds the limit (here `class_name`, per the error message).
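A minimal sketch of such a custom mapping, assuming the default Graylog index prefix `graylog_` and the `class_name` field from the error above (the template name and the `ignore_above` threshold of 32000 are example choices; anything at or below Lucene's 32766-byte limit works). With Elasticsearch 6.8 and Graylog 3.x, Graylog indexes messages under the `message` mapping type, so an index template along these lines should apply to newly created indices:

```json
{
  "template": "graylog_*",
  "mappings": {
    "message": {
      "properties": {
        "class_name": {
          "type": "keyword",
          "ignore_above": 32000
        }
      }
    }
  }
}
```

Save this as e.g. `graylog-custom-mapping.json`, install it with something like `curl -X PUT -H 'Content-Type: application/json' -d @graylog-custom-mapping.json 'http://localhost:9200/_template/graylog-custom-mapping?pretty'`, and then manually rotate the active write index in Graylog (System / Indices) so the template takes effect — templates only apply to indices created after the template exists. With `ignore_above`, values longer than the threshold are stored in `_source` but not indexed, so the immense-term rejection goes away at the cost of those oversized values not being searchable on that field.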

OK will try, Thanks @jan

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.