"Document contains at least one immense term" error

1. Describe your incident:

In the server.log file we see index failures. I have read a lot of comments around this, but I'm not sure what my options are.

2024-01-09T18:30:19.100+01:00 WARN [MessagesAdapterES6] Failed to index message: index=<index_321> id=<c1137a73-af14-11ee-8a69-005056a775aa> error=<{"type":"illegal_argument_exception","reason":"Document contains at least one immense term in field=\"@m\" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[80, 117, 98, 108, 105, 115, 104, 32, 82, 97, 98, 98, 105, 116, 77, 81, 32, 109, 101, 115, 115, 97, 103, 101, 58, 32, 34, 123, 92, 34]...', original message: bytes can be at most 32766 in length; got 35994","caused_by":{"type":"max_bytes_length_exceeded_exception","reason":"bytes can be at most 32766 in length; got 35994"}}>

I understand that the source app is sending a field that is too large. I would like to understand my options at the Elasticsearch or Graylog level, without modifying the source app.

The logs are shipped with filebeat.
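One option at the Graylog level is a pipeline rule that caps the field length before the message reaches Elasticsearch (Graylog's pipeline functions include `abbreviate()` for this kind of truncation). The underlying idea, sketched in Python: cut the value so its UTF-8 encoding stays under the Lucene term limit of 32766 bytes, without splitting a multi-byte character. The field name `output_data` and the byte sizes come from the error above; the helper name is illustrative, not a Graylog API.

```python
# Lucene rejects any single indexed term whose UTF-8 encoding
# exceeds this many bytes (the limit quoted in the error log).
MAX_TERM_BYTES = 32766

def truncate_utf8(value: str, max_bytes: int = MAX_TERM_BYTES) -> str:
    """Return value, truncated so its UTF-8 encoding fits in max_bytes."""
    encoded = value.encode("utf-8")
    if len(encoded) <= max_bytes:
        return value
    # Cut at the byte limit, then drop any trailing partial character
    # left over from a multi-byte sequence.
    return encoded[:max_bytes].decode("utf-8", errors="ignore")

# Example: a 35994-byte value (the size reported in the error)
# comes back at or under the limit.
big_output_data = "x" * 35994
trimmed = truncate_utf8(big_output_data)
assert len(trimmed.encode("utf-8")) <= MAX_TERM_BYTES
```

The trade-off is that anything past the cut is lost for search, but the message itself is no longer rejected.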

2. Describe your environment:

  • OS Information:
    debian 10
  • Package Version:
    graylog 4.0.16
    elasticsearch 6.8.23

3. What steps have you already taken to try and solve the problem?

4. How can the community help?


If you click on the field name in Graylog, what is the data type of that field? It says right at the top of the value menu that pops up.


I wasn't able to find this information in the Graylog interface.
I looked directly in ES, and it seems this field is set to keyword:

          "output_data" : {
            "type" : "keyword"
          }

Does this make sense to you?
Should I convert this to another type?
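If the field should stay a `keyword`, one common approach on the Elasticsearch side is the `ignore_above` mapping parameter: values longer than the threshold are simply not indexed (the document is still stored), instead of the whole document being rejected with the immense-term error. A sketch of what the mapping could look like (the 32766-byte limit applies to the UTF-8 encoding, so a conservative character threshold such as 8191, i.e. 32766 / 4 bytes per character, is often used; whether this fits depends on whether you still need to search the full value):

```
          "output_data" : {
            "type" : "keyword",
            "ignore_above" : 8191
          }
```

Note that Graylog manages its Elasticsearch mappings through index templates, so a change like this would typically go into a custom index template so that it applies to newly rotated indices rather than being hand-edited on the current one.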
