Ok so I've been getting these indexing errors lately:
java.lang.IllegalArgumentException: Document contains at least one immense term in field="trace" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[42, 42, 32, 80, 111, 108, 105, 99, 121, 68, 101, 116, 97, 105, 108, 32, 99, 111, 110, 118, 101, 114, 116, 101, 100, 32, 116, 111, 32, 58]...', original message: bytes can be at most 32766 in length; got 65539
First, how can I find which input that 'trace' field is coming from? I want to avoid this issue, but I don't recall which source sends messages with a 'trace' field or where it enters the pipeline.
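One clue is already in the exception itself: the "prefix of the first immense term" is a list of UTF-8 byte values, so decoding it shows how the offending message starts, which may help identify the source. A quick way to decode it (the byte list below is copied from the error above):

```python
# Byte values copied from the "prefix of the first immense term" in the error
prefix = [42, 42, 32, 80, 111, 108, 105, 99, 121, 68, 101, 116, 97, 105, 108,
          32, 99, 111, 110, 118, 101, 114, 116, 101, 100, 32, 116, 111, 32, 58]

# Decode the raw bytes as UTF-8 to recover the start of the field value
print(bytes(prefix).decode("utf-8"))  # -> ** PolicyDetail converted to :
```

So in this case the field starts with "** PolicyDetail converted to :", which should narrow down which application is emitting it.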
Second, once that indexing error occurs, indexing halts altogether: the process buffer fills to 100% and the journal keeps growing. How do I clear this so indexing continues?
EDIT: indexing doesn't completely halt, but it becomes very slow after the error; some logs are only processed 2-3 hours after they arrive.
EDIT2: basically this is happening but I can’t afford to flush the journal… failed to execute bulk item. No new messages are accepted in beats input. · Issue #4130 · Graylog2/graylog2-server · GitHub
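For reference, one way to stop the error from recurring (without flushing the journal) would be to truncate the field before it reaches the indexer, e.g. in a processing step upstream. The key detail is that Lucene's 32766 limit is on UTF-8 *bytes*, not characters, so a naive character-count cut can still overflow. A minimal sketch of safe byte-length truncation in Python (the function name is my own, not a Graylog API):

```python
def truncate_utf8(value: str, max_bytes: int = 32766) -> str:
    """Truncate a string so its UTF-8 encoding fits in max_bytes.

    errors="ignore" on decode drops any multi-byte character that the
    byte-level cut would otherwise split in half.
    """
    encoded = value.encode("utf-8")
    if len(encoded) <= max_bytes:
        return value
    return encoded[:max_bytes].decode("utf-8", errors="ignore")
```

In Graylog itself the equivalent would be a pipeline rule or extractor that shortens the 'trace' field; the snippet just illustrates the byte-length logic.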