jochen (Jochen)
February 22, 2017, 9:58pm
The problem is that Lucene (the search library underlying Elasticsearch) can only index terms up to 32766 bytes (roughly 32 KB) in length. Larger values can be stored, but they cannot be indexed or analyzed.
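As a sketch of how to work around this (the index and field names here are hypothetical), you can map an oversized field with "index": false so Elasticsearch keeps it in _source without trying to index it, or use "ignore_above" on a keyword field so that overlong values are stored but silently excluded from the index:

```json
PUT /graylog_example
{
  "mappings": {
    "properties": {
      "raw_payload": {
        "type": "keyword",
        "index": false
      },
      "message": {
        "type": "keyword",
        "ignore_above": 10000
      }
    }
  }
}
```

With this mapping, documents containing very large values in these fields are accepted instead of being rejected with the "bytes can be at most 32766 in length" error; the trade-off is that those values are no longer searchable.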
Also see this Discourse topic and this GitHub issue:
Hi guys,
I’ve deployed Graylog as a syslog solution, currently using Sidecar to collect Windows event logs only.
It had been running for a week when I started adding some more hosts … then, poof, Graylog fell over. Initially I was clueless as to what was going on.
After a bit of digging, I found the dreaded Elasticsearch error, which seems to be quite common (bytes can be at most 32766 in length).
I have found a few articles where people say to update the analyzer, and some others that mention set…
opened 05:03PM - 14 Jan 15 UTC · labels: elasticsearch, feature, triaged
Elasticsearch has an upper limit for term length, so trying to index values longer than ~32kb fails with an error.
Find a way to store those values without trying to analyze them.