ElasticsearchException ... Limit of total fields [1000] has been exceeded

Description of your problem

I'm getting "ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=Limit of total fields [1000] has been exceeded]]".

Description of steps you’ve taken to attempt to solve the issue

I managed to work around this temporarily by moving all Winlogbeat logs into their own index. But for the last couple of months I've had to raise the limit from 1000 to 1300 by issuing the following command:
curl -XPUT 'http://logserver:9200/graylog_14/_settings' -H 'Content-Type: application/json' -d '{ "index.mapping.total_fields.limit": 1300 }'

I know it's not the best solution, but it's the best one I know of. Since today's upgrade to graylog-server-4.1.5-1.noarch on RedHat 8, though, it no longer works. I don't know if it's because of the upgrade or if this just happens to be a bad day…
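One likely reason the fix has to be repeated is that `index.mapping.total_fields.limit` is a per-index setting: each time Graylog rotates to a new index (graylog_15, graylog_16, …), the fresh index starts with the default of 1000 again. A sketch of a more durable approach, assuming Elasticsearch 6.x/7.x and the same host and 1300 value from the post above, is a legacy index template with a higher `order` so every future `graylog_*` index inherits the raised limit:

```shell
# Sketch, not a tested fix: write a legacy index template that raises the
# field limit for all future graylog_* indices. The hostname and the 1300
# value come from the original post; adjust both for your environment.
cat > total-fields-template.json <<'EOF'
{
  "index_patterns": ["graylog_*"],
  "order": 10,
  "settings": {
    "index.mapping.total_fields.limit": 1300
  }
}
EOF

# Apply it (needs a reachable Elasticsearch, so shown here but not executed):
# curl -XPUT 'http://logserver:9200/_template/graylog-total-fields' \
#      -H 'Content-Type: application/json' -d @total-fields-template.json
```

The template only affects indices created after it is installed, so the setting takes effect at the next index rotation. Note that Graylog manages its own index templates, which is why the custom template uses a higher `order` value.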

Environmental information

Operating system information

RedHat 8.4 with latest patches

Package versions


If I look at the two most recent indices, there is some difference:

curl 'http://IP:9200/graylog_14/_settings'

curl 'http://IP:9200/graylog_15/_settings'
I don't know how to solve this... Thanks!!
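A minimal sketch of what the difference between those two `_settings` calls probably looks like, using hypothetical trimmed responses in place of the real output (graylog_14 carrying the manually raised limit, the freshly rotated graylog_15 falling back to the default):

```shell
# Sketch: compare the field limit reported by the two _settings responses.
# On a real server you would save the actual output first, e.g.:
#   curl -s 'http://IP:9200/graylog_14/_settings' -o graylog_14-settings.json
diff_out=$(python3 <<'EOF'
import json

# Hypothetical trimmed responses standing in for the real curl output:
s14 = {"graylog_14": {"settings": {"index": {"mapping": {"total_fields": {"limit": "1300"}}}}}}
s15 = {"graylog_15": {"settings": {"index": {}}}}

def limit(resp):
    # A setting absent from _settings means the index uses the default.
    idx = next(iter(resp.values()))["settings"]["index"]
    return idx.get("mapping", {}).get("total_fields", {}).get("limit", "1000 (default)")

print("graylog_14:", limit(s14))
print("graylog_15:", limit(s15))
EOF
)
echo "$diff_out"
```

If the real output matches this pattern, it confirms the raised limit never carried over to the new index, which is consistent with the per-index nature of the setting.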

1000-1300 field names… that's a lot of fields - chances are something is parsing incorrectly and you are picking up ever-changing data as field names. Check any extractors, GROK patterns, set_fields() functions, etc. in your Winlogbeat pipeline, and look at some of your messages for clues about field names that are actually data… Once you get that under control, you can rotate your index or create a new one to clear out the unused fields.
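One way to follow this advice is to dump the index mapping and count the field names, then eyeball the list for names that look like data. A sketch, using a tiny inline mapping in place of the real dump (on your server you would first fetch it with something like `curl -s 'http://logserver:9200/graylog_15/_mapping' -o graylog_15-mapping.json`):

```shell
# Sketch: count mapped field names to spot runaway dynamic fields.
fields_report=$(python3 <<'EOF'
import json

def collect_fields(properties, prefix=""):
    """Recursively collect leaf and object field names from a mapping."""
    names = []
    for name, spec in properties.items():
        full = f"{prefix}{name}"
        names.append(full)
        # nested objects contribute their own sub-fields
        if "properties" in spec:
            names.extend(collect_fields(spec["properties"], full + "."))
    return names

# Tiny inline example standing in for the real mapping dump:
mapping = {"graylog_15": {"mappings": {"properties": {
    "message": {"type": "text"},
    "winlogbeat": {"properties": {"event_id": {"type": "keyword"}}},
}}}}

for index, body in mapping.items():
    fields = collect_fields(body["mappings"]["properties"])
    print(index, len(fields))
    # print(sorted(fields))  # scan this list for field names that are really data
EOF
)
echo "$fields_report"
```

Against a real mapping dump, a count that keeps creeping upward between rotations, or a field list full of hostnames, GUIDs, or timestamps, is the telltale sign of data leaking into field names.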


Thanks @tmacgbay, will try removing some extractors.