ElasticsearchException ... Limit of total fields [1000] has been exceeded

Description of your problem

Hello,
I'm getting "ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=Limit of total fields [1000] has been exceeded]]".

Description of steps you’ve taken to attempt to solve the issue

I managed to solve this temporarily by moving all Winlogbeat logs into their own index. But for the last couple of months I've had to increase the limit from 1000 to 1300 by issuing the following command:
curl -XPUT 'http://logserver:9200/graylog_14/_settings' -H 'Content-Type: application/json' -d '{ "index.mapping.total_fields.limit": 1300 }'

I know it's not the best solution, but it's the best one I know of. However, since upgrading today to graylog-server-4.1.5-1.noarch on RedHat 8, it no longer works. I don't know if it's because of the upgrade or if it just happens to be a bad day…
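For reference, the per-index `_settings` call above only applies to the index it is run against; every new index Graylog creates on rotation starts from the index template again, which is why I have to repeat the change. A rough sketch of carrying the limit forward with a custom legacy index template (the template name and order value are my own guesses, not something I have verified against the Graylog docs):

```
# Hedged sketch: a custom legacy index template so newly rotated graylog_*
# indices also pick up the higher field limit. The template name and the
# order value are assumptions, not verified settings.
curl -XPUT 'http://logserver:9200/_template/graylog-custom-settings' \
  -H 'Content-Type: application/json' \
  -d '{
        "index_patterns": ["graylog_*"],
        "order": 10,
        "settings": { "index.mapping.total_fields.limit": 1300 }
      }'
```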

Environmental information

Operating system information

RedHat 8.4 with latest patches

Package versions

elasticsearch-oss-7.10.2-1.x86_64
mongodb-org-server-4.2.16-1.el8.x86_64
graylog-server-4.1.5-1.noarch

If I look at the two most recent indices, the only difference I can see is that graylog_14 has a write block set while graylog_15 does not; both show the 1300 field limit:

```
curl 'http://IP:9200/graylog_14/_settings'
{"graylog_14":{"settings":{"index":{"mapping":{"total_fields":{"limit":"1300"}},"number_of_shards":"4","blocks":{"write":"true","metadata":"false","read":"false"},"provided_name":"graylog_14","creation_date":"1629676803648","analysis":{"analyzer":{"analyzer_keyword":{"filter":"lowercase","tokenizer":"keyword"}}},"number_of_replicas":"0","uuid":"sEiychjxTC2xtI3dTW027w","version":{"created":"7100299"}}}}} 


curl 'http://IP:9200/graylog_15/_settings'
{"graylog_15":{"settings":{"index":{"mapping":{"total_fields":{"limit":"1300"}},"number_of_shards":"4","provided_name":"graylog_15","creation_date":"1630886401586","analysis":{"analyzer":{"analyzer_keyword":{"filter":"lowercase","tokenizer":"keyword"}}},"number_of_replicas":"0","uuid":"6ClHO1OjTemcUTriq7g1_A","version":{"created":"7100299"}}}}}
```
I don't know how to solve this... Thanks!!

1000-1300 field names… that's a lot of fields - chances are something is parsing wrong and you are picking up ever-changing data as field names. Check out any extractors, GROK patterns, set_fields() functions etc. in your Winlogbeat processing and look at some of your messages for clues on field names that are actually data… Once you get that under control you can rotate your index or create a new one to clear out the unused fields.
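One quick way to eyeball them (rough sketch - it assumes your newest index is graylog_15 and that python3 is available where you run curl; nested fields also count toward the limit, so the count is only approximate) is to dump the mapping and list the top-level field names so the ones that are really data stand out:

```
# Rough sketch: dump the mapping of the newest index (graylog_15 is an
# assumption) and print the top-level field names plus a count. Nested
# fields also count toward the total-fields limit, so treat the number
# as a lower bound.
curl -s 'http://logserver:9200/graylog_15/_mapping' \
  | python3 -c "import json,sys; p=json.load(sys.stdin)['graylog_15']['mappings']['properties']; print(len(p), 'top-level fields'); print('\n'.join(sorted(p)))"
```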


Thanks @tmacgbay, will try removing some extractors.