Hi, I’m using Graylog 4.0.7.
Currently I’m receiving a complex JSON object in the message, so I used a JsonExtractor to break it out into key/value fields.
I used the dot character as the key separator.
In the simulator it works as intended.
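For example (the field names here are purely illustrative), a message body like

```
{"user": {"id": 42, "name": "alice"}, "request": {"path": "/api/items", "duration_ms": 12}}
```

is flattened by the extractor into dotted fields such as user.id, user.name, request.path and request.duration_ms.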
Having dots in field names is possible in Elasticsearch but not recommended - they can cause conflicts with inner object fields. I imagine Graylog cleans that up as it processes the message through to Elasticsearch.
A little more challenging: create a custom mapping in Elasticsearch where you force that field in that index to be numeric (be sure to rotate the index after setting up the custom mapping):
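A rough sketch of such a template (untested; response_time is just a placeholder field name, and the exact format depends on your Elasticsearch version - on ES 6 the properties need to sit under the message type inside mappings):

```
PUT _template/graylog-custom-mapping
{
  "index_patterns": ["graylog_*"],
  "order": 0,
  "mappings": {
    "properties": {
      "response_time": { "type": "long" }
    }
  }
}
```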
A field is arriving into Graylog whose value is an object serialized as a string.
In fact, looking at the data with Kibana, and in the Graylog stream too, the value still contains the quotes and the escaped characters.
So I used the JSON extractor.
The best approach would be to configure Graylog to manage it as an object, not as a string. That way I could run queries using JSON syntax.
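To illustrate (the field name is just an example), today the field shows up as one escaped string:

```
payload: "{\"user\":{\"id\":42},\"status\":\"ok\"}"
```

while I would like it indexed as real fields, e.g. payload.user.id = 42 and payload.status = ok, so that I can query them directly.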
There are pipeline functions that allow you to process JSON - Functions — Graylog 4.1.0 documentation. I haven’t worked with JSON much, so I won’t be of much help, but I recall there were quite a few questions in the forums here that you could search to find a solution…
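That said, the pattern that comes up again and again in those threads is a rule built around parse_json() and set_fields(). A rough, untested sketch (json_payload is just a placeholder for whatever field holds the serialized object):

```
rule "parse embedded json"
when
  has_field("json_payload")
then
  // parse the string value into a JSON tree
  let parsed = parse_json(to_string($message.json_payload));
  // promote the top-level keys to individual message fields
  set_fields(to_map(parsed));
end
```

If you’re worried about clobbering existing fields, I believe set_fields() also accepts an optional prefix parameter.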