Where do JSON-extracted logs go?


I’m a Graylog newbie, running it with Docker, and I’m trying to parse my JSON logs with extractors.
I’ve followed the Graylog documentation:

I loaded the latest message, added an extractor on the message field, left all the defaults, and hit OK.
Here is my extractor description:
Now, where can I see my extracted logs? Do I have to import this extractor?
Because in the search, all the JSON message logs disappear :confused:
Since I used the “copy” strategy, shouldn’t I see duplicated logs?

Thanks a lot for your answer :slight_smile:

What’s in the logs of your Graylog nodes? Maybe the messages with extracted JSON fail to be indexed into Elasticsearch.

Thanks for your answer.
I restarted the Docker stack and I’m now able to set up an extractor and see my logs in the main search section.
What is the best strategy for dealing with these logs now?
I now have one input (GELF UDP); for each message coming into this input, I try to extract JSON from the “message” field.
Could I set up a stream to filter my logs according to an extracted value, for example?
Is that a good approach (I mean, a proper way for production usage)?
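To check my understanding, here is roughly my mental model of what the JSON extractor does to each incoming message (just a Python sketch of the concept, not Graylog’s actual code):

```python
import json

def extract_json_fields(message):
    """Sketch of a JSON extractor: parse the message string and
    promote each top-level key to its own message field."""
    try:
        data = json.loads(message)
    except ValueError:
        # Not valid JSON: nothing gets extracted.
        return {}
    return dict(data)

fields = extract_json_fields('{"name":"Graylog","id":34,"values":[0.079785526,0.27987057]}')
# fields now holds name, id, and values as separate searchable fields
```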

And if I want to search for a specific value (for example a server-id that I have in my JSON), do I just search across all received logs for those where server-id equals a specific value?
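For example, since the extractor turns server-id into its own message field, I imagine I could query that field directly with Graylog’s Lucene-style search syntax instead of scanning the raw message (assuming the extracted field ended up named server_id):

```
server_id:42
```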

Thanks a lot, everything is working great :slight_smile:

EDIT: Another question I don’t see covered in the docs. I’ve set up a JSON extractor from a message (`{"name":"Graylog","id":34,"values":[0.079785526,0.27987057]}`). If this input receives a new log with an optional JSON field (`{"name":"Graylog","id":34,"test":true,"values":[0.079785526,0.27987057]}`), will it still decode this message, or will it fail because it doesn’t know the field test? Do I need to create an extractor for each of my different log structures?
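For what it’s worth, plain JSON parsing itself has no problem with the extra optional key; it just shows up as one more top-level field, as a quick local test shows (whether the Graylog extractor behaves the same way is exactly my question):

```python
import json

# The two payloads from above; the optional "test" key parses fine
# and would simply become one more extracted field.
base = json.loads('{"name":"Graylog","id":34,"values":[0.079785526,0.27987057]}')
extended = json.loads('{"name":"Graylog","id":34,"test":true,"values":[0.079785526,0.27987057]}')
```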
