Extract JSON via Pipeline

Hello,

I am running Graylog 4.2.9+f0d8298. The logs are picked up from a Tomcat JSON log by Filebeat and shipped to Graylog via a Beats input.
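For context, the Filebeat side looks roughly like this (a minimal sketch; the log path and host are placeholders, and it assumes the application field checked by the pipeline rule is attached by Filebeat):

filebeat.inputs:
- type: log
  paths:
    - /opt/tomcat/logs/app.json.log    # hypothetical path to the Tomcat JSON log
  fields:
    application: java                  # provides the "application" field the rule matches on
  fields_under_root: true              # put the field at the top level of the event
output.logstash:
  hosts: ["graylog.example.org:5044"]  # the Graylog Beats input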

The messages are routed into the pipeline, and according to the counters they are going through the rules. But sadly the rules don't seem to make any changes.

The responsible stage 0 rule looks like this:

rule "JSON Parser" 
when 
   has_field("application") AND
   (to_string($message.application)) == "java"
then   
   set_fields(to_map(parse_json(to_string($message.message))));   
end

The content of the original message field looks like this:

{"level":"INFO","timestamp":"Fri Jul 22 10:36:50 CEST 2022","thread":"ajp-bio-8009-exec-12","loggerName":"com.xyz.xyz.xyz","SessionID":"null","message":"objectId: c9999"}

I already reordered the Message Processors configuration,

but I still don't get any new fields, even though the counters show that the messages are processed by this rule. Maybe someone has a hint; I am a bit lost right now :laughing:

I think you want the Message Filter Chain first; issues usually pop up when the Pipeline Processor comes first, like you have it. I don't think that's the issue here, though: if $message.application is only populated by the Message Filter Chain, the rule will fail its when clause.
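For reference, the order that usually works is the following (names as they appear under System → Configurations → Message Processors Configuration; the exact entries may differ per version):

1. AWS Instance Name Lookup
2. GeoIP Resolver
3. Message Filter Chain
4. Pipeline Processor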

Your rule looks like it would work to me, but when you pack it all into one statement it's hard to use debug() in your pipeline rule to help figure out what is going on. Broken apart it would look like this… a little overkill, but it should give you clues as to what is going on:

rule "JSON Parser" 
when 
   has_field("application") AND
   (to_string($message.application)) == "java"
then   

    let the_json = parse_json(to_string($message.message));
    debug(concat("The json: ", to_string(the_json)));
    
    let the_map = to_map(the_json);
    debug(concat("The map: ", to_string(the_map)));

   set_fields(the_map);   
end

And you can watch for results by tailing the Graylog logs:

tail -f /var/log/graylog-server/server.log
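The debug() output in the server log is prefixed with PIPELINE DEBUG, so you can narrow the tail down to just those lines:

tail -f /var/log/graylog-server/server.log | grep 'PIPELINE DEBUG'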

Hello, thanks, that made Graylog much more talkative. It seems I have a problem storing the data in Elasticsearch. I see a lot of errors like:

index [graylog_1], type [_doc], id [8462cc60-0c14-11ed-b771-566f4f06010b], message [ElasticsearchException[Elasticsearch exception [type=mapper_parsing_exception, reason=failed to parse field [level] of type [long] in document with id '8462cc60-0c14-11ed-b771-566f4f06010b'. Preview of field's value: 'INFO']]; nested: ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=For input string: "INFO"]];]

Apparently the index had mapped level as type long (presumably via dynamic mapping, from earlier messages where the field happened to be numeric), so the string "INFO" cannot be indexed. I tried to add a custom mapping for "level" like this:

{
  "template": "graylog_*",
  "mappings": {
    "message": {
      "properties": {
        "level": {
          "type": "keyword",
          "index": "true"
        }
      }
    }
  }
}

which just ends up in a 400 error

{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "Root mapping definition has unsupported parameters: [message : {properties={level={index=true, type=keyword}}}]"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters: [message : {properties={level={index=true, type=keyword}}}]",
    "caused_by": {
      "type": "mapper_parsing_exception",
      "reason": "Root mapping definition has unsupported parameters: [message : {properties={level={index=true, type=keyword}}}]"
    }
  },
  "status": 400
}

level is basically just the Tomcat log level: INFO, DEBUG, etc.

Seems like it was my fault for using the ES 6.x syntax; ES 7.x no longer accepts the mapping-type level ("message") in templates. This one:

{
  "template": "graylog_*",
  "mappings": {
    "properties": {
      "level": {
        "type": "keyword"
      }
    }
  }
}

went through. I am checking the results now :slight_smile:
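For anyone following along, installing the template looked roughly like this (a sketch; localhost:9200 and the template name graylog-custom-mapping are placeholders):

curl -X PUT -H 'Content-Type: application/json' \
  -d @graylog-custom-mapping.json \
  'http://localhost:9200/_template/graylog-custom-mapping?pretty'

A template only applies to indices created after it is installed, so the active write index has to be rotated (System → Indices, then Maintenance → Rotate active write index) before the new mapping takes effect.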

Now the JSON gets parsed :slight_smile: wonderful!


Excellent! Glad it worked out!! :smiley:
