Unable to parse JSON logs in Graylog

Hi Guys,

I am writing my logs to a JSON file and the input type is json, but they are not being parsed correctly in Elasticsearch. Any clue what I am missing here?

Here is my JSON file:

{"org_name": "google.com", "policy_spf": "pass", "org_email": "noreply-dmarc-support@google.com", "policy_dkim": "pass", "policy_pct": "100", "auth_spf_result": "pass","auth_dkim_domain": "xxx,ccc", "auth_dkim_result": "pass", "identifier_header_from": "xxx,ccc", "date_end": "2019-01-04T05:29:59", "date_start": "2019-01-03T05:30:00", "source_ip": "1,.2.3.4", "count": 1, "auth_spf_domain": "xxx,ccc", "policy_p": "none", "submitter": "unknown", "policy_disposition": "none", "policy_domain": "xxx,ccc", "id": "15325652754200102860"}

{"org_name": "google.com", "policy_spf": "fail", "org_email": "noreply-dmarc-support@google.com", "policy_dkim": "fail", "policy_pct": "100", "auth_spf_result": "pass", "identifier_header_from": "mail.xxx,ccc", "date_end": "2019-01-04T05:29:59", "date_start": "2019-01-03T05:30:00", "source_ip": "2.3.4.5", "count": 1, "auth_spf_domain": "apc01-hk2-obe.outbound.protection.outlook.com", "policy_p": "none", "submitter": "unknown", "policy_disposition": "none", "policy_domain": "xxx,ccc", "id": "15325652754200102860"}
{"org_name": "google.com", "policy_spf": "pass", "org_email": "noreply-dmarc-support@google.com", "policy_dkim": "pass", "policy_pct": "100", "auth_spf_result": "pass", "auth_dkim_domain": "xxx,ccc", "auth_dkim_result": "pass", "identifier_header_from": "xxx,ccc", "date_end": "2019-01-04T05:29:59", "date_start": "2019-01-03T05:30:00", "source_ip": "2.2.2.2", "count": 1, "auth_spf_domain": "xxx,ccc", "policy_p": "none", "submitter": "unknown", "policy_disposition": "none", "policy_domain": "xxx,ccc", "id": "15325652754200102860"}

And here is my Logstash config file, which is not parsing the JSON logs correctly. Can someone confirm what the issue is?

input {
  file {
    type => "json"
    path => "/log/*.json"
    start_position => "beginning"
  }
}

filter {
  if [source_type] == "json-logs" {
    json {
      source => "message"
      tag_on_failure => ["_jsonparsefailure"]
    }
  }
}

output {
  gelf {
    host => "localhost"
    port => "12202"
    protocol => "UDP"
  }
}
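
(Side note on the config above: the filter conditional checks [source_type], but the file input only sets type => "json", so as written the json filter never runs and the raw line is shipped on unparsed. If you do want Logstash to parse before sending via GELF, a minimal sketch of the corrected filter, assuming the input's type field is what should be matched, would be:)

filter {
  if [type] == "json" {
    json {
      source => "message"
      tag_on_failure => ["_jsonparsefailure"]
    }
  }
}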

Wondering if I need to write a GROK pattern for my JSON logs? Since they are JSON, they should get parsed automatically, right?

It doesn’t get parsed automatically - you either need to put a JSON extractor on the input, or use a pipeline to parse the JSON for you.
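
For the pipeline route, a rule along these lines is the usual pattern (the rule name is arbitrary, and it assumes the raw JSON line arrives in the message field):

rule "parse json from message"
when
  has_field("message")
then
  // parse the raw JSON string held in the message field
  let parsed = parse_json(to_string($message.message));
  // promote the parsed keys to top-level fields on the message
  set_fields(to_map(parsed));
end

The rule then needs to be attached to a stage in a pipeline that is connected to the stream your input feeds.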

Oh, I was assuming it gets parsed automatically since it's in JSON format. ELK does that, hence I thought it would apply here as well.

So what exactly do I need to do to get those messages parsed? Apply a JSON extractor?

I'm not very well versed with pipelines yet, hence looking for the better and quicker option.

Yes, after applying it, it worked. How do I make that automatic? Enabling the extractor seems to mean I have to wait for a message to arrive and then apply the extractor to it.

Uh, it is automatic. Once you’ve set up an extractor on an input, every message that comes in is processed by the extractor.

Within certain limits, which you can define yourself…


Yeah, that… forgot to mention that :smiley:

That has resolved the issue. Thanks for the help.
