How to map JSON log fields to GELF and send them to Graylog

I have some log files that I am trying to send to Graylog. The format is one JSON object per line; here is one entry, pretty-printed for readability:

{
  "time": "2017-05-17T11:28:18.677Z",
  "levelName": "INFO",
  "msg": "SchedulerService.lock: Locked instance",
  "context": {
    "instanceId": "566c6513-d0e0-46b3-9b8a-441131e6feff"
  },
  "name": "export-scheduler",
  "hostname": "33acf099f16f",
  "pid": 2435,
  "level": 30,
  "v": 0
}

I’m trying to send local logs with the following mapping:

GELF field       log field
-------------    ---------------------------------------------
version          "1.1" (constant)
host             hostname JSON field
short_message    msg JSON field
full_message     the entire JSON object
timestamp        time JSON field, converted to a UNIX timestamp
facility         name JSON field

The idea is to send the actual host, timestamp, facility, and short message to Graylog. I did not find a way to do this with the Sidecar collector or Filebeat. This seems like a basic need to me, but it is not obvious how to do it, and as far as I can tell, overwriting fields afterwards via extractors is deprecated in Graylog.

Any ideas on how to achieve this?
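In the meantime, this mapping is small enough to script directly, bypassing the collector. Here is a minimal sketch in Python, assuming a placeholder Graylog hostname and the standard GELF UDP input port 12201; note that GELF 1.1 deprecates the top-level facility key, so the name field is sent as the additional field _facility instead:

```python
import json
import socket
from datetime import datetime, timezone

def json_line_to_gelf(line):
    """Map one JSON log line (as in the example above) to a GELF 1.1 dict."""
    record = json.loads(line)
    # "time" is ISO 8601 with milliseconds and a trailing Z (UTC),
    # e.g. "2017-05-17T11:28:18.677Z" -> UNIX timestamp in seconds.
    ts = datetime.strptime(record["time"], "%Y-%m-%dT%H:%M:%S.%fZ")
    return {
        "version": "1.1",
        "host": record["hostname"],
        "short_message": record["msg"],
        "full_message": line.strip(),
        "timestamp": ts.replace(tzinfo=timezone.utc).timestamp(),
        # GELF 1.1 deprecates "facility"; additional fields need a "_" prefix.
        "_facility": record["name"],
    }

def send_gelf_udp(message, host="graylog.example.com", port=12201):
    """Send one GELF message over UDP; host is a placeholder."""
    payload = json.dumps(message).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(payload, (host, port))
    sock.close()
```

This ignores GELF UDP chunking, so it only works for messages that fit in a single datagram; larger messages would need chunked GELF or the TCP input.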

Why do you think that?

I found a comment here: https://github.com/Graylog2/graylog2-server/issues/456#issuecomment-36340587

I think you’ve misinterpreted that comment.

An extractor configured to overwrite the timestamp and the other fields worked; I just had to change the date format. Everything works now. Thank you for the quick reply!
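For anyone hitting the same issue: the exact date-format setting used here isn't shown, but an ISO 8601 timestamp like the time field above would correspond to a Joda-Time pattern along these lines in the extractor's date converter:

```
yyyy-MM-dd'T'HH:mm:ss.SSS'Z'
```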
