Auditbeat gelf graylog - source field contains all JSON details

Hi

I created a GELF UDP input for my Auditbeat sender.

My auditbeat config is:
auditbeat.modules:

- module: auditd
  audit_rule_files: [ '${path.config}/audit.rules.d/*.conf' ]
  audit_rules: |

- module: file_integrity
  paths:
  - /bin
  - /usr/bin
  - /sbin
  - /usr/sbin
  - /etc

- module: system
  datasets:
    - package # Installed, updated, and removed packages

  period: 2m # The frequency at which the datasets check for changes

- module: system
  datasets:
    - host    # General host information, e.g. uptime, IPs
    - login   # User logins, logouts, and system boots.
    - process # Started and stopped processes
    - socket  # Opened and closed sockets
    - user    # User information

  state.period: 12h

  user.detect_password_changes: true

  login.wtmp_file_pattern: /var/log/wtmp*
  login.btmp_file_pattern: /var/log/btmp*

setup.template.settings:
  index.number_of_shards: 1

setup.kibana:

output.logstash:
  hosts: ["<LOGSTASHIP:PORT>"]

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~

Events are sent to Logstash. My Logstash config is:

input {
  beats {
    port => <INPUTPORT>
  }
 }

output {
  gelf {
    host => "<GRAYLOGIP>"
    port => <GRAYLOGPORT>
  }
 } 
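If I remember correctly, the gelf output plugin fills the GELF `source` from its `sender` option, which defaults to `%{host}` — and Beats ships `host` as a whole object, which would explain the JSON blob. One possible fix on the Logstash side (an untested sketch) is to collapse `host` to a plain string before the output:

```
filter {
  # Collapse the Beats "host" object to just the hostname, so the
  # gelf output's default sender => "%{host}" resolves to a plain
  # string instead of the serialized object.
  mutate {
    replace => { "host" => "%{[host][hostname]}" }
  }
}
```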

The input is:

bind_address:     0.0.0.0
decompress_size_limit:     8388608
number_worker_threads:     8
override_source:     <empty>
port:     <INPUTPORT>
recv_buffer_size:     262144

In Graylog I get the event log with several fields, but the “source” field doesn’t contain only the hostname — it contains all the host details. Example:

 agent_hostname
    <GOOD_HOSTNAME>

[...]
full_message
    Process sshd (PID: 21825) by user root STARTED

[...]

source
    {"hostname":"<GOOD_HOSTNAME>","os":{"name":"CentOS Linux","family":"redhat","version":"8 (Core)","kernel":"4.18.0-193.6.3.el8_2.x86_64","platform":"centos","codename":"Core"},"ip":["<IPHOST>","<MAC>"],"containerized":false,"name":"<GOOD_HOSTNAME>","id":"<ID>","mac":["<MAC>"],"architecture":"x86_64"}
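For illustration, the per-field breakdown I'm after can be sketched in Python (the payload below is a shortened placeholder shaped like the `source` value above, not real data): nested keys joined with `_`, list entries split out, and booleans kept as real values.

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten a nested dict: nested keys are joined with '_',
    list items get a numeric suffix, and scalars (including booleans)
    are kept as-is."""
    out = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, prefix=f"{name}_"))
        elif isinstance(value, list):
            for i, item in enumerate(value):
                out[f"{name}_{i}"] = item
        else:
            out[name] = value
    return out

# Placeholder payload shaped like the "source" field above
source = json.loads(
    '{"hostname":"myhost","os":{"family":"redhat","platform":"centos"},'
    '"ip":["10.0.0.1","00:11:22:33:44:55"],"containerized":false}'
)

fields = flatten(source)
print(fields["os_family"])      # -> redhat
print(fields["ip_0"])           # first list entry becomes its own field
print(fields["containerized"])  # -> False (the boolean survives flattening)
```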

How can I get separate fields based on this information?

Thanks

EDIT :

I created a JSON extractor based on the source field. It creates fields like this:

os_family
    redhat
os_kernel
    4.18.0-193.6.3.el8_2.x86_64
os_version
    8 (Core)
os_platform
    centos
ip
    <IP, MAC>
os_codename
    Core
mac
    <MAC>
hostname
    <GOOD_HOSTNAME>
containerized
name
    <GOOD_HOSTNAME>
os_name
    CentOS Linux
id
    <ID>
architecture
    x86_64 

How can I get the value of the containerized field (true / false), and distinct fields for the IP and MAC addresses?

Thanks

EDIT 2:

The last one works well.

Only one detail doesn’t work. I created JSON extractors based on the source and system_audit fields with the “CUT” policy, but the “CUT” policy doesn’t work: the source and system_audit fields still contain the old, pre-parsing values. Any idea?

Also, if you have another idea for doing this JSON parsing without creating N JSON extractors for N fields, I’m listening :slight_smile:
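One way to avoid N extractors (a sketch, not tested on this setup): a Graylog processing-pipeline rule can parse the JSON once and promote all keys to message fields, and removing the original field also sidesteps the CUT-policy problem. The rule name is made up; the `source` field name is taken from the post:

```
rule "flatten source json"
when
  has_field("source")
then
  set_fields(to_map(parse_json(to_string($message.source))));
  remove_field("source");
end
```

Note that nested objects (like os) would still arrive as maps, so some further flattening may be needed for those.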

hey @celine

why did you not send Auditbeat directly to a Beats input? that removes the additional Logstash moving part …


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.