Graylog ingesting Crowdstrike FDR logs (Solution)

I spent days searching for a solution to the above. Graylog’s AWS plugin doesn’t work in this case unless you have your own bucket that FDR is dumping into, and Filebeat can’t read the input (likely because the data is gzip-compressed). So for those who want an actual solution that doesn’t involve “Just spend thousands per month on Splunk!”, here it is:

  1. Use Logstash with the s3 plugin. Example conf.d/fdr.conf:
input {
  s3 {
    access_key_id => "AKblahblahblahblah"
    secret_access_key => "ThisIsNotTheSecretAccessKeyYouAreLookingFor"
    bucket => "CrowdstrikeWillSellYouThis"
    region => "us-some-region"
    additional_settings => {
      force_path_style => false
      follow_redirects => false
    }
  }
}

output {
  gelf {
    host => "GraylogIPorHostname"
    port => PortNumber
    sender => "FDR"
  }
}
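
A couple of notes on the input block above: as far as I can tell, the s3 plugin gunzips *.gz objects on its own, which is why this works where Filebeat fell over. It also accepts prefix and interval options if you want to narrow or pace the polling. A sketch of those two, with a stand-in prefix value:

input {
  s3 {
    # ...credentials, bucket, and region as above...
    prefix   => "data/"   # stand-in: only poll keys under this prefix
    interval => 60        # seconds between bucket polls (the plugin default)
  }
}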

No, gelf isn’t required. The challenge was never the output.
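
That said, for anyone who wants to skip gelf, here’s a sketch of one alternative, assuming you’ve created a Raw/Plaintext TCP input in Graylog (the port below is a placeholder):

output {
  tcp {
    host  => "GraylogIPorHostname"
    port  => 5555          # placeholder: whatever port your Raw TCP input listens on
    codec => json_lines    # ships one JSON document per line
  }
}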

Also: default FDR settings (no filters) will generate at least 5 GB/day on their own, with a fresh batch landing roughly every 5 minutes, so expect Graylog to get flooded.
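
If that volume hurts, one way to thin the stream before it ever reaches Graylog is to drop whole event types in Logstash. A rough sketch: it assumes the JSON has already been parsed into fields (see the json filter suggested in the replies below), and the event name is made up, so check your own data for the chatty ones:

filter {
  if [event_simpleName] == "SomeNoisyEvent" {
    drop { }    # discard the event entirely before output
  }
}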

And finally, this data is in JSON format, so once the flood starts, create an Extractor on the Message field, select JSON Extractor, and you should be good to go. You’ll likely need another extractor to get the Timestamp to work.
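
If you’d rather fix the timestamp in Logstash than add a second extractor, a date filter can do it. Another sketch, again assuming the JSON is already parsed into fields and that your FDR events carry the usual epoch-milliseconds timestamp field (mine do):

filter {
  date {
    match  => ["timestamp", "UNIX_MS"]   # FDR's epoch-ms event time
    target => "@timestamp"               # replace the ingest-time default
  }
}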

@enjet_it,
Thank you for posting a solution here! It’s members like you who make the Graylog Community a great place to visit regularly for discussions and new information on Graylog technologies.

Glad you’re in the community!

For sanity’s sake, you can wedge in a json parser between the Input and Output:

filter {
  json {
    source => "message"
  }
}
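
One caveat: if a message isn’t valid JSON, the json filter won’t stop the pipeline; it just tags the event with _jsonparsefailure, so it’s worth keeping an eye out for that tag on the Graylog side.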

Hey @enjet_it
Awesome, thanks for sharing :+1: It would be awesome to post that here too so it doesn’t get lost over time. Just an idea.
