Stream Processing Graylog - Multiple messages

Hi folks,

We have a use case where we need to write a stream processor on the data.
For example: Let’s say CPU utilization data is coming into Graylog into some stream.
We want to write a rule that computes the average CPU utilization over a 15-minute window and, if it exceeds 90% — or something more complex, such as being more than 2 standard deviations from the mean — triggers some action.
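For concreteness, the windowed rule can be sketched in a few lines of Python. This is only a hypothetical illustration (the class and parameter names are made up, and it assumes roughly one reading per minute so a 15-element window covers 15 minutes):

```python
from collections import deque
from statistics import mean, stdev

class SlidingWindowDetector:
    """Keep the last `window` CPU readings and flag anomalous windows."""

    def __init__(self, window=15, threshold=90.0, sigmas=2.0):
        # deque(maxlen=...) automatically drops the oldest reading
        self.readings = deque(maxlen=window)
        self.threshold = threshold
        self.sigmas = sigmas

    def add(self, value):
        """Record a reading; return True if the window now looks anomalous."""
        self.readings.append(value)
        avg = mean(self.readings)
        # Rule 1: windowed average above a fixed threshold (e.g. 90%)
        if avg > self.threshold:
            return True
        # Rule 2: latest reading more than `sigmas` standard deviations
        # away from the window mean
        if len(self.readings) >= 2:
            sd = stdev(self.readings)
            if sd > 0 and abs(value - avg) > self.sigmas * sd:
                return True
        return False
```

In a real deployment the `add()` calls would be driven by messages consumed from the stream (e.g. a Kafka consumer loop), and "return True" would be replaced by whatever alerting action is needed.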

In another setup where we don’t have Graylog, we are reading data from Kafka and running stream rules.

What is the best way to achieve such use cases when we use Graylog?
We don’t want to do this as a batch job by pulling data directly from Elasticsearch.

Please suggest.


Hi Sachin,

You would need to write your own plugin that does what you want — with vanilla Graylog that is not possible.

Hi Jan,

Do you think this is something Graylog would need in the future, and that the community would find helpful?

If yes, please share some ideas on how we should build this. We can build this and open source it back to the Graylog community.


I guess that some users in the community would like to have a plugin that can compute a deviation over time or something similar.

But how you do this is up to you. That you want to contribute it back is nice.

I think it’s a basic task for a monitoring system.

Unfortunately, Graylog collects logs and processes them one by one, so it can’t handle any task that involves multiple messages (except alerts).

Yes, that example is better achieved with monitoring systems.
My focus was more on doing something similar by interacting directly with Kafka.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.