Alert conditions


(Evgeniy) #1

Hello!
How do I configure an alert condition so that Graylog sends all the messages that fall within a stream?
Currently I use this configuration: Alert is triggered when there are more than 0 messages in the last minute. Grace period: 0 minutes. Including last 5 messages in alert notification.
But once the alert triggers, new messages in the stream don’t create a new alert while the old one is unresolved.
Because of this, security events are missing from the email notifications.
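For reference, that condition expressed in the JSON format Graylog uses for message-count alert conditions would look roughly like this (the title is a placeholder; field names follow the 2.2 message-count condition):

```json
{
   "title" : "Stream security events",
   "type" : "message_count",
   "parameters" : {
      "backlog" : 5,
      "threshold" : 0,
      "threshold_type" : "MORE",
      "time" : 1,
      "grace" : 0
   }
}
```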


(Philipp Ruland) #2

Hello Evgeniy,

Which version of Graylog are you using? With version 2.2 (and the 2.2-beta versions), Graylog switched the alerting system to stateful alerts/notifications. This means a triggered alert will only send one notification (email) and will wait until it has been cleared before generating a new notification. As the Graylog blog puts it: “With the new stateful notifications, you will not be notified again until the alert condition is no longer satisfied.”

Since you are looking for the message count being larger than 0 in the stream, this alert condition could stay satisfied for a long time.

As far as I know there is no plugin available that acts as an output and sends the messages via email. This could be an idea for a new plugin; I might look into it when I’ve got time :slight_smile:
You could try the Aggregates Plugin since this uses a custom timing cycle…


(Evgeniy) #3

Hello DerPhlipsi,
thanks for your reply.

I’m using Graylog version 2.2.
I tried the Aggregates Plugin, but it’s not exactly what I need and I can’t customize its notifications. It would be great if you could implement a feature that sends all messages from the stream via email. I think this feature would be useful to many. Thank you very much.


(Philipp Ruland) #4

Hey @Evgeniy,

FYI I just saw this: https://github.com/Graylog2/graylog2-server/issues/3511.
This addresses your issue; maybe the fix will be included in one of the next bug-fix releases.

Greets - Phil


(Kris) #5

Am I wrong, or shouldn’t this cause an email on every message?

Alert is triggered when there are more than 0 messages in the last 0 minutes. Grace period: 0 minutes. Including last message in alert notification.

Or does this just cause no emails to be sent?


(Philipp Ruland) #6

Well, Graylog accepts this as a valid configuration, but I don’t know what its behaviour is…

Are you able to test it? :slight_smile:


(Philipp Ruland) #7

I just figured it out. Graylog uses 0 as a marker for “all messages”, and this also applies to alerts. So this configuration will trigger once and keep the alert active forever (look at the “Triggered at” timestamp), or until your stream actually contains no messages (e.g. when all indices are closed).
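In JSON terms, the condition Kris described corresponds to a payload like this sketch (title is a placeholder; field names follow the 2.2 message-count condition):

```json
{
   "title" : "Alert on every message (problematic)",
   "type" : "message_count",
   "parameters" : {
      "backlog" : 1,
      "threshold" : 0,
      "threshold_type" : "MORE",
      "time" : 0,
      "grace" : 0
   }
}
```

With `"time" : 0`, the condition evaluates against all messages in the stream rather than a sliding window, so it stays satisfied as long as the stream contains anything at all.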


(Kris) #8

Good to know. Do you know if a fix is in the works for this? I’ve read that a lot of people want the previous behaviour back.

What I ultimately need is a way to specify that a new alert is fired for every distinct message within a specified timeframe. What would be nice (for testing at least) is being able to send an email for every new message received.


(Jochen) #9

(Kris) #10

Does this mean a fix is in the works, available, what?

Forgive me for my ignorance, but what exactly does this mean? I’m still new to GitHub, so I just wanted to make sure I understand this properly.

If the fix is there, am I able to pull it down and apply it?


(Jochen) #11

Stateful alert notifications have been made optional in Graylog 2.2.2, which was released today.


(Philipp Ruland) #12

Hey @KO1984,

This means that a fix is already implemented in (merged into) the most recent code of the 2.2 branch on GitHub. So if you want to use the fix, you could download the release @jochen mentioned and compile it yourself, or just wait for 2.2.2 to become available as an official download on the Graylog Download Page.

Greetings - Phil

PS: I wrote this before @jochen answered but had to go afk, so I’ll post it anyway for your information. :smiley:


(Kris) #13

Thank you! You guys are awesome.
I really appreciate the help.


(Philipp Ruland) #14

You’re welcome :slight_smile:

Greetings - Phil


(Kris) #15

One last thing: as it stands, is there any way to make alert emails go out for every message received in a stream?
I’ve tried a few configurations, but alerts still stack messages. This is a problem when I get multiple messages from the same source but with different meanings.


(Alex) #16

OMG… this issue has been so annoying. So, given this logic, there really isn’t a way to raise an alert when the message count is 1 or greater without hacking in “repeat notifications”?

EDIT: I tested this, and removing all 0s from the conditions didn’t really help with our alerts never closing… what is going on with 2.2.x?


(Jochen) #17

What’s the alert condition you want to formulate?


(Alex) #18

The most simple condition. If stream has any new events, raise an alert + grace period.

Right now:

Stream picks up issues -> Condition message count > 0 -> One or more notifications.


(Jochen) #19

Try using the following Message Count Alert Condition:

Alert is triggered when there are more than 0 messages in the last 5 minutes. Grace period: 5 minutes. Not including any messages in alert notification. Configured to repeat notifications.

The JSON payload for this would be:

{
   "title" : "Alert Condition title",
   "creator_user_id" : "admin",
   "type" : "message_count",
   "parameters" : {
      "backlog" : 0,
      "threshold" : 0,
      "threshold_type" : "MORE",
      "repeat_notifications" : true,
      "time" : 5,
      "grace" : 5
   }
}

(Alex) #20

The problem with that is it doesn’t solve the issue I’m seeing. The alerts remain open forever even though only one event was received. Even with the grace period, Graylog alerts every 5 minutes “until the server is restarted”.

Something is causing the alert conditions to always be matched. Screenshots and details in https://github.com/Graylog2/graylog2-server/issues/3663

Just follow the logic in this one screenshot and it doesn’t make sense :slight_smile: