Trigger alert if logs stop?

What would be a way to set up an alert if logs stop coming from any one of about 30 sources? Those sources send into about 20 inputs. I’m not sure how to monitor for messages dropping below a certain threshold based on the different values of one field.

Hi!

I’d go with the Message Count Alert Condition. There you choose a threshold and type in a query; if there are too few matches in the chosen interval, the alert fires.
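For example, the condition could look roughly like this (the query and values are illustrative, and the exact UI labels depend on your Graylog version):

```text
Alert condition: Message Count Alert Condition
Time range:      5 minutes
Threshold type:  less than
Threshold:       1
Search query:    source:myhost
```

This fires when fewer than one matching message arrives in any 5-minute window, i.e. when that source goes silent.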

I tried that. The problem is that each input I deal with has several sources, and that alert can’t tell if only one source out of, say, three stops sending logs.

Then you might want to consider how to tell them apart. For example, add a unique identifier (e.g. the hostname) to the log of each of them, extract it into a field, and use a search query on that field as the trigger for the alert.
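If your sources ship logs with rsyslog, one way to prepend such an identifier is a custom template (a sketch, assuming rsyslog ≥ v7 RainerScript syntax; `graylog.example.com` and the template name are placeholders):

```text
# Prepend the local hostname to every forwarded message
template(name="WithHost" type="string"
         string="%HOSTNAME% %timegenerated% %syslogtag%%msg%\n")

# Forward to Graylog using that template
action(type="omfwd" target="graylog.example.com" port="514"
       protocol="udp" template="WithHost")
```

Other shippers (syslog-ng, filebeat, etc.) have equivalent ways to inject a host identifier.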

What would a query for that look like? I’m struggling to think of how to write a single query that monitors each individual hostname.

Well, the idea here isn’t so much about the query as about changing the log itself at the source.

For example, you have some machine with a log line like ‘date bla-bla error’. Normally it doesn’t carry an identifier, so you can’t tell it apart in Graylog. But if you add some tag, you get a unique string like ‘tag date bla-bla error’.

After that you can easily extract the tag into a separate field, then use a simple query like ‘message: date bla-bla error’. Add the field variable to the alert body, and you have one alert for all sources, but with the unique name (the tag) in the email.
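To illustrate the extraction step (the sample line and tag names are hypothetical; in Graylog itself this would be a regex extractor using the same pattern):

```python
import re

def extract_tag(log_line):
    """Pull the leading tag (first whitespace-delimited token) out of a log line."""
    match = re.match(r"^(\S+)\s", log_line)
    return match.group(1) if match else None

# Hypothetical message after the source prepended its hostname as a tag.
print(extract_tag("web01 2019-07-01 bla-bla error"))  # -> web01
```

A regex extractor on the message field with the pattern `^(\S+)\s` would populate the tag field the same way.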

Or, if you want to set alert up individually for each of the sources, then the query would be like:

field_with_tag: tag

That way you would have as many alerts as there are sources.
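For instance, with three sources tagged web01, db01, and cache01 (hypothetical names, with the tag extracted into a field called source_tag), you’d create three separate alert conditions, one per query:

```text
source_tag: web01
source_tag: db01
source_tag: cache01
```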

With the new alerting in Graylog 3.1 you will be able to build this kind of alert directly; with the current version you only have the option described above.

