We are currently looking into Graylog to monitor our cronjobs. However, we 
don't have much experience with it yet and are struggling with some best 
practices. Let me give an example.

We have created one shell script that dumps a MySQL database to a file and 
compresses that file. In our cronjob we call this shell script 10 times to 
dump 10 different databases. The shell script expects the database name, an 
output folder, and a name for the log file as parameters. While the shell 
script runs, it logs a lot of "debug" information to the log file, and at 
the end it writes "SUCCESS" to the log file.
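Roughly, the script does something like the following sketch (the function name, log format, and exact commands here are placeholders, not our actual code):

```shell
#!/bin/sh
# Sketch of the per-database dump script described above.
# Cron calls it once per database: dump_db <database> <output_dir> <log_file>
set -eu

dump_db() {
    db="$1"; out_dir="$2"; log="$3"
    # DUMP_CMD is overridable so the sketch can run without a live MySQL.
    dump_cmd="${DUMP_CMD:-mysqldump}"

    echo "$(date '+%F %T') [DEBUG] starting dump of $db" >> "$log"
    "$dump_cmd" "$db" > "$out_dir/$db.sql"

    echo "$(date '+%F %T') [DEBUG] compressing $out_dir/$db.sql" >> "$log"
    gzip -f "$out_dir/$db.sql"

    # Graylog later looks for this marker line in the log file.
    echo "SUCCESS" >> "$log"
}
```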

We have currently created one stream that reads the log file produced by 
the shell script for one MySQL database. So this stream is linked to the 
log file that is created when we dump the database "DB1". Next, we have 
created an alert that is triggered when "SUCCESS" has not appeared in that 
log file for 24 hours, meaning that the last MySQL dump has not run or that 
the cronjob has failed.

This setup seems to be working .... :-)

But since we call this script in our cronjob 10 times per night, for the 
databases DB1, DB2, DB3, ..., DB10, it seems that we have to create 10 
streams and 10 alerts.
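The cronjob entries look roughly like this (script path, folders, and times are placeholders):

```crontab
# One entry per database, each writing to its own log file.
0 2 * * * /opt/backup/dump_db.sh DB1 /backups /var/log/backups/DB1.log
5 2 * * * /opt/backup/dump_db.sh DB2 /backups /var/log/backups/DB2.log
# ... and so on through DB10
```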

This doesn't feel correct, and we are wondering if there is a better way to 
organize our cronjob monitoring.

Thanks for your help.

Ivan

-- 
You received this message because you are subscribed to the Google Groups 
"Graylog Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/graylog2/1a5315f8-4173-485f-9cbe-34f37fa00347%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
