but I can’t reply to it anymore as it is now closed.
@macko003 replied on it and gave me a script to back up Graylog, but I have confirmed I only need to back up the configuration (streams, dashboards, extractors…), not the actual logs.
So, just wanted to confirm that what I am planning to do is right.
Backup these files:
/etc/mongodb-keyfile
/etc/mongod.conf
/etc/elasticsearch/elasticsearch.yml
/etc/graylog/server/server.conf
/etc/graylog/server/node-id
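For what it's worth, the file list above could be gathered with a small script like the one below. This is just a sketch: the destination directory is an assumption, and the loop skips files that don't exist (e.g. the keyfile) instead of failing.

```shell
#!/bin/sh
# Sketch: copy the config files listed above into one backup directory,
# preserving the /etc/... hierarchy. BACKUP_DIR is an assumption; adjust it.
BACKUP_DIR="/tmp/graylog-config-backup"
mkdir -p "$BACKUP_DIR"

for f in /etc/mongodb-keyfile \
         /etc/mongod.conf \
         /etc/elasticsearch/elasticsearch.yml \
         /etc/graylog/server/server.conf \
         /etc/graylog/server/node-id; do
    if [ -e "$f" ]; then
        # --parents (GNU cp) recreates the source path inside BACKUP_DIR
        cp --parents "$f" "$BACKUP_DIR"
    else
        # e.g. mongodb-keyfile only exists on some setups; don't abort
        echo "skipping missing file: $f"
    fi
done
```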
I can’t find any “mongodb-keyfile” file on my system — is this normal?
Then do a mongodump of the “graylog” database (only this DB? not the admin, local or config ones?):
/usr/bin/mongodump -d graylog -o /backup/graylog.mongodump
When I do this, it does not create a single file, but a folder with lots of JSON and BSON files in it. Is this what’s supposed to happen?
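(For reference: yes, a folder is what mongodump produces — one subdirectory per database, with a `.bson` data file and a `.metadata.json` file per collection. The restore side could look like the sketch below; the dump path matches the command above, and the script is guarded so it only does anything where mongorestore is actually installed.)

```shell
#!/bin/sh
# mongodump writes a DIRECTORY, not a single file: one subfolder per dumped
# database, each holding .bson (data) + .metadata.json (indexes) per collection.
# Restoring that dump later would look roughly like this.
DUMP_DIR="/backup/graylog.mongodump"

if command -v mongorestore >/dev/null 2>&1; then
    # --drop replaces existing collections with the dumped ones
    mongorestore --drop -d graylog "$DUMP_DIR/graylog"
else
    echo "mongorestore not installed; nothing to do"
fi
```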
I don’t think the components are totally clear in your mind yet.
I suggest you start with that; I think it will help answer your questions.
If you have more questions after that, please let us know.
except for the Elasticsearch data files (/var/lib/elasticsearch/nodes/0), as that is the large directory that contains the actual data, and I do not need to back up this data, just the Graylog configs.
The whole backup is under 1GB, which is very small.
I think this will suffice in the event of the server dying and having to restore its config. What do you think?
It should be OK, but to be sure, do a disaster recovery test. And write a doc about the steps.
When you have to do it in real life you just open your doc; it can help a lot, and you will be calm instead of hasty.
Some things:
/var/lib/mongodb/ - I think you are doing the backup with the DB running, so it will contain inconsistent data.
/var/lib/graylog-server/journal/ - It contains the journal for messages. You wrote you need the config, so you don’t need it.
/usr/share/graylog-server/ - You will install Graylog from apt/yum, so you won’t use e.g. the binaries (but you should keep the plugins…
/etc - I suggest doing a full /etc backup. It’s small, and you will have a lot of useful information at restore time, e.g. IP address, gateway, NTP and DNS servers, etc. It is faster to restore the important files than to search for some IPs or settings.
You can compress things into one compact, smaller file, e.g. change the rsync to tar, and rsync the final file. (E.g. my mongo dump is 350 MB; tar makes it 15 MB in under 10s.)
If you compress/tar the whole thing, you can put a date/number in the file name, and you will have a little SVN: you can restore a good config from a few backups earlier.
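The dated-tar idea could be sketched like this. The source/destination paths and the 7-archive retention are assumptions for illustration, not part of anyone’s actual setup:

```shell
#!/bin/sh
# Sketch of the dated-archive idea: one compressed tar per run, with the
# date in the file name, so older known-good configs stay restorable.
# SRC and DEST are assumptions; point them at your real backup dirs.
SRC="/tmp/graylog-backup-src"
DEST="/tmp/graylog-backup-archives"
mkdir -p "$SRC" "$DEST"
echo "demo config" > "$SRC/server.conf"    # stand-in for the real files

STAMP=$(date +%Y-%m-%d)
tar -czf "$DEST/graylog-backup-$STAMP.tar.gz" -C "$SRC" .

# keep the last 7 archives, delete older ones (the "little SVN")
ls -1t "$DEST"/graylog-backup-*.tar.gz | tail -n +8 | xargs -r rm -f
```

Rsyncing the single resulting `.tar.gz` is then much cheaper than rsyncing the dump directory file by file.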
I know, that’s why I do a mongodump and copy the resulting folder (/graylog_backup/) over. So I can probably remove this folder (/var/lib/mongodb/) from the rsync command; I don’t need it if I have the dump.
I thought so, but I wasn’t fully sure. I will remove this folder from the rsync command as well then.
Yes, I do not need the binaries, but I do not want to have to copy multiple subdirectories, so I just copy the whole directory as it is only a few MB.
Might do that, as the whole /etc/ is only 65 MB.
My setup is really small, just a 1MB database, so no need to compress for now. Will think about it if it grows over time.