I have a Graylog server that is receiving data from multiple servers. At the moment I use the “All messages” stream to monitor information.
For long-term storage purposes, I would like to keep a copy of those messages in an S3 bucket, so what I did was:
- Installed a new server with Logstash, and configured a GELF input to receive data on port 12011/TCP
- Configured an S3 output in Logstash to save a copy of everything
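For reference, the Logstash side described above can be sketched roughly like this (the bucket name, region, and codec are assumptions; adjust to your own setup and credentials):

```
input {
  gelf {
    port    => 12011
    use_tcp => true      # the Graylog output here is GELF over TCP, so enable the TCP listener
    use_udp => false
  }
}

output {
  s3 {
    region => "us-east-1"            # assumed region
    bucket => "my-graylog-archive"   # assumed bucket name
    codec  => "json_lines"           # one JSON document per line
    # Credentials come from the access_key_id/secret_access_key settings
    # or the default AWS credential chain.
  }
}
```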
On my existing Graylog server, I created a new GELF output to send data to that Logstash server.
Trying netcat from the Graylog server to the Logstash port, I can see that the connection works.
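As a side note, netcat only proves the TCP connection opens; it doesn’t prove the GELF input accepts messages, since GELF over TCP expects null-byte-terminated JSON frames. A minimal Python sketch for sending a well-formed test frame (hostname is hypothetical):

```python
import json
import socket

def build_gelf_frame(short_message, host="graylog-test"):
    """Build a minimal GELF 1.1 message, framed for TCP (null-byte terminated)."""
    msg = {"version": "1.1", "host": host, "short_message": short_message}
    return json.dumps(msg).encode("utf-8") + b"\x00"

def send_test_message(server, port=12011):
    """Open a TCP connection to the Logstash GELF input and send one frame."""
    with socket.create_connection((server, port), timeout=5) as sock:
        sock.sendall(build_gelf_frame("GELF connectivity test"))
```

Calling e.g. `send_test_message("logstash.example.com")` (replace with your Logstash host) and seeing the message land in the S3 output would confirm the Logstash side is fine, which narrows the problem down to the Graylog output side.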
But nothing from the “All messages” stream is arriving at Logstash.
Any help would be appreciated.
Hello && Welcome
You may need to create a different stream instead of “All messages”, then use that new stream to send to your GELF output.
Have you checked the log files on the Graylog server to see if there are any issues?
Thank you for your reply!
No errors in the Graylog log. If I create a new stream and do not choose the option to remove matched messages from “All messages” (which I can’t at the moment), it will duplicate messages in the Elasticsearch index, right? It will use twice the disk space?
Thank you once again
Yes that’s correct.
The stream output system allows you to forward every message that is routed into a stream to other destinations.
Outputs are managed globally (like message inputs) and not for single streams. You can create new outputs and activate them for as many streams as you like. This way you can configure a forwarding destination once and select multiple streams to use it.
So if you create another stream that forwards its messages to another destination and removes them from “All messages”, the messages in the second stream should only be stored once, so in theory it shouldn’t use up extra storage.
Hope that helps.