Two Elasticsearch instances with one Graylog server

Hello Team,
I want to attach one Graylog container to two Elasticsearch containers and create two indices; each index would be connected to one Elasticsearch instance.
Elasticsearch will store data coming from different platforms.
Is that possible in the first place?

  • Can you connect Graylog to ElasticSearch? Yes
  • Can you connect Graylog to multiple ES hosts? Yes, if they’re in a cluster.
  • Can you connect Graylog to multiple separate Elastic hosts or clusters? I do not know, but I do not think so, no.

So far I have only seen Graylog use one ElasticSearch cluster to dump all the data into.

Is there any particular reason why you want one index on ES1 and another index on ES2? Why not run both/all indices on the same Elastic instance?

I have a very large volume of incoming data and I want to store/manage it separately.

In my Graylog config file I put:

# Default: http://127.0.0.1:9200

elasticsearch_hosts = http://elasticsearch:9200,http://elasticsearch2:9400

but I don’t know how to send data to the second Elasticsearch;
by default the data is sent to the first one.

Thank you for your help :slight_smile:

Well, unfortunately I do not think you can use one Graylog to split messages across two Elastic instances/clusters. Personally I do not see the point, because you’re going to use Graylog as the GUI anyway.

I fear that you’re trying to be smarter than Graylog or ElasticSearch. Why do you feel that you need to split these two sets of data? What are the business needs driving that decision?

That assumes that both Elastic hosts are part of the same Elastic cluster. You indicate that you want to throw data at two different Elastic clusters, which I really, really believe is not possible.

@jan?

Not only is it not possible, it’s also a hilariously bad idea for reasons that should be obvious to OP if he actually understood what he was trying to do :wink:

You indicate that you want to throw data at two different Elastic clusters, which I really, really believe is not possible.

Correct, that is not possible - all hosts need to be in the same cluster.
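
For illustration, a minimal server.conf sketch of what that means on the Graylog side (the hostnames elasticsearch1 and elasticsearch2 are placeholders): every entry in elasticsearch_hosts is treated as a node of one and the same cluster, and Graylog balances its requests across them rather than routing different indices to different entries.

    # Graylog server.conf - all listed nodes must belong to the same Elasticsearch cluster;
    # Graylog load-balances requests across them, it never splits indices between them.
    elasticsearch_hosts = http://elasticsearch1:9200,http://elasticsearch2:9200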

We have a very large number of incoming logs and we want to separate the indices, keep Elasticsearch performance good, and have all the data on one server (without using a cluster).

Thank you for your support :slight_smile:
We will install Graylog (all components) per platform. Thank you again.

That won’t work. To store lots of data, you are going to need an Elasticsearch cluster, because otherwise you’re just setting yourself up for disappointment. Is there a reason it has to be on a single server?
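
To make “an Elasticsearch cluster” concrete, here is a rough sketch of two nodes joining one cluster, assuming Elasticsearch 6.x (7.x uses discovery.seed_hosts and cluster.initial_master_nodes instead; the node names are made up):

    # elasticsearch.yml on node 1
    cluster.name: graylog
    node.name: es-node-1
    network.host: 0.0.0.0
    discovery.zen.ping.unicast.hosts: ["es-node-1", "es-node-2"]

    # elasticsearch.yml on node 2 - same cluster.name, so both nodes form one cluster
    cluster.name: graylog
    node.name: es-node-2
    network.host: 0.0.0.0
    discovery.zen.ping.unicast.hosts: ["es-node-1", "es-node-2"]

    # for a real two-node setup you would also set discovery.zen.minimum_master_nodes appropriately

Graylog then only needs the single elasticsearch_hosts list; deciding where the shards physically live is the cluster’s job, not Graylog’s.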

To confirm whether my thoughts are correct, you are referring to factors like these, right?

  • Input distribution, for higher throughput.
  • Query distribution, for faster queries.
  • Shard distribution, for faster queries.
  • Shard replication, for high-availability (if needed).

Right?

To go down your list:

1: Sort of. It’s usually better to have multiple index sets with multiple shards living on multiple servers so you spread out the indexing load some.
2: That’s just a bonus side effect - for a query, all shards can be used, and if those shards are also distributed on multiple servers, you get faster-ish queries and less load.
3: Yes
4: Yes

And…

5: Storage. It’s often easier to just plug more data nodes into a cluster than it is to grow an LVM disk and resize the FS. That’s why we have 19 data nodes with 6 TB of storage apiece, because we need about 110 TB of “live” storage (with 4 TB of “oh shit something done fucked up” space).
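
To tie points 1, 3 and 4 to actual settings: in Graylog the shard and replica counts are configured per index set (in recent versions under System / Indices in the web interface); the server.conf values below are only the defaults applied to the initial index set, and the numbers are illustrative, not a recommendation.

    # Graylog server.conf - defaults for the first index set only;
    # further index sets are configured in the web interface
    elasticsearch_shards = 4      # shards per index, spread across the cluster's data nodes
    elasticsearch_replicas = 1    # one replica of each shard, for high availability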


Thank you all for your support.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.