Connecting Graylog to Wazuh

1. Describe your incident:
I have a problem connecting Graylog to Wazuh. After starting Graylog it does not connect to Wazuh-indexer.

2. Describe your environment:

  • OS Information: Ubuntu 22.04

  • Package Version:

  • graylog-server 5.2.11-1

  • wazuh 4.9.0 with opensearch 2.13.0

  • mongodb 6.0.18

  • Service logs, configurations, and environment variables:

My graylog configuration:

cat /etc/graylog/server/server.conf | egrep -v "^\s*(#|$)"

is_leader = true
node_id_file = /etc/graylog/server/node-id
password_secret = EjLCkSryw2etu5LS1pqCJH561ky10FXjiK6Q82xG6786bbwC9SBjEyxIBpt2cI0KMpBWvZthxfwu2vtK
root_password_sha2 = 3e0fed9c1970b6959cd2082c3c776e5b844735d49a043d31a6
bin_dir = /usr/share/graylog-server/bin
data_dir = /var/lib/graylog-server
plugin_dir = /usr/share/graylog-server/plugin
http_bind_address = 0.0.0.0:9000
stream_aware_field_types=false
elasticsearch_host = https://graylog:62guOA1nmHlt22m4S0X@172.10.244.169:9200
disabled_retention_strategies = none
allow_leading_wildcard_searches = false
allow_highlighting = false
field_value_suggestion_mode = on
output_batch_size = 500
output_flush_interval = 1
output_fault_count_threshold = 5
output_fault_penalty_seconds = 30
processbuffer_processors = 5
outputbuffer_processors = 3
processor_wait_strategy = blocking
ring_size = 65536
inputbuffer_ring_size = 65536
inputbuffer_processors = 2
inputbuffer_wait_strategy = blocking
message_journal_enabled = true
message_journal_dir = /var/lib/graylog-server/journal
lb_recognition_period_seconds = 3
mongodb_uri = mongodb://localhost/graylog
mongodb_max_connections = 1000

3. What steps have you already taken to try and solve the problem?

When the server starts, I get to the initial configuration, but when I open the page and initialize, I see the message "No data nodes have been found".

If I press "Skip provisioning", I get this error:
2024-10-01T16:33:34.358+03:00 ERROR [VersionProbe] Unable to retrieve version from Elasticsearch node: unexpected end of stream on http://127.0.0.1:9200/… - \n not found: limit=0 content=….
2024-10-01T16:33:34.359+03:00 INFO [VersionProbe] OpenSearch/Elasticsearch is not available. Retry #89

4. How can the community help?

How can I connect Graylog to Wazuh?


Hello @23uk

This behaviour occurs when the elasticsearch_hosts option has not been set within server.conf, but yours appears to be correct. I would note, however, that your setting reads elasticsearch_host rather than elasticsearch_hosts; is this simply a typo?

Thank you very much! The problem was indeed a typo in the elasticsearch_hosts setting, but after it was corrected, another problem appeared.
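For reference, the corrected line at that point was simply the original value under the plural key name:

elasticsearch_hosts = https://graylog:62guOA1nmHlt22m4S0X@172.10.244.169:9200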

2024-10-01T17:54:25.575+03:00 ERROR [VersionProbe] Unable to retrieve version from Elasticsearch node: Hostname 172.10.244.169 not verified:
certificate: sha256/+D4AZhWLtN4VC6Jw4SOIFjI63d4nKvTZm7S2WlDdnXg=
DN: CN=wazuh.indexer, OU=Wazuh, O=Wazuh, L=California, C=US
subjectAltNames: [wazuh.indexer]. - Hostname 178.172.244.169 not verified:
certificate: sha256/+D4AZhWLtN4VC6Jw4SOIFjI63d4nKvTZm7S2WlDdnXg=
DN: CN=wazuh.indexer, OU=Wazuh, O=Wazuh, L=California, C=US
subjectAltNames: [wazuh.indexer].

Previously I added a trusted certificate from wazuh-indexer using the command:
keytool -importcert -keystore /etc/graylog/server/certs/cacerts -storepass changeit -alias root_ca -file /etc/graylog/server/certs/root-ca.pem
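For reference, a quick way to confirm the import actually landed in that keystore (same path, alias and store password as above) is:

keytool -list -keystore /etc/graylog/server/certs/cacerts -storepass changeit -alias root_ca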

What else could this problem be related to?

Does that cert have the IP you are connecting to listed as a SAN? If not, could you use the hostname as it appears within the cert instead?
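For example, one way to inspect the SANs the indexer actually presents (using the IP and port from your config; adjust to your environment) would be something like:

openssl s_client -connect 172.10.244.169:9200 </dev/null 2>/dev/null | openssl x509 -noout -text | grep -A1 "Subject Alternative Name"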

Thanks for your help!

I run Wazuh with Docker, and my problem was solved by using the hostname specified in docker-compose.yml (wazuh.indexer).
As a result, I ended up with the following value:

elasticsearch_hosts = https://graylog:62guOAHlt22m4S0X@wazuh.indexer:9200

The value "wazuh.indexer" also needs to be added to /etc/hosts:
127.0.0.1 wazuh.indexer

After that, my service started correctly!
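For anyone hitting the same issue, a quick way to verify the hosts entry and the trusted CA before restarting Graylog (assuming the root-ca.pem path from the keytool step above; replace <password> with your own indexer credentials) is something like:

curl --cacert /etc/graylog/server/certs/root-ca.pem -u graylog:<password> https://wazuh.indexer:9200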
