Fresh install loops while running migrations

1. Describe your incident:

Fresh LXD container running Ubuntu 22. After following the official docs, the Graylog startup script enters a loop and never finishes the setup process.

2. Environment:

  • OS Information: Ubuntu 22 Jammy

  • Package Version: Graylog 4.3.7 / MongoDB 4.0.28 / OpenSearch 1.3.4

  • Service logs, configurations, and environment variables:

Error from the Graylog log:

2022-09-19T13:27:10.048-03:00 ERROR [AuditLogger] Unable to write audit log entry because there is no valid license.
2022-09-19T13:27:10.062-03:00 INFO  [ServerBootstrap] Graylog server 4.3.7+05bccc7 starting up
2022-09-19T13:27:10.067-03:00 INFO  [ServerBootstrap] JRE: Private Build 18.0.2-ea on Linux 5.15.0-47-generic
2022-09-19T13:27:10.070-03:00 INFO  [ServerBootstrap] Deployment: deb
2022-09-19T13:27:10.071-03:00 INFO  [ServerBootstrap] OS: Ubuntu 22.04.1 LTS (jammy)
2022-09-19T13:27:10.071-03:00 INFO  [ServerBootstrap] Arch: amd64
2022-09-19T13:27:11.101-03:00 INFO  [ServerBootstrap] Running 62 migrations...
2022-09-19T13:27:12.817-03:00 WARN  [ServerBootstrap] Exception while running migrations
org.graylog.shaded.elasticsearch7.org.elasticsearch.ElasticsearchException: Unable to retrieve cluster information
        at org.graylog.storage.elasticsearch7.ElasticsearchClient.exceptionFrom(ElasticsearchClient.java:151) ~[?:?]
        at org.graylog.storage.elasticsearch7.ElasticsearchClient.execute(ElasticsearchClient.java:111) ~[?:?]
        at org.graylog.storage.elasticsearch7.PlainJsonApi.perform(PlainJsonApi.java:38) ~[?:?]
        at org.graylog.storage.elasticsearch7.NodeAdapterES7.version(NodeAdapterES7.java:41) ~[?:?]
        at org.graylog2.indexer.cluster.Node.getVersion(Node.java:33) ~[graylog.jar:?]
        at org.graylog2.migrations.V20170607164210_MigrateReopenedIndicesToAliases.getReopenedIndices(V20170607164210_MigrateReopenedIndicesToAliases.java:87) ~[graylog.jar:?]
        at org.graylog2.migrations.V20170607164210_MigrateReopenedIndicesToAliases.getReopenedIndices(V20170607164210_MigrateReopenedIndicesToAliases.java:137) ~[graylog.jar:?]
        at org.graylog2.migrations.V20170607164210_MigrateReopenedIndicesToAliases.lambda$upgrade$0(V20170607164210_MigrateReopenedIndicesToAliases.java:81) ~[graylog.jar:?]
        at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:273) ~[?:?]
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
        at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992) ~[?:?]
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
        at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150) ~[?:?]
        at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173) ~[?:?]
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
        at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596) ~[?:?]
        at org.graylog2.migrations.V20170607164210_MigrateReopenedIndicesToAliases.upgrade(V20170607164210_MigrateReopenedIndicesToAliases.java:83) ~[graylog.jar:?]
        at org.graylog2.bootstrap.ServerBootstrap.lambda$runMigrations$0(ServerBootstrap.java:264) ~[graylog.jar:?]
        at com.google.common.collect.ImmutableList.forEach(ImmutableList.java:422) ~[graylog.jar:?]
        at com.google.common.collect.RegularImmutableSortedSet.forEach(RegularImmutableSortedSet.java:88) ~[graylog.jar:?]
        at org.graylog2.bootstrap.ServerBootstrap.runMigrations(ServerBootstrap.java:261) ~[graylog.jar:?]
        at org.graylog2.bootstrap.ServerBootstrap.startCommand(ServerBootstrap.java:187) [graylog.jar:?]
        at org.graylog2.bootstrap.CmdLineTool.run(CmdLineTool.java:311) [graylog.jar:?]
        at org.graylog2.bootstrap.Main.main(Main.java:45) [graylog.jar:?]
Caused by: java.net.ConnectException: Connection refused
        at org.graylog.shaded.elasticsearch7.org.elasticsearch.client.RestClient.extractAndWrapCause(RestClient.java:849) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.elasticsearch.client.RestClient.performRequest(RestClient.java:259) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.elasticsearch.client.RestClient.performRequest(RestClient.java:246) ~[?:?]
        at org.graylog.storage.elasticsearch7.PlainJsonApi.lambda$perform$0(PlainJsonApi.java:40) ~[?:?]
        at org.graylog.storage.elasticsearch7.ElasticsearchClient.execute(ElasticsearchClient.java:109) ~[?:?]
        ... 23 more
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.Net.pollConnect(Native Method) ~[?:?]
        at sun.nio.ch.Net.pollConnectNow(Net.java:672) ~[?:?]
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:946) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:174) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvents(DefaultConnectingIOReactor.java:148) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:351) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:221) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64) ~[?:?]
        at java.lang.Thread.run(Thread.java:833) ~[?:?]
2022-09-19T13:27:27.161-03:00 INFO  [ImmutableFeatureFlagsCollector] Following feature flags are used: {}
2022-09-19T13:27:29.119-03:00 INFO  [CmdLineTool] Loaded plugin: AWS plugins 4.3.7 [org.graylog.aws.AWSPlugin]
2022-09-19T13:27:29.122-03:00 INFO  [CmdLineTool] Loaded plugin: Enterprise Integrations 4.3.7 [org.graylog.enterprise.integrations.EnterpriseIntegrationsPlugin]
2022-09-19T13:27:29.124-03:00 INFO  [CmdLineTool] Loaded plugin: Integrations 4.3.7 [org.graylog.integrations.IntegrationsPlugin]
2022-09-19T13:27:29.126-03:00 INFO  [CmdLineTool] Loaded plugin: Collector 4.3.7 [org.graylog.plugins.collector.CollectorPlugin]
2022-09-19T13:27:29.128-03:00 INFO  [CmdLineTool] Loaded plugin: Graylog Enterprise 4.3.7 [org.graylog.plugins.enterprise.EnterprisePlugin]
2022-09-19T13:27:29.133-03:00 INFO  [CmdLineTool] Loaded plugin: Graylog Enterprise (ES6 Support) 4.3.7 [org.graylog.plugins.enterprise.org.graylog.plugins.enterprise.es6.EnterpriseES6Plugin]
2022-09-19T13:27:29.135-03:00 INFO  [CmdLineTool] Loaded plugin: Graylog Enterprise (ES7 Support) 4.3.7 [org.graylog.plugins.enterprise.org.graylog.plugins.enterprise.es7.EnterpriseES7Plugin]
2022-09-19T13:27:29.137-03:00 INFO  [CmdLineTool] Loaded plugin: Threat Intelligence Plugin 4.3.7 [org.graylog.plugins.threatintel.ThreatIntelPlugin]
2022-09-19T13:27:29.140-03:00 INFO  [CmdLineTool] Loaded plugin: Elasticsearch 6 Support 4.3.7+05bccc7 [org.graylog.storage.elasticsearch6.Elasticsearch6Plugin]
2022-09-19T13:27:29.141-03:00 INFO  [CmdLineTool] Loaded plugin: Elasticsearch 7 Support 4.3.7+05bccc7 [org.graylog.storage.elasticsearch7.Elasticsearch7Plugin]
2022-09-19T13:27:29.214-03:00 INFO  [CmdLineTool] Running with JVM arguments: -Xms1g -Xmx1g -XX:NewRatio=1 -XX:+ResizeTLAB -XX:-OmitStackTraceInFastThrow -Djdk.tls.acknowledgeCloseNotify=true -Dlog4j2.formatMsgNoLookups=true -Dlog4j.configurationFile=file:///etc/graylog/server/log4j2.xml -Djava.library.path=/usr/share/graylog-server/lib/sigar -Dgraylog2.installation_source=deb
2022-09-19T13:27:30.400-03:00 INFO  [cluster] Cluster created with settings {hosts=[localhost:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=5000}
2022-09-19T13:27:30.500-03:00 INFO  [cluster] Cluster description not yet available. Waiting for 30000 ms before timing out
2022-09-19T13:27:30.579-03:00 INFO  [connection] Opened connection [connectionId{localValue:1, serverValue:1687}] to localhost:27017
2022-09-19T13:27:30.655-03:00 INFO  [cluster] Monitor thread successfully connected to server with description ServerDescription{address=localhost:27017, type=STANDALONE, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 0, 28]}, minWireVersion=0, maxWireVersion=7, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=5087934}
2022-09-19T13:27:30.680-03:00 INFO  [connection] Opened connection [connectionId{localValue:2, serverValue:1688}] to localhost:27017
2022-09-19T13:27:30.733-03:00 INFO  [connection] Closed connection [connectionId{localValue:2, serverValue:1688}] to localhost:27017 because the pool has been closed.
2022-09-19T13:27:30.736-03:00 INFO  [MongoDBPreflightCheck] Connected to MongoDB version 4.0.28
2022-09-19T13:27:31.364-03:00 INFO  [SearchDbPreflightCheck] Connected to (Elastic/Open)Search version <OpenSearch:1.3.4>
2022-09-19T13:27:31.552-03:00 INFO  [Version] HV000001: Hibernate Validator null
2022-09-19T13:27:39.826-03:00 INFO  [InputBufferImpl] Message journal is enabled.
2022-09-19T13:27:39.844-03:00 INFO  [NodeId] Node ID: d236b99f-a6d7-4815-a1fc-75e9392550c2
2022-09-19T13:27:40.242-03:00 INFO  [LogManager] Loading logs.
2022-09-19T13:27:40.296-03:00 WARN  [Log] Found a corrupted index file, /var/lib/graylog-server/journal/messagejournal-0/00000000000000000000.index, deleting and rebuilding index...
2022-09-19T13:27:40.392-03:00 INFO  [LogManager] Logs loading complete.
2022-09-19T13:27:40.397-03:00 INFO  [LocalKafkaJournal] Initialized Kafka based journal at /var/lib/graylog-server/journal
2022-09-19T13:27:40.403-03:00 INFO  [cluster] Cluster created with settings {hosts=[localhost:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=5000}
2022-09-19T13:27:40.409-03:00 INFO  [cluster] Cluster description not yet available. Waiting for 30000 ms before timing out
2022-09-19T13:27:40.433-03:00 INFO  [connection] Opened connection [connectionId{localValue:3, serverValue:1689}] to localhost:27017
2022-09-19T13:27:40.435-03:00 INFO  [cluster] Monitor thread successfully connected to server with description ServerDescription{address=localhost:27017, type=STANDALONE, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 0, 28]}, minWireVersion=0, maxWireVersion=7, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=1655181}
2022-09-19T13:27:40.445-03:00 INFO  [connection] Opened connection [connectionId{localValue:4, serverValue:1690}] to localhost:27017
2022-09-19T13:27:40.644-03:00 INFO  [InputBufferImpl] Initialized InputBufferImpl with ring size <65536> and wait strategy <BlockingWaitStrategy>, running 2 parallel message handlers.
2022-09-19T13:27:41.072-03:00 INFO  [ElasticsearchVersionProvider] Elasticsearch cluster is running OpenSearch:1.3.4
2022-09-19T13:27:42.295-03:00 INFO  [ProcessBuffer] Initialized ProcessBuffer with ring size <65536> and wait strategy <BlockingWaitStrategy>.
2022-09-19T13:27:42.428-03:00 INFO  [OutputBuffer] Initialized OutputBuffer with ring size <65536> and wait strategy <BlockingWaitStrategy>.
2022-09-19T13:27:42.444-03:00 INFO  [connection] Opened connection [connectionId{localValue:5, serverValue:1691}] to localhost:27017
2022-09-19T13:27:42.558-03:00 INFO  [connection] Opened connection [connectionId{localValue:6, serverValue:1692}] to localhost:27017
2022-09-19T13:27:42.623-03:00 INFO  [connection] Opened connection [connectionId{localValue:7, serverValue:1693}] to localhost:27017
2022-09-19T13:27:42.709-03:00 INFO  [connection] Opened connection [connectionId{localValue:8, serverValue:1694}] to localhost:27017
2022-09-19T13:27:42.876-03:00 INFO  [connection] Opened connection [connectionId{localValue:9, serverValue:1695}] to localhost:27017
2022-09-19T13:27:49.243-03:00 ERROR [AuditLogger] Unable to write audit log entry because there is no valid license.
2022-09-19T13:27:49.248-03:00 INFO  [ServerBootstrap] Graylog server 4.3.7+05bccc7 starting up
2022-09-19T13:27:49.257-03:00 INFO  [ServerBootstrap] JRE: Private Build 18.0.2-ea on Linux 5.15.0-47-generic
2022-09-19T13:27:49.259-03:00 INFO  [ServerBootstrap] Deployment: deb
2022-09-19T13:27:49.261-03:00 INFO  [ServerBootstrap] OS: Ubuntu 22.04.1 LTS (jammy)
2022-09-19T13:27:49.263-03:00 INFO  [ServerBootstrap] Arch: amd64
2022-09-19T13:27:49.439-03:00 INFO  [connection] Opened connection [connectionId{localValue:10, serverValue:1696}] to localhost:27017
2022-09-19T13:27:50.284-03:00 INFO  [ServerBootstrap] Running 62 migrations...
2022-09-19T13:27:52.004-03:00 WARN  [ServerBootstrap] Exception while running migrations
org.graylog.shaded.elasticsearch7.org.elasticsearch.ElasticsearchException: Unable to retrieve cluster information
        at org.graylog.storage.elasticsearch7.ElasticsearchClient.exceptionFrom(ElasticsearchClient.java:151) ~[?:?]
        at org.graylog.storage.elasticsearch7.ElasticsearchClient.execute(ElasticsearchClient.java:111) ~[?:?]
        at org.graylog.storage.elasticsearch7.PlainJsonApi.perform(PlainJsonApi.java:38) ~[?:?]
        at org.graylog.storage.elasticsearch7.NodeAdapterES7.version(NodeAdapterES7.java:41) ~[?:?]
        at org.graylog2.indexer.cluster.Node.getVersion(Node.java:33) ~[graylog.jar:?]
        at org.graylog2.migrations.V20170607164210_MigrateReopenedIndicesToAliases.getReopenedIndices(V20170607164210_MigrateReopenedIndicesToAliases.java:87) ~[graylog.jar:?]
        at org.graylog2.migrations.V20170607164210_MigrateReopenedIndicesToAliases.getReopenedIndices(V20170607164210_MigrateReopenedIndicesToAliases.java:137) ~[graylog.jar:?]
        at org.graylog2.migrations.V20170607164210_MigrateReopenedIndicesToAliases.lambda$upgrade$0(V20170607164210_MigrateReopenedIndicesToAliases.java:81) ~[graylog.jar:?]
        at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:273) ~[?:?]
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
        at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992) ~[?:?]
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
        at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150) ~[?:?]
        at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173) ~[?:?]
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
        at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596) ~[?:?]
        at org.graylog2.migrations.V20170607164210_MigrateReopenedIndicesToAliases.upgrade(V20170607164210_MigrateReopenedIndicesToAliases.java:83) ~[graylog.jar:?]
        at org.graylog2.bootstrap.ServerBootstrap.lambda$runMigrations$0(ServerBootstrap.java:264) ~[graylog.jar:?]
        at com.google.common.collect.ImmutableList.forEach(ImmutableList.java:422) ~[graylog.jar:?]
        at com.google.common.collect.RegularImmutableSortedSet.forEach(RegularImmutableSortedSet.java:88) ~[graylog.jar:?]
        at org.graylog2.bootstrap.ServerBootstrap.runMigrations(ServerBootstrap.java:261) ~[graylog.jar:?]
        at org.graylog2.bootstrap.ServerBootstrap.startCommand(ServerBootstrap.java:187) [graylog.jar:?]
        at org.graylog2.bootstrap.CmdLineTool.run(CmdLineTool.java:311) [graylog.jar:?]
        at org.graylog2.bootstrap.Main.main(Main.java:45) [graylog.jar:?]
Caused by: java.net.ConnectException: Connection refused
        at org.graylog.shaded.elasticsearch7.org.elasticsearch.client.RestClient.extractAndWrapCause(RestClient.java:849) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.elasticsearch.client.RestClient.performRequest(RestClient.java:259) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.elasticsearch.client.RestClient.performRequest(RestClient.java:246) ~[?:?]
        at org.graylog.storage.elasticsearch7.PlainJsonApi.lambda$perform$0(PlainJsonApi.java:40) ~[?:?]
        at org.graylog.storage.elasticsearch7.ElasticsearchClient.execute(ElasticsearchClient.java:109) ~[?:?]
        ... 23 more
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.Net.pollConnect(Native Method) ~[?:?]
        at sun.nio.ch.Net.pollConnectNow(Net.java:672) ~[?:?]
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:946) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:174) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvents(DefaultConnectingIOReactor.java:148) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:351) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:221) ~[?:?]
        at org.graylog.shaded.elasticsearch7.org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64) ~[?:?]
        at java.lang.Thread.run(Thread.java:833) ~[?:?]

Graylog server.conf:

is_leader = true
node_id_file = /etc/graylog/server/node-id
password_secret = <REMOVED>
root_password_sha2 = <REMOVED>
root_email = <REMOVED>
bin_dir = /usr/share/graylog-server/bin
data_dir = /var/lib/graylog-server
plugin_dir = /usr/share/graylog-server/plugin
http_bind_address = 10.0.1.4:9000
http_publish_uri = http://graylog.mydomain.com/
http_external_uri = http://graylog.mydomain.com/
http_enable_cors = false
http_enable_tls = true
http_tls_cert_file = /etc/graylog/server/graylog.mydomain.com.crt
http_tls_key_file = /etc/graylog/server/graylog.mydomain.com.key
elasticsearch_hosts = https://admin:admin@graylog.mydomain.com:9200
rotation_strategy = count
elasticsearch_max_docs_per_index = 20000000
elasticsearch_max_number_of_indices = 20
retention_strategy = delete
elasticsearch_shards = 4
elasticsearch_replicas = 0
elasticsearch_index_prefix = graylog
allow_leading_wildcard_searches = false
allow_highlighting = false
elasticsearch_analyzer = standard
output_batch_size = 500
output_flush_interval = 1
output_fault_count_threshold = 5
output_fault_penalty_seconds = 30
processbuffer_processors = 5
outputbuffer_processors = 3
processor_wait_strategy = blocking
ring_size = 65536
inputbuffer_ring_size = 65536
inputbuffer_processors = 2
inputbuffer_wait_strategy = blocking
message_journal_enabled = true
message_journal_dir = /var/lib/graylog-server/journal
lb_recognition_period_seconds = 3
mongodb_uri = mongodb://localhost/graylog
mongodb_max_connections = 1000
mongodb_threads_allowed_to_block_multiplier = 5
proxied_requests_thread_pool_size = 32

MongoDB mongod.conf:

storage:
  dbPath: /var/lib/mongodb
  journal:
    enabled: true

systemLog:
  destination: file
  logAppend: true
  path: /var/log/mongodb/mongod.log

net:
  port: 27017
  bindIp: 127.0.0.1


processManagement:
  timeZoneInfo: /usr/share/zoneinfo

OpenSearch opensearch.yml:

cluster.name: graylog
action.auto_create_index: false
path.data: /var/lib/opensearch
path.logs: /var/log/opensearch
network.host: 10.0.1.4
http.port: 9200
cluster.initial_master_nodes: ["graylog"]

plugins.security.ssl.transport.pemcert_filepath: graylog.mydomain.com.crt
plugins.security.ssl.transport.pemkey_filepath: graylog.mydomain.com.key
plugins.security.ssl.transport.pemtrustedcas_filepath: ca-certificates.crt
plugins.security.ssl.transport.enforce_hostname_verification: true
plugins.security.ssl.transport.enabled_protocols:
  - "TLSv1.3"
  - "TLSv1.2"
plugins.security.ssl.transport.enabled_ciphers:
  - "TLS_AES_256_GCM_SHA384"
  - "TLS_CHACHA20_POLY1305_SHA256"
  - "ECDHE-RSA-AES256-GCM-SHA384"
  - "ECDHE-RSA-CHACHA20-POLY1305"
  - "DHE-RSA-AES256-GCM-SHA384"
  - "ECDHE-RSA-AES128-GCM-SHA256"
  - "ECDHE-ECDSA-AES256-GCM-SHA384"
  - "ECDHE-ECDSA-CHACHA20-POLY1305"
  - "ECDHE-ECDSA-AES128-GCM-SHA256"
  - "DHE-RSA-AES128-GCM-SHA256"
plugins.security.ssl.http.enabled: true
plugins.security.ssl.http.pemcert_filepath: graylog.mydomain.com-full.crt
plugins.security.ssl.http.pemkey_filepath: graylog.mydomain.com.key
plugins.security.ssl.http.pemtrustedcas_filepath: ca-certificates.crt
plugins.security.ssl.http.enabled_protocols:
  - "TLSv1.3"
  - "TLSv1.2"
plugins.security.ssl.http.enabled_ciphers:
  - "TLS_AES_256_GCM_SHA384"
  - "TLS_CHACHA20_POLY1305_SHA256"
  - "ECDHE-RSA-AES256-GCM-SHA384"
  - "ECDHE-RSA-CHACHA20-POLY1305"
  - "DHE-RSA-AES256-GCM-SHA384"
  - "ECDHE-RSA-AES128-GCM-SHA256"
  - "ECDHE-ECDSA-AES256-GCM-SHA384"
  - "ECDHE-ECDSA-CHACHA20-POLY1305"
  - "ECDHE-ECDSA-AES128-GCM-SHA256"
  - "DHE-RSA-AES128-GCM-SHA256"
plugins.security.allow_unsafe_democertificates: true
plugins.security.allow_default_init_securityindex: true
plugins.security.authcz.admin_dn:
  - CN=graylog.mydomain.com,OU=mydomain.com,O=mydomain

Collections created in MongoDB:

access_tokens
alarmcallbackconfigurations
alarmcallbackhistory
alerts
archive_backends
archive_catalog
audit_log
cluster_config
cluster_events
collectors
content_packs
content_packs_installations
event_processor_state
forwarders
grants
index_failures
index_field_types
index_ranges
index_sets
licenses
lut_caches
lut_data_adapters
lut_tables
notifications
opensearch_anomaly_detectors
pipeline_processor_pipelines
pipeline_processor_pipelines_streams
pipeline_processor_rules
processing_status
roles
scheduler_triggers
searches
sessions
sidecar_configuration_variables
sidecars
streamrules
streams
system_messages
team_sync_backend_configs
teams
traffic
users

3. What steps have you already taken to try and solve the problem?

I’ve been trying to get this up since last week; on every attempt I purge the previous installation to clean the environment.
In MongoDB I drop the graylog database using: db.dropDatabase()

At the moment I’m trying to get Graylog up without errors in the logs; the next step will be studying how to harden it.

4. How can the community help?

Could someone point out what I’m missing?

P.S.: This is my first contact with MongoDB / OpenSearch / Graylog.

It would take some more research, but I think this might mean that Graylog believes it should be clustered when it isn’t… look into that more (I am short on time at the moment). Also, is your Java at 18? The docs say you need to be at 17…


Graylog 4.3.x and above supports OpenSearch. If you are hunting around in the Elasticsearch docs for commands to run against OpenSearch, make sure you have the Elasticsearch documentation page set to version 7.10.x, since that is the version OpenSearch was forked from.

The docs reference cluster.name: graylog, which I also have in my elasticsearch.yml. The setting cluster.initial_master_nodes: ["graylog"] is in the documentation, but in a section about upgrading a cluster from Elasticsearch 7.10 to OpenSearch…


Delving deeper into Java: the Graylog docs show different requirements depending on where you look… the OpenSearch docs show 8, 11, and 14… Honestly, I don’t know whether it will be an issue or not. I only flagged your 18 because I had been reading docs earlier in the day that stated 17… unlikely to be your issue.

To me, ElasticsearchException: Unable to retrieve cluster information means it thinks you want a cluster but there isn’t one. I found this in the OpenSearch docs:

# Unless you have already configured a cluster, you should set
# discovery.type to single-node, or the bootstrap checks will
# fail when you try to start the service.
discovery.type: single-node
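
It may also be worth confirming whether anything is listening on the search port at all, since the stack trace shows the TCP connect itself being refused. A rough probe (a sketch; host and port are placeholders — use whatever is in your elasticsearch_hosts):

```shell
# Rough check: is anything listening on the search port at all?
# "Connection refused" in the Graylog log means the TCP connect itself fails,
# i.e. OpenSearch is most likely not up (or is bound to a different address).
host=127.0.0.1   # placeholder -- use the host from elasticsearch_hosts
port=9200
if timeout 2 bash -c "</dev/tcp/${host}/${port}" 2>/dev/null; then
  echo "port ${port} on ${host} is open"
else
  echo "port ${port} on ${host} is closed or unreachable"
fi
```

If the port is closed, the OpenSearch service log is the place to look, not Graylog’s.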

@tmacgbay thanks for the reminder about interoperability; I had skipped that step.

Now, just to keep things compatible, I have installed OpenJDK 11 and ran:

apt-mark hold openjdk-11-jre-headless

The hint about:

discovery.type: single-node

did not work for me with OpenSearch.

In the last test I did a few minutes ago, this time using Elasticsearch 7.10.2, Graylog started without errors…

Following the migration guide (title: MIGRATING TO OPENSEARCH) and trying to start with OpenSearch, the same error occurs.

My hope was that the failing check occurred only because some collection hadn’t been created in MongoDB, and that since Elasticsearch had completed all the steps, everything would then work… but it seems not…

Since I’m installing this to evaluate Graylog at home, I’ll stick with Elasticsearch for now and see how Graylog can streamline dealing with logs compared to Grafana (I got that one up and working in 3 days, including studying how to harden it). Later I’ll open a ticket on GitHub about this issue.

My goal is to find a not-too-expensive dashboard to sell as a solution to the clients I work with (read: they would buy a corporate license after my deployment, which is why I first test everything in my lab to arrive at a ready-made solution).

Thanks for the help!!!

P.S.: I’m unable to post the link to the docs, don’t know why…


Reading through all the Elasticsearch docs to make sure I applied every possible hardening setting (which seems pointless, as the OSS version has almost no features or plugins), I seem to have found the solution to the “error” in OpenSearch in the Elasticsearch 7.17 docs:

If you do not need any discovery configuration, for instance if running a single-node cluster, set discovery.seed_hosts: [] to disable discovery and satisfy this bootstrap check.

The steps from the 7.10 docs say:

This bootstrap check ensures that discovery is not running with the default configuration. It can be satisfied by setting at least one of the following properties:

  • discovery.seed_hosts
  • discovery.seed_providers
  • cluster.initial_master_nodes

I had satisfied it with the last of those parameters…
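
Spelled out, any one of these lines in opensearch.yml satisfies that bootstrap check (a sketch; the values shown are illustrative, only one line is needed):

```
discovery.seed_hosts: []                      # single node: disable discovery
# discovery.seed_providers: file              # read seed hosts from unicast_hosts.txt
# cluster.initial_master_nodes: ["graylog"]   # name the initial master-eligible nodes
```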

I did a new test on OpenSearch, this time using the hint from Elasticsearch 7.17:

discovery.seed_hosts: []

And it’s working with OpenSearch!!!

Now I can have encryption and extra security features.
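
For reference, the discovery-related part of my opensearch.yml now boils down to this (a sketch; it assumes cluster.initial_master_nodes is removed at the same time, with everything else as posted above):

```
cluster.name: graylog
network.host: 10.0.1.4
http.port: 9200
# single node: disable discovery entirely
discovery.seed_hosts: []
```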


Great dig!! Glad you found that!
