Graylog cannot authenticate with AWS OpenSearch

1. Describe your incident:
I’m running Graylog on AWS Fargate with AWS OpenSearch, and I have enabled authentication. However, when Graylog starts up, it reports that it is unauthorized.

In the config file:

elasticsearch_hosts = https://username:password@vpc-xxx.es.amazonaws.com:443

Previously, before I enabled OpenSearch authentication, Graylog was able to connect to OpenSearch via HTTPS.

Note that the only special characters the password contains are - and _.
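
In case it matters, here is a quick sketch (plain Java, not Graylog code; the username and password are placeholders) of how I would percent-encode the credentials before putting them into elasticsearch_hosts, should the password ever contain characters that are not URI-safe. - and _ are unreserved, so they pass through unchanged:

import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class EncodeCredentials {
    public static void main(String[] args) {
        // URLEncoder does form-style encoding (a space becomes '+'), which is close
        // enough here because the credentials contain no spaces.
        String user = URLEncoder.encode("username", StandardCharsets.UTF_8);
        String pass = URLEncoder.encode("p-ss_word", StandardCharsets.UTF_8);
        System.out.println("https://" + user + ":" + pass + "@vpc-xxx.es.amazonaws.com:443");
    }
}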

2. Describe your environment:

  • OS Information: Red Hat 9

  • Package Version: Graylog 5.0.2

  • Service logs, configurations, and environment variables:

 ERROR: org.graylog2.storage.versionprobe.VersionProbe - Unable to retrieve version from Elasticsearch node vpc-xxxx.es.amazonaws.com:-1: unknown error - an exception occurred while deserializing error response: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'Unauthorized': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')  at [Source: (okhttp3.ResponseBody$BomAwareReader); line: 1, column: 13]
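
If I read the stack trace correctly, the node seems to be answering the version probe with a plain-text body (presumably a 401 “Unauthorized”) instead of JSON, which Jackson then fails to parse. A minimal reproduction of that exception (my assumption, not taken from the Graylog code):

import com.fasterxml.jackson.databind.ObjectMapper;

public class ReproduceParseError {
    public static void main(String[] args) throws Exception {
        // Feeding a non-JSON body to Jackson yields the same JsonParseException as in the log above.
        new ObjectMapper().readTree("Unauthorized");
    }
}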

3. What steps have you already taken to try and solve the problem?
I ran “curl https://username:password@vpc-xxx.es.amazonaws.com:443” from within the Fargate task that hosts Graylog, and it returned a successful response:

{
  "name" : "xxxxx",
  "cluster_name" : "xxx:x",
  "cluster_uuid" : "xxx",
  "version" : {
    "distribution" : "opensearch",
    "number" : "2.3.0",
    "build_type" : "tar",
    "build_hash" : "unknown",
    "build_date" : "2023-04-20T07:23:19.274646Z",
    "build_snapshot" : false,
    "lucene_version" : "9.3.0",
    "minimum_wire_compatibility_version" : "7.10.0",
    "minimum_index_compatibility_version" : "7.0.0"
  },
  "tagline" : "The OpenSearch Project: https://opensearch.org/"
}

4. How can the community help?
Can anyone point me in a direction I should be looking?

Update: I did a test with the VersionProbe.java code from the Graylog2/graylog2-server repository on GitHub (graylog2-server/graylog2-server/src/main/java/org/graylog2/configuration/IndexerDiscoveryProvider.java at commit 8dd21cc4f47ebe9b711ba7f9dcaba28e326010f1), and it works, with the following response:

Optional[VersionResponse{number=2.3.0, distribution=opensearch}]

This is what I tested:

// Adapted from Graylog's VersionProbe: addAuthenticationIfPresent() and rootResponse()
// are copied from that class, and RootRoute/RootResponse/ErrorResponse come from
// org.graylog2.storage.versionprobe.
public static void main(String[] args) throws MalformedURLException {
    try {
        final URI host = new URI("https://username:password@vpc-xxx.es.amazonaws.com:443");

        final OkHttpClient okHttpClient = new OkHttpClient();
        final ObjectMapper objectMapper = new ObjectMapper();

        // Build the same Retrofit client VersionProbe uses; the basic-auth credentials
        // are taken from the user-info part of the URI.
        final Retrofit retrofit = new Retrofit.Builder()
                .baseUrl(host.toURL())
                .addConverterFactory(JacksonConverterFactory.create(objectMapper))
                .client(addAuthenticationIfPresent(host, okHttpClient))
                .build();
        final RootRoute root = retrofit.create(RootRoute.class);

        // Converter used to deserialize a non-2xx response body into an ErrorResponse.
        final Converter<ResponseBody, ErrorResponse> errorResponseConverter =
                retrofit.responseBodyConverter(ErrorResponse.class, new Annotation[0]);
        final Consumer<ResponseBody> errorLogger = (responseBody) -> {
            try {
                final ErrorResponse errorResponse = errorResponseConverter.convert(responseBody);
                System.out.println("Unable to retrieve version from Elasticsearch node "
                        + host.getHost() + ":" + host.getPort() + ": " + errorResponse);
            } catch (IOException e) {
                System.out.println("Unable to retrieve version from Elasticsearch node "
                        + host.getHost() + ":" + host.getPort()
                        + ": an exception occurred while deserializing error response: " + e);
            }
        };

        // Call GET / on the node and print the parsed version, mirroring what Graylog
        // does at startup.
        System.out.println(rootResponse(root, errorLogger).map(RootResponse::version));
    } catch (URISyntaxException e) {
        e.printStackTrace();
    }
}

I suspect the fault lies in how Graylog reads the config file and passes the value over to the VersionProbe class.
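
To test that suspicion, here is a minimal sketch (my own check, not Graylog’s actual config parsing) that verifies the value from elasticsearch_hosts still carries the credentials after URI parsing, since the probe takes the user info from the parsed URI:

import java.net.URI;

public class UriUserInfoCheck {
    public static void main(String[] args) throws Exception {
        URI host = new URI("https://username:password@vpc-xxx.es.amazonaws.com:443");
        System.out.println("host     = " + host.getHost());
        System.out.println("port     = " + host.getPort());
        // If userInfo prints null, the credentials were lost before reaching the probe.
        System.out.println("userInfo = " + host.getUserInfo());
    }
}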

Hey @lkwjohn

In the docs/configuration file it only shows:

# List of Elasticsearch hosts Graylog should connect to.
# Need to be specified as a comma-separated list of valid URIs for the http ports of your elasticsearch nodes.
# If one or more of your elasticsearch hosts require authentication, include the credentials in each node URI that
# requires authentication.
#
# Default: http://127.0.0.1:9200
#elasticsearch_hosts = http://node1:9200,http://user:password@node2:19200
elasticsearch_hosts = http://192.168.1.100:9200

As far as I know, Graylog connecting to OpenSearch cannot use HTTPS; only Elasticsearch/OpenSearch/OpenSearch Dashboards can use HTTPS for a connection.

Hi @gsmith

That’s weird. If that were the case, my previous setup using https://vpc-xxx.es.amazonaws.com:443 without basic authentication should not have worked over HTTPS either. But in fact it was able to connect: org.graylog2.storage.providers.ElasticsearchVersionProvider - Elasticsearch cluster is running OpenSearch:2.3.0. I tested again and confirmed that HTTPS works without basic authentication.

Also, without HTTPS, having basic authentication defeats the purpose, since basic authentication is just Base64 encoding, which can be decoded easily; this would be a security issue.
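
To illustrate (the header value below is a made-up example, not from my setup), anyone who can see the traffic can recover the credentials with a single Base64 decode:

import java.util.Base64;

public class DecodeBasicAuth {
    public static void main(String[] args) {
        String headerValue = "Basic dXNlcm5hbWU6cGFzc3dvcmQ=";
        byte[] decoded = Base64.getDecoder().decode(headerValue.substring("Basic ".length()));
        System.out.println(new String(decoded)); // prints username:password
    }
}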

Hey,

I agree, but for the past few years members have tried to get HTTPS to work from Graylog to Elasticsearch and/or OpenSearch. If you get Graylog to connect using HTTPS and authenticate to OpenSearch/Elasticsearch, I would be interested in how you did it.

My Graylog connects to OpenSearch over HTTPS and it’s working fine. You just need to take care of certificate trust: for example, you need to add the CA certificate to a trust store and configure Graylog to use this trust store (a JVM setting).

You can also check for errors in /var/log/graylog-server/server.log.
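
If it helps, here is a small sketch (the path and password are placeholders from my side, adjust to your environment) to confirm the CA certificate really is inside the trust store the Graylog JVM points at, e.g. via -Djavax.net.ssl.trustStore=... in GRAYLOG_SERVER_JAVA_OPTS:

import java.io.FileInputStream;
import java.security.KeyStore;
import java.util.Collections;

public class TrustStoreCheck {
    public static void main(String[] args) throws Exception {
        KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
        try (FileInputStream in = new FileInputStream("/etc/graylog/cacerts.jks")) {
            trustStore.load(in, "changeit".toCharArray());
        }
        // List the aliases so you can see whether your CA was actually imported.
        for (String alias : Collections.list(trustStore.aliases())) {
            System.out.println(alias + " -> " + trustStore.getCertificate(alias).getType());
        }
    }
}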


Hey @frantz

Thanks for chiming in :+1:

If I can ask you a couple of questions:

Did you disable security on OpenSearch? If not, were the certificates used for Graylog also used in the OpenSearch YML file? Or did you make a completely different set of certs for OpenSearch and then add them to the trust store?

I assume you configured Graylog settings to something like this…

elasticsearch_hosts = https://username:password@192.168.1.100:9200

I do have Graylog configured with HTTPS and a Java trust store.

Security is enabled on OpenSearch.
I use the same certificate for Graylog and OpenSearch (mainly because they are on the same host, but whatever).
But you can use two different certificates.
In both situations you just need to add the CA certificate to the trust store; you don’t need to add the Graylog or OpenSearch certificate to the trust store, just the CA certificate.
Moreover, with OpenSearch security enabled you need a certificate for some administration actions; you can either use the same certificate or create a dedicated one.
Yes, I configured the Graylog settings like that.

Noted.
Thanks @frantz for the reply :+1:
