"Invalid credentials, please verify ..." when using DNS for web, IP works

Hi all,

When attempting to log into Graylog’s web interface via a DNS name I get the following error: “Invalid credentials, please verify them and retry.” However, if I access the web interface by IP address I’m able to log in. The same thing happens when I use a different DNS name that points at a reverse proxy in front of Graylog. I’ve had this issue since 2.4 and upgraded all the way to 3.3 today hoping that would fix it. Below is my config:

Graylog Config:

# Cluster settings
is_master = true
node_id_file = /etc/graylog/server/node-id

# General
dashboard_widget_default_cache_time = 10s

# Access secrets
password_secret = "redacted"
root_username = admin
root_email = redacted
root_timezone = EST
root_password_sha2 = redacted

# Plugins
plugin_dir = /usr/share/graylog-server/plugin

# Proxies
trusted_proxies = 127.0.0.1/32, 10.10.10.0/24

# Web interface
http_bind_address = 10.10.10.123:9000
http_enable_cors = true
http_enable_gzip = false
http_publish_uri = http://master-chief.domain.com:9000/
http_external_uri = https://graylog.domain.com/

# Search options
allow_leading_wildcard_searches = false
allow_highlighting = false

# Elasticsearch
elasticsearch_index_prefix = graylog
elasticsearch_analyzer = standard
output_batch_size = 500
output_flush_interval = 1
output_fault_count_threshold = 5
output_fault_penalty_seconds = 30

# Elasticsearch http client (GL >= 2.3)
elasticsearch_hosts = http://master-chief:9200,http://cortana:9200,http://sgt-johnson:9200,http://arbiter:9200,http://lord-hood:9200,http://dr-halsey:9200,http://miranda-keys:9200
elasticsearch_max_total_connections = 20
elasticsearch_max_total_connections_per_route = 2
elasticsearch_discovery_enabled = true

# Processors
processbuffer_processors = 5
outputbuffer_processors = 3
async_eventbus_processors = 2
outputbuffer_processor_keep_alive_time = 5000
outputbuffer_processor_threads_core_pool_size = 3
outputbuffer_processor_threads_max_pool_size = 30
processor_wait_strategy = blocking
udp_recvbuffer_sizes = 1048576
inputbuffer_ring_size = 65536
inputbuffer_processors = 2
inputbuffer_wait_strategy = blocking

# Message journal
message_journal_enabled = true
message_journal_dir = /var/lib/graylog-server/journal
message_journal_max_age = 12h
message_journal_max_size = 5gb
message_journal_flush_age = 1m
message_journal_flush_interval = 1000000
message_journal_segment_age = 1h
message_journal_segment_size = 100mb

# Timeouts
output_module_timeout = 10000
stale_master_timeout = 2000
shutdown_timeout = 30000
ldap_connection_timeout = 2000
http_connect_timeout = 5s
http_read_timeout = 10s
http_write_timeout = 10s
stream_processing_timeout = 2000

# Ring buffers
ring_size = 65536

# Load balancing
lb_recognition_period_seconds = 3

# Stream processing
stream_processing_max_faults = 3

# Intervals

# MongoDB Configuration
mongodb_uri = mongodb://graylog:redacted@guilty-spark.domain.com:27017/graylog
mongodb_max_connections = 100
mongodb_threads_allowed_to_block_multiplier = 5

# Drools Rule File (Use to rewrite incoming log messages)
# See: http://docs.graylog.org/en/1.0/pages/drools.html

# HTTP proxy for outgoing HTTP calls

# GC

# Collector
collector_inactive_threshold = 1m
collector_expiration_threshold = 14d

# Content packs
content_packs_loader_enabled = false
content_packs_dir = /usr/share/graylog-server/contentpacks
content_packs_auto_load = grok-patterns.json

NGINX Reverse Proxy Conf:

server {
  listen      80;
  server_name graylog.domain.com;
  location / {
    return 301 https://$server_name$request_uri;
  }
}

server {
  listen      443 ssl http2;
  server_name graylog.domain.com;
  include     /etc/nginx/ssl.conf;
  location / {
    proxy_set_header Host $http_host;
    proxy_set_header X-Forwarded-Host $host;
    proxy_set_header X-Forwarded-Server $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Graylog-Server-URL https://$server_name/;
    proxy_pass       http://10.10.10.123:9000;
  }
}
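
In case it helps narrow things down, this is roughly how I’ve been testing the login outside of a browser. It’s just a quick sketch with placeholder credentials, hitting the same session API the web interface uses; the two base URLs are the proxy name and the node’s direct address from the configs above:

import requests

# Both ways of reaching the same Graylog node: through the nginx proxy by
# DNS name, and directly by IP on port 9000 (addresses from my configs above).
targets = [
    "https://graylog.domain.com",
    "http://10.10.10.123:9000",
]

for base in targets:
    # POST /api/system/sessions is the same login call the web interface makes.
    # X-Requested-By is required by Graylog's CSRF protection on API writes.
    resp = requests.post(
        base + "/api/system/sessions",
        json={"username": "admin", "password": "changeme", "host": ""},  # placeholder password
        headers={"X-Requested-By": "login-test"},
        verify=False,  # internal cert in my case; drop this if yours is trusted
        timeout=10,
    )
    print(base, resp.status_code, resp.text[:200])

A working login should return 200 with a session id; bad credentials come back as 401.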

Environment Info:

I took a look at /etc/graylog/server/server.conf and didn’t see any errors. Any help is greatly appreciated, thank you!

So that I understand the issue clearly…

you have a DNS A record for master-chief.domain.com that points to 10.10.10.123
you also have a DNS A record for graylog.domain.com that points to 10.10.10.122 (this is nginx)

you open your browser and want to be able to type https://graylog.domain.com and connect, but that’s not working. Is that correct?

when you connect via IP, do you connect to 10.10.10.123:9000? http or https?

I ask because your proxy_pass line doesn’t indicate the https protocol…
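
One quick way to check which scheme the node is actually answering on (just a rough sketch, reusing the IP and port from your config):

import requests

# Probe the Graylog node directly on port 9000 over both schemes.
# Whichever request gets an HTTP status back (instead of an SSL/connection
# error) is the scheme proxy_pass should be using.
for url in ("http://10.10.10.123:9000/api/", "https://10.10.10.123:9000/api/"):
    try:
        r = requests.get(url, verify=False, timeout=5)
        print(url, "->", r.status_code)
    except requests.exceptions.RequestException as e:
        print(url, "->", type(e).__name__)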

PS - Halo much?
