Trying to get an old Graylog v2.3 system accessible on HTTP port 80

Hi All,

I have an old Graylog server v2.3 which has been running for years without issue. Recently the certificates it was using expired and we’ve been unable to access the web interface, although the server itself is still running, collecting logs, and sending email notifications.

Not being very familiar with this software (or Apache), I tried to get Graylog to run without SSL on port 80. Now, when it is accessed on port 80, Graylog responds, but it says:

"We are experiencing problems connecting to the Graylog server running on https://graylog.localsite.com/api/ . Please verify that the server is healthy and working correctly.

Can anybody explain this like I’m 5 and point me in the right direction? What setting(s) am I missing?

Below is my server.conf:

## Global
is_master = true
node_id_file = /etc/graylog/server/node-id
password_secret = <REMOVED>
root_password_sha2 = <REMOVED>
root_email = <REMOVED>
root_timezone = EST
plugin_dir = /usr/share/graylog-server/plugin
rules_file = /etc/graylog/server/rules.drl
lb_recognition_period_seconds = 3
ldap_connection_timeout = 5000
alert_check_interval = 45

## REST API & Web
rest_listen_uri = http://0.0.0.0:12900/
rest_transport_uri = http://10.1.115.39:12900/
web_enable = true
web_listen_uri = http://127.0.0.1:9000/

## Search
allow_leading_wildcard_searches = true
allow_highlighting = false

## Elasticsearch
elasticsearch_shards = 1
elasticsearch_replicas = 0
elasticsearch_index_prefix = graylog2
elasticsearch_cluster_name = locker
elasticsearch_discovery_zen_ping_multicast_enabled = false
elasticsearch_discovery_zen_ping_unicast_hosts = 10.1.115.39:9300
elasticsearch_analyzer = standard
output_batch_size = 500
output_flush_interval = 1
output_fault_count_threshold = 5
output_fault_penalty_seconds = 30
processbuffer_processors = 5
outputbuffer_processors = 3
processor_wait_strategy = blocking
ring_size = 65536
inputbuffer_ring_size = 65536
inputbuffer_processors = 2
inputbuffer_wait_strategy = blocking

elasticsearch_hosts = http://10.1.115.39:9200
elasticsearch_connect_timeout = 10s
# elasticsearch_idle_timeout = 999
elasticsearch_max_total_connections = 20
elasticsearch_max_total_connections_per_route = 2
elasticsearch_max_retries = 2
elasticsearch_socket_timeout = 60s
elasticsearch_discovery_enabled = false
elasticsearch_discovery_filter =
elasticsearch_discovery_frequency = 30s

## Message Journal
message_journal_enabled = true
message_journal_dir = /mnt/store1/graylog-server/journal/
message_journal_max_age = 72h
message_journal_max_size = 150gb
message_journal_flush_age = 1m
message_journal_flush_interval = 1000000
message_journal_segment_age = 1h
message_journal_segment_size = 400mb

## MongoDB
mongodb_uri = <REMOVED>
mongodb_max_connections = 1000
mongodb_threads_allowed_to_block_multiplier = 5

# Email transport
transport_email_enabled = true
transport_email_hostname = <REMOVED>
transport_email_port = 25
transport_email_use_auth = false
transport_email_use_tls = false
transport_email_use_ssl = false
transport_email_subject_prefix = [Graylog]
transport_email_from_email = <REMOVED>
transport_email_web_interface_url = <REMOVED>

# Content Packs
content_packs_loader_enabled = false
# content_packs_dir = data/contentpacks
# content_packs_auto_load = grok-patterns.json

# Plugins
usage_statistics_enabled = false
dns_resolver_enabled = true
dns_resolver_run_before_extractors = true


Hello @danny999

I have an old Graylog server too, but it's version 2.4.
I found some documentation for older Graylog servers for you, in case you need it.

The basic configuration for logging into your web UI is shown below; I saw you said you want it without HTTPS. Below is an example: replace the IP address I'm showing with your own.

# vi /etc/graylog/server/server.conf
password_secret = 8FLksQcQJzVZyJxrWTKkHfLRaeCcq3733ZTDwyuqXWMz7Z83X07XkEr8o6jwmU0orDvyx2SPT9djuPV9SjHnygUfSqKQBpJ4
root_password_sha2 = e3c652f0ba0b4801205814f8b6bc49672c4c74e25b497770bb89b22cdeb4e95
root_email = My-Email@gmail.com
root_timezone = America/Chicago
is_master = true
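
If you ever need to regenerate those two secrets, the commands suggested in the stock server.conf comments are roughly these (a sketch only; pwgen may not be installed by default, and yourpassword is just a placeholder):

# new password_secret
pwgen -N 1 -s 96
# root_password_sha2 from a plaintext password
echo -n yourpassword | sha256sum | cut -d" " -f1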

Modify these entries to let the Graylog web interface connect to the Graylog server. Notice the similarities between the rest and web settings.

rest_listen_uri = http://192.168.1.1:9000/api 
rest_transport_uri = http://192.168.1.1:9000/api/
web_listen_uri = http://192.168.1.1:9000/
web_enable_cors = true
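
Adapted to the IP I can see in your posted server.conf (10.1.115.39, so double-check that it is still the right address for your box), that would look something like:

rest_listen_uri = http://10.1.115.39:9000/api/
rest_transport_uri = http://10.1.115.39:9000/api/
web_listen_uri = http://10.1.115.39:9000/
web_enable_cors = true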

Check your Graylog 2.3 log files (the default path is shown further down).

Once you have adjusted your config file, make sure you restart the Graylog service:

systemctl restart graylog-server
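
You can also confirm the service came back up before testing the UI (plain systemd, nothing Graylog-specific):

systemctl status graylog-server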

Tail your log file for errors.

tail -f /var/log/graylog-server/server.log
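
If you would rather scan the whole file for problems instead of watching it live, a plain grep along these lines works too (adjust the pattern to taste):

grep -iE "error|exception" /var/log/graylog-server/server.log | tail -n 20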

You should then be able to log in with a URL like this:

http://192.168.1.1:9000

You might need to refresh your browser.
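
If the browser still shows the API error, you can check whether the API itself answers, assuming it is being served under /api on the same port as in the example above:

curl -s http://192.168.1.1:9000/api/

That should return a small JSON response with the cluster and version information if the server side is healthy.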

“It does not matter how slowly you go so long as you do not stop.”
– Confucius

Hope that helps.

Thanks gsmith, that was helpful. After a bunch of tinkering I am able to access Graylog via HTTP again.

That’s great :smiley: :+1:

Can I ask exactly what you did to resolve this? Also, could you mark this post as resolved, for future searches? I'm quite sure someone else will have the same issue.

-Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.