How to configure Graylog to display logs from Elasticsearch

1. Describe your incident:
I am new to Graylog and am trying to set it up to display application logs in the Graylog UI. The tools in use are Graylog, Elasticsearch, and Filebeat.

  1. I have installed and configured Graylog on server-A and am able to open the web page and log in with the admin credentials.
  2. I have installed and configured Elasticsearch on the same server (server-A) where Graylog is running, and the Graylog UI shows the message below.
    “Elasticsearch cluster graylog is green.”
  3. I have installed and configured Filebeat on server-B; it pushes its output (logs) to Elasticsearch, and I can see the document count increasing by running the command below.
    curl -XGET "elasticsearch-host:9200/_cat/indices?v"

Now I want to configure Graylog to fetch those logs from Elasticsearch and display them in the Graylog UI.
I am not sure what to configure here: an input or the indices?

FYI: I am not using Logstash or a sidecar.

2. Describe your environment:

  • OS Information: RHEL 7.9

  • Package Versions: Graylog 5.1.3-1, Elasticsearch 7.17.11-1, Filebeat 7.17.0

  • Service logs, configurations, and environment variables:
    Graylog’s server.conf

is_leader = true
node_id_file = /etc/graylog/server/node-id
password_secret = <token>
root_password_sha2 = <token>
root_timezone = UTC
bin_dir = /usr/share/graylog-server/bin
data_dir = /data1/graylog/lib/graylog-server
plugin_dir = /usr/share/graylog-server/plugin
http_bind_address = <graylog_server_ip>:9000
http_enable_cors = true
stream_aware_field_types=false
elasticsearch_hosts = http://<elasticsearch_hosts_ip>:9200
elasticsearch_index_prefix = graylog
allow_leading_wildcard_searches = false
allow_highlighting = false
output_batch_size = 500
output_flush_interval = 1
output_fault_count_threshold = 5
output_fault_penalty_seconds = 30
processbuffer_processors = 5
outputbuffer_processors = 3
processor_wait_strategy = blocking
ring_size = 65536
inputbuffer_ring_size = 65536
inputbuffer_processors = 2
inputbuffer_wait_strategy = blocking
message_journal_enabled = true
message_journal_dir = /data1/graylog/lib/graylog-server/journal
lb_recognition_period_seconds = 3
mongodb_uri = mongodb://localhost/graylog
mongodb_max_connections = 1000

elasticsearch.yml

cluster.name: graylog
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
network.host: 0.0.0.0
discovery.seed_hosts: ["127.0.0.1"]
action.auto_create_index: .monitoring*,.watches,.triggered_watches,.watcher-history*,.ml*

filebeat.yml

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/webservices/nginx/logs/*.log
    - /data1/var/logs/*.log
    #- /var/log/*.log
    #- /var/log/messages
    #- c:\programdata\elasticsearch\logs\*

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================

setup.ilm.overwrite: true
setup.template.overwrite: true
setup.template.settings:

  index.number_of_shards: 1
  index.number_of_replicas: 0
  #index.codec: best_compression
  #_source.enabled: false

setup.kibana:


# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["http://<elasticsearch_server_ip>:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  #username: "elastic"
  #password: "changeme"

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

3. What steps have you already taken to try and solve the problem?
I tried to configure input and indices but nothing works.

4. How can the community help?
I want to see the logs sent by filebeat in graylog.


With Graylog you send log messages to Graylog itself, not to Elasticsearch. Graylog then writes them into Elasticsearch and reads them back for display; it does not pick up documents that other tools (like Filebeat) have written directly into Elasticsearch.

I recommend using OpenSearch instead, though, as Graylog only supports Elasticsearch up to version 7.10.2; it is not compatible with Elasticsearch 7.17. In the near future, support for Elasticsearch will be dropped entirely.

In case you want a quick copy/paste commands guide, check out https://github.com/Graylog2/se-poc-docs/tree/main/src/On%20Prem%20POC
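If you do switch to OpenSearch, a minimal single-node config sketch for a Graylog POC might look like the following. This is an assumption-laden example, not the official guide: node names and paths are placeholders, and disabling the security plugin is only acceptable for a throwaway POC.

```yaml
# opensearch.yml - minimal single-node sketch for a Graylog POC
cluster.name: graylog
network.host: 127.0.0.1
discovery.type: single-node

# Graylog manages its own index templates and index sets,
# so prevent other clients from auto-creating indices.
action.auto_create_index: false

# POC ONLY: disables TLS/auth. Configure the security plugin
# properly before any production use.
plugins.security.disabled: true
```

With this in place, `elasticsearch_hosts` in Graylog's server.conf points at the OpenSearch node exactly as it would for Elasticsearch.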

Regarding configuring Beats: you’ll need a Beats input in your Graylog cluster (System → Inputs → Beats → Launch new input). You can then send logs from Filebeat to that Graylog Beats input using the following:

output.logstash:
  hosts: ["servername.domain.tld:5044"]

Note that TCP port 5044 is the conventional Beats port, but it is customizable. Make sure it matches the port you configured on your Graylog Beats input.
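Putting that together with the filebeat.yml posted above, a minimal sketch of the whole file might look like this (hostname and port are placeholders; the key points are that the existing `output.elasticsearch` section must be removed, since Filebeat allows only one output, and that the host entry is a bare host:port pair):

```yaml
# filebeat.yml - minimal sketch for shipping to a Graylog Beats input
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/webservices/nginx/logs/*.log
    - /data1/var/logs/*.log

# Send to Graylog's Beats input, NOT to Elasticsearch.
# Delete or comment out any output.elasticsearch block first.
output.logstash:
  hosts: ["<graylog_server_ip>:5044"]
```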

Hope that helps!

So, do I need to install Logstash? And what versions of OpenSearch and Filebeat are recommended for Graylog 5.1.3-1?

Now I have downgraded all versions:
Graylog - 4.2.5-1
Elasticsearch - 7.9.2-1
MongoDB - 4.4.23-1

But I still cannot see any log messages for the “Beats” input in the Graylog dashboard.

Do I need to install Logstash?

No. Filebeat uses its Logstash output, but it sends to Graylog; no Logstash installation is needed.

What versions of OpenSearch and Filebeat are recommended?

The latest versions of both should work fine.

But I still cannot see any log messages for the “Beats” input in the Graylog dashboard.

Can you confirm you have sources sending via the Logstash output to your Beats input on Graylog? Do you see any messages for your Beats input?

No, I don’t see any messages…

Can you verify your log source can reach your Graylog server on that port? You can check via tcpdump on the Graylog server, for example: sudo tcpdump -i eth0 -nA port 5044. This will show real-time traffic on that port.

Also, do you see any errors in the Filebeat log? It will indicate whether or not it can reach the configured output.

{
    "log.level": "debug",
    "@timestamp": "2023-07-31T15:08:05.475Z",
    "log.logger": "logstash",
    "log.origin": {
        "file.name": "logstash/async.go",
        "file.line": 172
    },
    "message": "1 events out of 1 events sent to logstash host 127.0.0.1:5044. Continue sending",
    "service.name": "filebeat",
    "ecs.version": "1.6.0"
}

@drewmiranda-gl, thanks a ton!
The issue is resolved. The hosts entry of output.logstash contained a URL with an “http://” prefix; after removing it, I am able to see the logs in the Graylog dashboard.
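For anyone hitting the same symptom, the difference is just the scheme prefix: the Logstash output speaks the lumberjack protocol over plain TCP, not HTTP, so `hosts` expects a bare host:port pair. A sketch of the broken vs. working entry (server IP is a placeholder):

```yaml
# Broken: the scheme prefix makes the hostname unresolvable
#output.logstash:
#  hosts: ["http://<graylog_server_ip>:5044"]

# Working:
output.logstash:
  hosts: ["<graylog_server_ip>:5044"]
```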

