Use Graylog and Kibana at the same time


#1

Hi,

I am trying to use Graylog 3.0.0 and Kibana 6.6.0. I have an ELK stack running on the same version (6.6.0) and changed the output configuration in logstash.conf from this:

output {
    elasticsearch {
        hosts => "localhost:9200"
        index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    }
}

to this configuration to work with graylog:

output {
  gelf {
    host => localhost
    port => 12201
  }
}

However, after switching to the latter configuration I stopped receiving indices with the pattern "filebeat-YYYY.MM.dd" and started receiving "graylog_1", because of the configuration in logstash.conf.
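For context, Logstash does allow several outputs in the same output block, so a combined configuration that keeps feeding Kibana's indices while also sending GELF to Graylog would look roughly like this (a sketch reusing the hosts and ports above; note every event is then shipped twice):

```conf
output {
    # Existing Elasticsearch output, kept for the Kibana index pattern
    elasticsearch {
        hosts => "localhost:9200"
        index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    }
    # GELF output to the Graylog GELF input
    gelf {
        host => "localhost"
        port => 12201
    }
}
```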

Can you please confirm whether this is the right way to configure logstash.conf so the logs are accessible in both Kibana and Graylog?

Thank you so much


(Jan Doberstein) #2

Graylog can only work with messages it has received itself and with indices it manages. You cannot ingest messages into Elasticsearch with Logstash and then work on them in Graylog.


#3

Thanks Jan, the pipeline I have is as follows:
Filebeat collects logs on the server it is hosted on and sends them to Logstash to be transformed and parsed and to have additional fields extracted. Logstash then outputs in GELF format on port 12201. Next, Elasticsearch gets the transformed data from Logstash in the Graylog format and creates the index that Graylog requests.

Then, the index definition in Graylog is internal: it is created in Elasticsearch through Graylog and cannot be changed.

Can you please confirm this logic is correct?

Thanks again for your help and clarification.


(Jan Doberstein) #4

With Graylog you would ingest the Filebeat data directly into Graylog (Beats input), do the transforming, parsing, and extracting of additional fields in Graylog, and push the result to Elasticsearch.
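A minimal filebeat.yml sketch for that setup, assuming a Graylog Beats input listening on port 5044 (the hostname and log paths below are placeholders). Filebeat's Logstash output is used because Graylog's Beats input speaks the same protocol:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/*.log

output.logstash:
  # Point at the Graylog Beats input instead of Logstash
  hosts: ["graylog.example.org:5044"]
```

With this in place, the parsing and field extraction previously done in Logstash would move into Graylog (extractors or pipeline rules).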


#5

Thanks for the clarification Jan, I will change the pipeline accordingly.