Hi,
Excuse me, I have an idea: use the Graylog collector-sidecar to collect logs and forward them to Kafka. I have read some documentation about this setup, but I am still not sure whether it is feasible. My goal is to relieve the load on Elasticsearch.
I did some experiments, but they did not work.
Here is what I have already done:
- Install Kafka.
- Start ZooKeeper.
- Start Kafka.
- Create a topic:
> bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
- Create a Raw/Plaintext Kafka input in the Graylog web interface.
- Edit /etc/graylog/collector-sidecar/generated/filebeat.yml:
filebeat:
  prospectors:
  - encoding: utf-8
    fields:
      collector_node_id: graylog-collector-sidecar
      flag: apache
      gl2_source_collector: f2056814-97f7-4a67-8b05-c9c1ab997107
      type: log
    ignore_older: 0
    paths:
    - /var/log/httpd/*
    scan_frequency: 10s
    tail_files: true
    type: log
#output:
#  logstash:
#    hosts:
#    - localhost:5044
output.kafka:
  # initial brokers for reading cluster metadata
  hosts: ["localhost:9092"]
  # message topic selection + partitioning
  topic: '%{[test]}'
  partition.round_robin:
    reachable_only: false
  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000
path:
  data: /var/cache/graylog/collector-sidecar/filebeat/data
  logs: /var/log/graylog/collector-sidecar
tags:
- apache-log
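One part of the configuration I am not sure about is the topic line. As far as I understand, %{[test]} is Filebeat's format-string syntax for looking up an event field named test, which my messages do not have. Since my topic is literally called test, maybe the output should simply be (just my guess):

output.kafka:
  hosts: ["localhost:9092"]
  topic: 'test'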
The messages never arrive in Kafka. Maybe I am missing something fundamental.
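To check whether anything reaches the topic at all, I can watch it with the console consumer that ships with Kafka (the --bootstrap-server flag is for newer Kafka versions; older ones take --zookeeper localhost:2181 instead):

> bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning

I am also looking at the Filebeat log under /var/log/graylog/collector-sidecar for Kafka output errors.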
Is there another way to send messages to Kafka? I would appreciate any help!