Transfer Symfony logfiles with Filebeat to Graylog in a local Docker environment

I am trying to mirror our production setup in my local Docker environment. After spending some time investigating and rebuilding the Docker container setup, I still can't get it to work: Graylog is not receiving any data.

Overview and interim results

  • web, php and db containers are in use for the Symfony-based application
  • Symfony runs properly on localhost in the php container and generates logfiles
  • the Symfony logfiles are located here: /var/www/html/var/logs/*.log
  • the Symfony logfile format is JSON / GELF
  • all other containers are also up and running when starting the complete composition
  • the Filebeat configuration is based on the first link below
  • filebeat.yml seems to retrieve any logfile found in any container
  • Filebeat is configured to transfer data directly to Elasticsearch
  • Elasticsearch persists data in MongoDB
  • all Graylog-related data is persisted in named volumes in Docker
  • additionally I am working with docker-sync on a Mac
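
For reference, a GELF message is a flat JSON payload; per the GELF 1.1 spec the mandatory fields are `version`, `host` and `short_message`, and any additional fields are prefixed with an underscore. The concrete values below are purely illustrative:

```json
{
  "version": "1.1",
  "host": "php-container",
  "short_message": "Request completed",
  "timestamp": 1530000000.0,
  "level": 6,
  "_channel": "graylog"
}
```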

config.yml

# Monolog Configuration

monolog:
    channels: [graylog]
    handlers:
        graylog:
            type: stream
            formatter: line_formatter
            path: "%kernel.logs_dir%/graylog.log"
            channels: [graylog]
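
The `formatter: line_formatter` key refers to a service that has to be defined elsewhere in the Symfony configuration. Since the goal is JSON output, a minimal sketch of such a service definition could look like the following (the service id `line_formatter` is taken from the config above; using Monolog's `JsonFormatter` here is an assumption about how the JSON log lines are produced):

```yaml
# services.yml (sketch)
services:
    line_formatter:
        class: Monolog\Formatter\JsonFormatter
```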

docker-compose.yml

version: "3"
services:
    web:
        image: nginx
        ports:
            - "80:80"
            - "443:443"
        links:
            - php
        volumes:
            - ./docker-config/nginx.conf:/etc/nginx/conf.d/default.conf
            - project-app-sync:/var/www/html
            - ./docker-config/localhost.crt:/etc/nginx/ssl/localhost.crt
            - ./docker-config/localhost.key:/etc/nginx/ssl/localhost.key

    php:
        build:
            context: .
            dockerfile: ./docker-config/Dockerfile-php
        links:
            - graylog
        volumes:
            - project-app-sync:/var/www/html
            - ./docker-config/php.ini:/usr/local/etc/php/php.ini
            - ./docker-config/www.conf:/usr/local/etc/php-fpm.d/www.conf

    db:
        image: mysql
        ports:
            - "3306:3306"
        environment:
            - MYSQL_ALLOW_EMPTY_PASSWORD=yes
            - MYSQL_DATABASE=project
            - MYSQL_USER=project
            - MYSQL_PASSWORD=password
        volumes:
            - ./docker-config/mysql.cnf:/etc/mysql/conf.d/mysql.cnf
            - project-mysql-sync:/var/lib/mysql

    # Graylog / Filebeat

    filebeat:
        build: ./docker-config/filebeat
        volumes:
          - /var/lib/docker/containers:/var/lib/docker/containers:ro
          - /var/run/docker.sock:/var/run/docker.sock
        networks:
          - graylog-network
        depends_on:
          - graylog-elasticsearch

    graylog:
        image: graylog/graylog:2.4
        volumes:
          - graylog-journal:/usr/share/graylog/data/journal
        networks:
          - graylog-network
        environment:
          - GRAYLOG_PASSWORD_SECRET=somepasswordpepper
          - GRAYLOG_ROOT_PASSWORD_SHA2=8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918
          - GRAYLOG_WEB_ENDPOINT_URI=http://127.0.0.1:9000/api
        links:
          - graylog-mongo:mongo
          - graylog-elasticsearch:elasticsearch
        depends_on:
          - graylog-mongo
          - graylog-elasticsearch
        ports:
          # Graylog web interface and REST API
          - 9000:9000

    graylog-mongo:
        image: mongo:3
        volumes:
            - graylog-mongo-data:/data/db
        networks:
            - graylog-network

    graylog-elasticsearch:
        image: docker.elastic.co/elasticsearch/elasticsearch:5.6.10
        ports:
            - "9200:9200"
        volumes:
            - graylog-elasticsearch-data:/usr/share/elasticsearch/data
        networks:
            - graylog-network
        environment:
            - cluster.name=graylog
            - "discovery.zen.minimum_master_nodes=1"
            - "discovery.type=single-node"
            - http.host=0.0.0.0
            - transport.host=localhost
            - network.host=0.0.0.0
            # Disable X-Pack security: https://www.elastic.co/guide/en/elasticsearch/reference/5.6/security-settings.html#general-security-settings
            - xpack.security.enabled=false
            - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
        ulimits:
            memlock:
                soft: -1
                hard: -1

volumes:
    project-app-sync:
        external: true
    project-mysql-sync: ~
    graylog-mongo-data:
        driver: local
    graylog-elasticsearch-data:
        driver: local
    graylog-journal:
        driver: local

networks:
    graylog-network: ~
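
One thing worth noting about the composition above: the filebeat service mounts the Docker container logs, but not the `project-app-sync` volume, so the Symfony logfiles under `/var/www/html/var/logs/` do not exist inside the Filebeat container. A sketch of the filebeat service with the missing mount added, assuming the shared named volume should simply be reused read-only:

```yaml
    filebeat:
        build: ./docker-config/filebeat
        volumes:
          - /var/lib/docker/containers:/var/lib/docker/containers:ro
          - /var/run/docker.sock:/var/run/docker.sock
          # mount the shared app volume so Filebeat can read the Symfony logs
          - project-app-sync:/var/www/html:ro
        networks:
          - graylog-network
        depends_on:
          - graylog-elasticsearch
```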

Dockerfile of filebeat container

FROM docker.elastic.co/beats/filebeat:6.3.1
COPY filebeat.yml /usr/share/filebeat/filebeat.yml
# must run as root to access /var/lib/docker and /var/run/docker.sock
USER root
RUN chown root /usr/share/filebeat/filebeat.yml
# don't run with -e, to disable output to stderr
CMD [""]

filebeat.yml

filebeat.prospectors:
- type: docker
  paths:
    - '/var/lib/docker/containers/*/*.log'
    # path to symfony based logs
    - '/var/www/html/var/logs/*.log'
  containers.ids: '*'

processors:
  - decode_json_fields:
      fields: ["host","application","short_message"]
      target: ""
      overwrite_keys: true
  - add_docker_metadata: ~

output.elasticsearch:
  # transfer data to elasticsearch container?
  hosts: ["localhost:9200"]

logging.to_files: true
logging.to_syslog: false
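
A likely problem with the output section above: inside the Filebeat container, `localhost:9200` refers to the Filebeat container itself, not to the Elasticsearch container. On a user-defined Compose network, Docker's embedded DNS resolves service names, so direct Elasticsearch output (if that were the goal) would have to address the service by name:

```yaml
output.elasticsearch:
  # inside the container, "localhost" is the filebeat container itself;
  # the compose service name resolves via Docker's network DNS
  hosts: ["graylog-elasticsearch:9200"]
```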

Graylog backend

After setting up this Docker composition I opened the Graylog web interface and set up a collector and input as described here:

Maybe I have totally misunderstood how this should work. I am not sure whether Beats from Elastic is the same thing as the Filebeat container, or whether the sidecar collector is something extra I forgot to add. Maybe I misconfigured the collector and input in Graylog?!

I would be thankful for any help or a working example for my problem …
Regards,
Sandeep,
Elasticsearch Developer.

Your Filebeat needs to use the Logstash output section.

output.logstash:
    hosts: ["gray.log.lan:5044"]

The message flow must go through Graylog and not directly into Elasticsearch, because Graylog processes messages on ingest.

In Graylog you only need to create a Beats input (System > Inputs) that your Filebeat can send messages to.
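
Applied to the docker-compose.yml from the question, the output would presumably point at the graylog service itself (assuming a Beats input listening on port 5044, and the filebeat service being attached to the same network as graylog or linked to it):

```yaml
output.logstash:
  # "graylog" is the compose service name of the Graylog container;
  # 5044 is the conventional Beats port and must match the bind port
  # configured on the Beats input under System > Inputs
  hosts: ["graylog:5044"]
```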
