Issue: When making a content pack, entities selected in the “Content Selection” step don’t (always) carry over to the “Parameters” step, which subsequently results in an empty content pack.
Use Case: I’m hosting a Graylog environment for multiple developers. On deployment I auto-load a content pack that ensures certain dashboards are available. After this content pack has been loaded, users can add notifications, dashboards, etc. to this Graylog instance. My goal is to expand the content pack used in the auto-load with the new content created by developers, enabling me to update or tweak the container without the risk of losing any content.
Expected behaviour: all selected entities end up in the created content pack
Method: In the web browser I select “Content packs” from the menu, followed by “Create Content Pack”, or I select an existing content pack and use “Create New From Revision”. Both options show the same result: some entities transfer properly but some don’t, and each attempt seems to come out differently.
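To pin down which selections actually made it into a pack, one option is to download the pack’s JSON from the web interface and count its entities per type. Below is a minimal sketch, assuming the v2 content pack layout in which the top-level "entities" array holds items carrying a {"type": {"name": ...}} object; the sample pack here is made up purely for illustration:

```python
from collections import Counter

def entity_type_counts(pack: dict) -> Counter:
    """Count a content pack's entities, grouped by type name.

    Assumes the v2 content pack layout: a top-level "entities" list
    whose items each carry {"type": {"name": ...}}.
    """
    return Counter(
        entity.get("type", {}).get("name", "unknown")
        for entity in pack.get("entities", [])
    )

# Made-up pack fragment for illustration; a real one would be read
# with json.load() from a pack downloaded via the web interface.
sample_pack = {
    "v": "1",
    "name": "example",
    "entities": [
        {"type": {"name": "dashboard", "version": "2"}},
        {"type": {"name": "dashboard", "version": "2"}},
        {"type": {"name": "input", "version": "1"}},
    ],
}
print(entity_type_counts(sample_pack))
```

Running this against a freshly created pack makes the symptom concrete: an empty counter means the pack really was exported without any entities, rather than the UI merely displaying it wrong.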
Setup: I’m using a docker-compose setup in which Graylog works with Elasticsearch and MongoDB.
My docker-compose looks like this:
```yaml
version: "3.8"
services:
  mongodb:
    image: "mongo:latest"
    container_name: "mongodb"
    volumes:
      - "mongodb_data:/data/db"
    restart: "always"
    mem_limit: 3g
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch-oss:7.10.2
    container_name: "elasticsearch"
    restart: always
    volumes:
      - es_data:/opt/graylog/data/elasticsearch
    environment:
      - http.host=0.0.0.0
      - transport.host=localhost
      - network.host=0.0.0.0
      - "ES_JAVA_OPTS=-Dlog4j2.formatMsgNoLookups=true -Xms2g -Xmx2g"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    mem_limit: 3g
  graylog:
    hostname: "graylog"
    container_name: "graylog"
    image: graylog/graylog:5.2
    depends_on:
      elasticsearch:
        condition: "service_started"
      mongodb:
        condition: "service_started"
    entrypoint: /usr/bin/tini -- wait-for-it elasticsearch:9200 -- /docker-entrypoint.sh
    environment:
      GRAYLOG_PASSWORD_SECRET: {{graylog password secret}}
      GRAYLOG_ROOT_PASSWORD_SHA2: {{graylog sha password}}
      GRAYLOG_HTTP_BIND_ADDRESS: "0.0.0.0:9000"
      GRAYLOG_HTTP_EXTERNAL_URI: {{http graylog uri}}
      GRAYLOG_WEB_ENDPOINT_URI: {{graylog uri}}
      GRAYLOG_ELASTICSEARCH_HOSTS: {{internal elasticsearchUri}}
      GRAYLOG_MONGODB_URI: {{internal mongodbUri}}
      GRAYLOG_TRANSPORT_EMAIL_WEB_INTERFACE_URL: {{emailwebinterfaceport}}
      GRAYLOG_TRANSPORT_EMAIL_ENABLED: "true"
      GRAYLOG_TRANSPORT_PROTOCOL: "smtp"
      GRAYLOG_TRANSPORT_EMAIL_HOSTNAME: {{emailhostname}}
      GRAYLOG_TRANSPORT_EMAIL_PORT: {{emailport}}
      GRAYLOG_TRANSPORT_EMAIL_USE_AUTH: "false"
      GRAYLOG_TRANSPORT_EMAIL_USE_TLS: "false"
      GRAYLOG_TRANSPORT_EMAIL_USE_SSL: "false"
      GRAYLOG_TRANSPORT_EMAIL_FROM_NAME: "Graylog"
      GRAYLOG_TRANSPORT_EMAIL_FROM_EMAIL: {{our_devteam_emailaddress}}
      GRAYLOG_TRANSPORT_SUBJECT_PREFIX: "[graylog]"
      GRAYLOG_CONTENT_PACKS_AUTO_INSTALL: "content-pack.json"
      GRAYLOG_CONTENT_PACKS_DIR: "data/contentpacks"
      GRAYLOG_CONTENT_PACKS_LOADER_ENABLED: "true"
      GRAYLOG_SERVER_JAVA_OPTS: "-Xms1g -Xmx1g -XX:NewRatio=1 -server -XX:+ResizeTLAB -XX:-OmitStackTraceInFastThrow"
    ports: {{all our ports}}
    volumes:
      - "graylog_data:/usr/share/graylog/data/data"
      - "./contentpacks:/usr/share/graylog/data/contentpacks/"
    restart: "always"
    mem_limit: 3g
volumes:
  mongodb_data:
  es_data:
  graylog_data:
  graylog_journal:
```
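One hosting tweak worth considering, though it is unrelated to the content pack UI behaviour itself: `condition: "service_started"` only waits for the dependency containers to start, not for Elasticsearch and MongoDB to actually accept connections. A hedged sketch of healthchecks combined with `service_healthy`, assuming a Compose version that supports `depends_on` conditions and that `curl`/`mongosh` are available inside the respective images; the intervals are made-up values to adjust for your environment:

```yaml
services:
  elasticsearch:
    healthcheck:
      # Probe the cluster health endpoint instead of trusting container start.
      test: ["CMD-SHELL", "curl -sf http://localhost:9200/_cluster/health || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 5
  mongodb:
    healthcheck:
      # A simple ping command confirms mongod is answering queries.
      test: ["CMD-SHELL", "mongosh --quiet --eval 'db.runCommand({ping: 1})' || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 5
  graylog:
    depends_on:
      elasticsearch:
        condition: "service_healthy"
      mongodb:
        condition: "service_healthy"
```

This keeps Graylog from racing its dependencies during startup, which can matter when the content pack loader runs early in the boot sequence.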
Is there something wrong with the way I host Graylog, or with my approach to the problem?