Filebeat logs dropped out

Hi

I have 3 Graylog clusters (no connection between the clusters).
I would like to set up Filebeat to read and ship a log file: I need application logs from one environment and MSSQL log files from another. I set it up, but it doesn’t work, so I started to experiment on the third (test) system. There it works well, so I can’t understand what’s going on…

Server-side versions:
same on the master and the test environment:
graylog-server-2.4.0-9
elasticsearch-5.6.6
mongodb-org-server-3.6.2

Client-side versions:
Graylog Collector Sidecar version 0.0.9
filebeat version 1.2.3 (386)

I use the Graylog Collector Sidecar, and it creates the following file for Filebeat:

filebeat:
  prospectors:
  - document_type: log
    encoding: plain
    exclude_files: []
    fields:
      gl2_source_collector: 694c87d3-a4e8-4133-9a6e-47af95f956f4
    ignore_older: 0
    input_type: log
    paths:
    - c:\test.log
    scan_frequency: 10s
    tail_files: false
output:
  logstash:
    hosts:
    - 10.14.0.XX:5044
filebeat:
  registry_file: C:/Program Files/graylog/collector-sidecar/.filebeat.yml
logging:
  to_files: true
  files:
    path: C:/Program Files/graylog/collector-sidecar
    rotateeverybytes: 10485760 
  level: warning

So this configuration works well with the test environment (though sometimes it misses the first two letters of a line). If I only change the IP address in the collector configuration on the Graylog web interface, it doesn’t work with the live environment. (The two systems are in the same subnet, so there is no different firewall or proxy that could modify the packets.)

I changed the log level to trace, and I got this on the test system:

[root@graylog-t-node-01 ~]# grep 44652dc0-00f1-11e8-87c5-0050569f2180 -i /var/log/graylog-server/server.log
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key message/7, new/old/change: 6/0/13 total: 13
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key source/6, new/old/change: 6/0/12 total: 25
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key timestamp/9, new/old/change: 8/0/17 total: 42
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key name/4, new/old/change: 6/0/10 total: 52
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key type/4, new/old/change: 3/0/7 total: 59
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key facility/8, new/old/change: 8/0/16 total: 75
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key file/4, new/old/change: 12/0/16 total: 91
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key input_type/10, new/old/change: 3/0/13 total: 104
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key count/5, new/old/change: 4/0/9 total: 113
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key offset/6, new/old/change: 4/0/10 total: 123
2018-01-24T11:28:06.940+01:00 DEBUG [ProcessBufferProcessor] Starting to process message <44652dc0-00f1-11e8-87c5-0050569f2180>.
2018-01-24T11:28:06.940+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] stream added: 131
2018-01-24T11:28:06.940+01:00 DEBUG [MessageFilterChainProcessor] Applying filter [Extractor] on message <44652dc0-00f1-11e8-87c5-0050569f2180>.
2018-01-24T11:28:06.940+01:00 DEBUG [MessageFilterChainProcessor] Applying filter [Static field appender] on message <44652dc0-00f1-11e8-87c5-0050569f2180>.
2018-01-24T11:28:06.940+01:00 DEBUG [MessageFilterChainProcessor] Applying filter [Rulesfilter] on message <44652dc0-00f1-11e8-87c5-0050569f2180>.
2018-01-24T11:28:06.940+01:00 DEBUG [MessageFilterChainProcessor] Applying filter [StreamMatcher] on message <44652dc0-00f1-11e8-87c5-0050569f2180>.
2018-01-24T11:28:06.940+01:00 DEBUG [StreamMatcherFilter] Routed message <44652dc0-00f1-11e8-87c5-0050569f2180> to 0 streams.
2018-01-24T11:28:06.941+01:00 DEBUG [ProcessBufferProcessor] Finished processing message <44652dc0-00f1-11e8-87c5-0050569f2180>. Writing to output buffer.
2018-01-24T11:28:06.941+01:00 DEBUG [OutputBufferProcessor] Processing message <44652dc0-00f1-11e8-87c5-0050569f2180> from OutputBuffer.
2018-01-24T11:28:06.941+01:00 TRACE [OutputBufferProcessor] Message id for [class org.graylog2.outputs.BlockingBatchedESOutput]: <44652dc0-00f1-11e8-87c5-0050569f2180>
2018-01-24T11:28:06.941+01:00 DEBUG [OutputBufferProcessor] Wrote message <44652dc0-00f1-11e8-87c5-0050569f2180> to all outputs. Finished handling.
2018-01-24T11:28:07.501+01:00 TRACE [ElasticSearchOutput] Writing message ids to [ElasticSearch Output]: <44649180-00f1-11e8-87c5-0050569f2180, 44652dc0-00f1-11e8-87c5-0050569f2180>
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key offset/6, new/old/change: 4/0/10 total: 10
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key streams/7, new/old/change: 0/0/7 total: 17
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key input_type/10, new/old/change: 3/0/13 total: 30
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key count/5, new/old/change: 4/0/9 total: 39
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key source/6, new/old/change: 6/0/12 total: 51
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key message/7, new/old/change: 6/0/13 total: 64
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key type/4, new/old/change: 3/0/7 total: 71
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key file/4, new/old/change: 12/0/16 total: 87
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key name/4, new/old/change: 6/0/10 total: 97
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key facility/8, new/old/change: 8/0/16 total: 113
2018-01-24T11:28:08.687+01:00 TRACE [Message] [Message size update][44652dc0-00f1-11e8-87c5-0050569f2180] key timestamp/9, new/old/change: 8/0/17 total: 130

And this is what I got on the live system:

[root@graylog-node-01 ~]# grep abb0b065-00ee-11e8-bf19-0050569f426c /var/log/graylog-server/server.log
2018-01-24T11:09:31.750+01:00 TRACE [RawMessageEncoderHandler] Serialized message abb0b065-00ee-11e8-bf19-0050569f426c for journal, size 418 bytes
2018-01-24T11:09:31.750+01:00 TRACE [JournallingMessageHandler] Journalling message abb0b065-00ee-11e8-bf19-0050569f426c
2018-01-24T11:09:31.758+01:00 DEBUG [DecodingProcessor] Dropping incomplete message RawMessage{id=abb0b065-00ee-11e8-bf19-0050569f426c, journalOffset=385255131, codec=beats, payloadSize=297, timestamp=2018-01-24T10:09:31.750Z, remoteAddress=/100.64.3.47:61149} on input <58ee3d6df3f079039382a57b>. Parsed fields: [{gl2_source_collector=694c87d3-a4e8-4133-9a6e-47af95f956f4, file=c:\teszt.log, offset=474, name=L61922, input_type=log, count=1, _id=abb1e8e0-00ee-11e8-bf19-0050569f426c, source=L61922, message=, type=log, facility=filebeat, timestamp=2018-01-24T10:09:26.805Z}]
[root@graylog-node-01 ~]#

I have checked the Graylog configuration, but I don’t see any difference.

Message Processors Configuration

The following message processors are executed in order. Disabled processors will be skipped.
#	Processor	Status
1	GeoIP Resolver	disabled
2	Message Filter Chain	active
3	Pipeline Processor	active
4	AWS Instance Name Lookup	disabled

In the live system I have pipelines configured, but I tried removing the pipeline connections from the “All messages” stream, and it didn’t help. I also tried creating a dummy pipeline in the test system.
I have checked that the input settings are the same on both sides.

Beats Beats 2 RUNNING

   bind_address:
    0.0.0.0
   override_source:
    <empty>
   port:
    5044
   recv_buffer_size:
    1048576
   tcp_keepalive:
    false
   tls_cert_file:
    <empty>
   tls_client_auth:
    disabled
   tls_client_auth_cert_file:
    <empty>
   tls_enable:
    false
   tls_key_file:
    <empty>
   tls_key_password:
    ********

Any idea why the two systems handle the same configuration differently?

Thanks,

M

The DecodingProcessor will drop all incomplete (i.e. invalid) messages, and the message “abb1e8e0-00ee-11e8-bf19-0050569f426c” was dropped because it has an empty mandatory “message” field.
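To make the drop condition concrete, here is a minimal sketch of the check being described (this is an illustration, not Graylog’s actual code): a decoded Beats event whose `message` field is missing or empty counts as incomplete and is dropped.

```python
# Hypothetical sketch of the "incomplete message" check described above.
# A decoded Beats event with a missing or empty "message" field is dropped.
def is_complete(parsed_fields):
    """Return True if the event carries a non-empty 'message' field."""
    return bool(parsed_fields.get("message"))

# Fields as logged for the dropped message above: note message= is empty.
dropped = {"file": r"c:\teszt.log", "type": "log", "message": ""}
ok = {"file": r"c:\teszt.log", "type": "log", "message": "Test message 5."}

print(is_complete(dropped))  # False -> would be dropped as incomplete
print(is_complete(ok))       # True  -> would be processed
```

This matches the log line above: the parsed fields show `message=` with no content, which is why the live node discarded the event.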

@jochen, yes, you are right. I saw the same. I don’t understand why the server did it.
With the same client config my test server accepts the data flow and stores the message, but the live server does not.

For reference:

Yes, it is OK.
I would like to understand why the message field is empty if I point the Filebeat output at the live server, and why it contains the correct data if I point it at the test server.
Why does the full log processing differ? On the test server: Message -> ProcessBufferProcessor -> MessageFilterChainProcessor -> StreamMatcherFilter …etc.,
and on the live server: RawMessageEncoderHandler -> JournallingMessageHandler -> DecodingProcessor.

So I see why Graylog drops the message, but I don’t know what causes it (why does the live Graylog think the message field is empty, when the test one sees its content?).

I would like to find a solution so that messages from the log file arrive in Graylog.

What does that mean?

I don’t know the exact log processing method, but in the two logs, at the same log level (trace), I see different “paths”. I think Graylog puts the processor/module name between [*], and I copied the full grep output, but I don’t see any processors common to both.

I did some more debugging and got more information. Unfortunately, it doesn’t help me.

First I grepped for my IP, then I tried the message ID.

[root@graylog-node-01 ~]# grep 100.64.3.53 /var/log/graylog-server/server.log -B5 -A5
2018-01-25T10:39:25.980+01:00 DEBUG [ProcessBufferProcessor] Finished processing message <a1c7cdc0-01b3-11e8-b8f9-0050569f426c>. Writing to output buffer.
2018-01-25T10:39:25.980+01:00 DEBUG [OutputBufferProcessor] Processing message <a1c7cdc0-01b3-11e8-b8f9-0050569f426c> from OutputBuffer.
2018-01-25T10:39:25.980+01:00 DEBUG [OutputBufferProcessor] Writing message to [class org.graylog2.outputs.BlockingBatchedESOutput].
2018-01-25T10:39:25.980+01:00 TRACE [OutputBufferProcessor] Message id for [class org.graylog2.outputs.BlockingBatchedESOutput]: <a1c7cdc0-01b3-11e8-b8f9-0050569f426c>
2018-01-25T10:39:25.980+01:00 DEBUG [OutputBufferProcessor] Wrote message <a1c7cdc0-01b3-11e8-b8f9-0050569f426c> to all outputs. Finished handling.
2018-01-25T10:39:26.013+01:00 TRACE [58ee3d6df3f079039382a57b] Recv network data: 8 bytes via input 'Beats' <58ee3d6df3f079039382a57b> from remote address /100.64.3.53:57260
2018-01-25T10:39:26.013+01:00 TRACE [58ee3d6df3f079039382a57b] Recv network data: 4 bytes via input 'Beats' <58ee3d6df3f079039382a57b> from remote address /100.64.3.53:57260
2018-01-25T10:39:26.013+01:00 TRACE [58ee3d6df3f079039382a57b] Recv network data: 262 bytes via input 'Beats' <58ee3d6df3f079039382a57b> from remote address /100.64.3.53:57260
2018-01-25T10:39:26.014+01:00 TRACE [RawMessageEncoderHandler] Serialized message a1ccfde0-01b3-11e8-b8f9-0050569f426c for journal, size 482 bytes
2018-01-25T10:39:26.014+01:00 DEBUG [JournallingMessageHandler] End of batch, journalling 1 messages
2018-01-25T10:39:26.014+01:00 TRACE [JournallingMessageHandler] Journalling message a1ccfde0-01b3-11e8-b8f9-0050569f426c
2018-01-25T10:39:26.014+01:00 TRACE [KafkaJournal] Message a1 cc fd e0 01 b3 11 e8 b8 f9 00 50 56 9f 42 6c contains bytes 08 01 11 e8 11 b3 01 e0 fd cc a1 19 6c 42 9f 56 50 00 f9 b8 21 be ac af 2c 61 01 00 00 2a 07 0a 05 62 65 61 74 73 32 42 0a 24 34 39 38 30 38 63 66 62 2d 39 37 31 37 2d 34 64 31 61 2d 38 30 30 39 2d 33 62 63 30 30 34 37 31 31 62 65 30 10 00 1a 18 35 38 65 65 33 64 36 64 66 33 66 30 37 39 30 33 39 33 38 32 61 35 37 62 3a 0a 0a 04 64 40 03 35 10 ac bf 03 42 e9 02 7b 22 40 6d 65 74 61 64 61 74 61 22 3a 7b 22 74 79 70 65 22 3a 22 6c 6f 67 22 2c 22 62 65 61 74 22 3a 22 66 69 6c 65 62 65 61 74 22 7d 2c 22 74 79 70 65 22 3a 22 6c 6f 67 22 2c 22 69 6e 70 75 74 5f 74 79 70 65 22 3a 22 6c 6f 67 22 2c 22 63 6f 75 6e 74 22 3a 31 2c 22 66 69 65 6c 64 73 22 3a 7b 22 67 6c 32 5f 73 6f 75 72 63 65 5f 63 6f 6c 6c 65 63 74 6f 72 22 3a 22 38 63 33 35 32 63 62 66 2d 65 33 66 35 2d 34 39 38 35 2d 61 30 31 38 2d 32 34 30 33 33 35 36 64 63 39 35 61 22 7d 2c 22 6f 66 66 73 65 74 22 3a 32 34 31 36 2c 22 73 6f 75 72 63 65 22 3a 22 63 3a 5c 5c 74 65 73 7a 74 2e 6c 6f 67 22 2c 22 6d 65 73 73 61 67 65 22 3a 22 4a 61 6e 20 32 35 20 31 33 3a 35 30 3a 30 34 20 67 72 61 79 6c 6f 67 2d 6e 6f 64 65 2d 30 31 20 73 79 73 74 65 6d 64 2d 6c 6f 67 69 6e 64 3a 20 54 65 73 74 20 6d 65 73 73 61 67 65 20 35 2e 22 2c 22 62 65 61 74 22 3a 7b 22 68 6f 73 74 6e 61 6d 65 22 3a 22 4c 36 31 39 32 32 22 2c 22 6e 61 6d 65 22 3a 22 4c 36 31 39 32 32 22 7d 2c 22 40 74 69 6d 65 73 74 61 6d 70 22 3a 22 32 30 31 38 2d 30 31 2d 32 35 54 30 39 3a 33 39 3a 32 33 2e 35 39 30 5a 22 7d
2018-01-25T10:39:26.014+01:00 DEBUG [KafkaJournal] Trying to write ByteBufferMessageSet with size of 524 bytes to journal
[root@graylog-node-01 ~]# grep a1ccfde0-01b3-11e8-b8f9-0050569f426c  /var/log/graylog-server/server.log -B5 -A5
2018-01-25T10:39:25.980+01:00 TRACE [OutputBufferProcessor] Message id for [class org.graylog2.outputs.BlockingBatchedESOutput]: <a1c7cdc0-01b3-11e8-b8f9-0050569f426c>
2018-01-25T10:39:25.980+01:00 DEBUG [OutputBufferProcessor] Wrote message <a1c7cdc0-01b3-11e8-b8f9-0050569f426c> to all outputs. Finished handling.
2018-01-25T10:39:26.013+01:00 TRACE [58ee3d6df3f079039382a57b] Recv network data: 8 bytes via input 'Beats' <58ee3d6df3f079039382a57b> from remote address /100.64.3.53:57260
2018-01-25T10:39:26.013+01:00 TRACE [58ee3d6df3f079039382a57b] Recv network data: 4 bytes via input 'Beats' <58ee3d6df3f079039382a57b> from remote address /100.64.3.53:57260
2018-01-25T10:39:26.013+01:00 TRACE [58ee3d6df3f079039382a57b] Recv network data: 262 bytes via input 'Beats' <58ee3d6df3f079039382a57b> from remote address /100.64.3.53:57260
2018-01-25T10:39:26.014+01:00 TRACE [RawMessageEncoderHandler] Serialized message a1ccfde0-01b3-11e8-b8f9-0050569f426c for journal, size 482 bytes
2018-01-25T10:39:26.014+01:00 DEBUG [JournallingMessageHandler] End of batch, journalling 1 messages
2018-01-25T10:39:26.014+01:00 TRACE [JournallingMessageHandler] Journalling message a1ccfde0-01b3-11e8-b8f9-0050569f426c
2018-01-25T10:39:26.014+01:00 TRACE [KafkaJournal] Message a1 cc fd e0 01 b3 11 e8 b8 f9 00 50 56 9f 42 6c contains bytes 08 01 11 e8 11 b3 01 e0 fd cc a1 19 6c 42 9f 56 50 00 f9 b8 21 be ac af 2c 61 01 00 00 2a 07 0a 05 62 65 61 74 73 32 42 0a 24 34 39 38 30 38 63 66 62 2d 39 37 31 37 2d 34 64 31 61 2d 38 30 30 39 2d 33 62 63 30 30 34 37 31 31 62 65 30 10 00 1a 18 35 38 65 65 33 64 36 64 66 33 66 30 37 39 30 33 39 33 38 32 61 35 37 62 3a 0a 0a 04 64 40 03 35 10 ac bf 03 42 e9 02 7b 22 40 6d 65 74 61 64 61 74 61 22 3a 7b 22 74 79 70 65 22 3a 22 6c 6f 67 22 2c 22 62 65 61 74 22 3a 22 66 69 6c 65 62 65 61 74 22 7d 2c 22 74 79 70 65 22 3a 22 6c 6f 67 22 2c 22 69 6e 70 75 74 5f 74 79 70 65 22 3a 22 6c 6f 67 22 2c 22 63 6f 75 6e 74 22 3a 31 2c 22 66 69 65 6c 64 73 22 3a 7b 22 67 6c 32 5f 73 6f 75 72 63 65 5f 63 6f 6c 6c 65 63 74 6f 72 22 3a 22 38 63 33 35 32 63 62 66 2d 65 33 66 35 2d 34 39 38 35 2d 61 30 31 38 2d 32 34 30 33 33 35 36 64 63 39 35 61 22 7d 2c 22 6f 66 66 73 65 74 22 3a 32 34 31 36 2c 22 73 6f 75 72 63 65 22 3a 22 63 3a 5c 5c 74 65 73 7a 74 2e 6c 6f 67 22 2c 22 6d 65 73 73 61 67 65 22 3a 22 4a 61 6e 20 32 35 20 31 33 3a 35 30 3a 30 34 20 67 72 61 79 6c 6f 67 2d 6e 6f 64 65 2d 30 31 20 73 79 73 74 65 6d 64 2d 6c 6f 67 69 6e 64 3a 20 54 65 73 74 20 6d 65 73 73 61 67 65 20 35 2e 22 2c 22 62 65 61 74 22 3a 7b 22 68 6f 73 74 6e 61 6d 65 22 3a 22 4c 36 31 39 32 32 22 2c 22 6e 61 6d 65 22 3a 22 4c 36 31 39 32 32 22 7d 2c 22 40 74 69 6d 65 73 74 61 6d 70 22 3a 22 32 30 31 38 2d 30 31 2d 32 35 54 30 39 3a 33 39 3a 32 33 2e 35 39 30 5a 22 7d
2018-01-25T10:39:26.014+01:00 DEBUG [KafkaJournal] Trying to write ByteBufferMessageSet with size of 524 bytes to journal
2018-01-25T10:39:26.014+01:00 DEBUG [KafkaJournal] Wrote 1 messages to journal: 524 bytes (payload 482 bytes), log position 388049429 to 388049429
2018-01-25T10:39:26.014+01:00 DEBUG [JournallingMessageHandler] Processed batch, wrote 482 bytes, last journal offset: 388049429, signalling reader.
2018-01-25T10:39:26.014+01:00 DEBUG [JournalReader] Messages have been written to Journal, continuing to read.

If you convert the hex dump to ASCII, you get:

{"@metadata":{"type":"log","beat":"filebeat"},"type":"log","input_type":"log","count":1,"fields":{"gl2_source_collector":"8c352cbf-e3f5-4985-a018-2403356dc95a"},"offset":2416,"source":"c:\\teszt.log","message":"Jan 25 13:50:04 graylog-node-01 systemd-logind: Test message 5.","beat":{"hostname":"L61922","name":"L61922"},"@timestamp":"2018-01-25T09:39:23.590Z"}
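The conversion itself is trivial; for example, with a short excerpt of the hex dump from the `KafkaJournal` line above:

```python
# Decode a KafkaJournal hex dump back to text.
# The excerpt below is the start of the JSON payload from the journal line above.
hex_dump = "7b 22 40 6d 65 74 61 64 61 74 61 22 3a"
decoded = bytes.fromhex(hex_dump.replace(" ", "")).decode("utf-8")
print(decoded)  # {"@metadata":
```

The same call applied to the full payload section of the dump yields the JSON document shown above.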

The client-side Filebeat log contains:

2018-01-25T10:39:23+01:00 DBG  Start next scan
2018-01-25T10:39:23+01:00 DBG  scan path c:\nem_letezo_file.log
2018-01-25T10:39:23+01:00 DBG  full line read
2018-01-25T10:39:23+01:00 DBG  End of file reached: c:\teszt.log; Backoff now.
2018-01-25T10:39:23+01:00 DBG  Start next scan
2018-01-25T10:39:23+01:00 DBG  scan path c:\teszt.log
2018-01-25T10:39:23+01:00 DBG  Check file for harvesting: c:\teszt.log
2018-01-25T10:39:23+01:00 DBG  Same file as before found. Fetch the state.
2018-01-25T10:39:23+01:00 DBG  Update existing file for harvesting: c:\teszt.log
2018-01-25T10:39:23+01:00 DBG  Not harvesting, file didn't change: c:\teszt.log
2018-01-25T10:39:24+01:00 DBG  End of file reached: c:\teszt.log; Backoff now.
2018-01-25T10:39:25+01:00 DBG  Flushing spooler because of timeout. Events flushed: 1
2018-01-25T10:39:25+01:00 DBG  Publish: {
  "@timestamp": "2018-01-25T09:39:23.590Z",
  "beat": {
    "hostname": "L61922",
    "name": "L61922"
  },
  "count": 1,
  "fields": {
    "gl2_source_collector": "8c352cbf-e3f5-4985-a018-2403356dc95a"
  },
  "input_type": "log",
  "message": "Jan 25 13:50:04 graylog-node-01 systemd-logind: Test message 5.",
  "offset": 2416,
  "source": "c:\\teszt.log",
  "type": "log"
}
2018-01-25T10:39:25+01:00 DBG  output worker: publish 1 events
2018-01-25T10:39:25+01:00 DBG  Try to publish 1 events to logstash with window size 17
2018-01-25T10:39:25+01:00 DBG  1 events out of 1 events sent to logstash. Continue sending ...
2018-01-25T10:39:25+01:00 DBG  send completed
2018-01-25T10:39:25+01:00 INFO Events sent: 1
2018-01-25T10:39:25+01:00 DBG  Processing 1 events
2018-01-25T10:39:25+01:00 DBG  Write registry file: C:\.filebeat
2018-01-25T10:39:25+01:00 INFO Registry file updated. 1 states written.
2018-01-25T10:39:26+01:00 DBG  End of file reached: c:\teszt.log; Backoff now.

// The date in the message field is not real; I inserted this line into the log file manually.

So the message field is not empty, but in this case I didn’t find any dropping or processing of the message, just the write to the journal.

I also found this message on the Graylog web page. I don’t find the same ID, but it usually appears when I test, so I’m not sure this message is related to my main problem.

|Timestamp|Index|Letter ID|Error message|
|15 minutes ago|graylog-_13|a4461430-01b3-11e8-b8f9-0050569f426c|{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"illegal_argument_exception","reason":"Can't parse [index] value [not_analyzed] for field [input_type], expected [true] or [false]"}}|

Have you upgraded Elasticsearch recently?
Try rotating the active write-index via System/Indices/Index Set/Maintenance.
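For context (assuming the rejected mapping came from an index template written for the previous Elasticsearch version): in Elasticsearch 2.x a non-analyzed string field was declared with `"type": "string", "index": "not_analyzed"`, while in 5.x the `index` option only accepts `true`/`false` — hence the “expected [true] or [false]” error — and the equivalent declaration is a `keyword` field, roughly:

```json
{
  "properties": {
    "input_type": { "type": "keyword" }
  }
}
```

Rotating the active write index lets Graylog create a fresh index with a 5.x-compatible mapping.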

I did update it 1–2 weeks ago (the Filebeat usage is a new need).
I did the rotation, and the messages now appear in the search.

But we still run the old Elasticsearch on the third system. We are planning the update, so maybe it will solve the problem there too.

Thanks for your assistance.

See, that’s why you always provide the full logs if you have a problem and not just some arbitrary lines you think are important. :wink:

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.