I’m having an issue running Filebeat on some Windows VMs: the Filebeat collector/sidecar randomly stops, sometimes after a couple of hours, sometimes after a couple of days. I can’t figure out why it’s stopping, so I raised the log level to debug to gather more detail about the stoppage, but I can’t decipher what is happening.
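For reference, this is how I raised the log level (an excerpt from my filebeat.yml; the selectors line is optional and shown here with the catch-all):

```yaml
# filebeat.yml (logging excerpt)
logging.level: debug
# Optionally narrow debug output to specific components,
# e.g. ["harvester", "service", "publisher"].
logging.selectors: ["*"]
```

Below is the tail of the debug log from the moment it stopped: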
2022-07-12T16:09:27.149-0700 DEBUG [harvester] log/log.go:107 End of file reached: C:\inetpub\logs\LogFiles\W3SVC1\u_ex22071223_x.log; Backoff now.
2022-07-12T16:09:27.903-0700 DEBUG [service] service/service.go:65 Received svc stop/shutdown request
2022-07-12T16:09:27.903-0700 INFO beater/filebeat.go:515 Stopping filebeat
2022-07-12T16:09:27.903-0700 DEBUG [publisher] pipeline/client.go:158 client: closing acker
2022-07-12T16:09:27.903-0700 DEBUG [publisher] pipeline/client.go:163 client: done closing acker
2022-07-12T16:09:27.903-0700 INFO beater/crawler.go:148 Stopping Crawler
2022-07-12T16:09:27.904-0700 INFO beater/crawler.go:158 Stopping 2 inputs
2022-07-12T16:09:27.904-0700 DEBUG [publisher] pipeline/client.go:165 client: unlink from queue
2022-07-12T16:09:27.904-0700 INFO [crawler] beater/crawler.go:163 Stopping input: 12961389870753776509
2022-07-12T16:09:27.904-0700 INFO [crawler] beater/crawler.go:163 Stopping input: 10275958006650416499
2022-07-12T16:09:27.904-0700 INFO input/input.go:136 input ticker stopped
2022-07-12T16:09:27.904-0700 INFO input/input.go:136 input ticker stopped
2022-07-12T16:09:27.904-0700 DEBUG [publisher] pipeline/client.go:187 client: cancelled 0 events
2022-07-12T16:09:27.904-0700 INFO log/harvester.go:329 Reader was closed: C:\inetpub\logs\LogFiles\W3SVC1\u_ex22071223_x.log. Closing.
2022-07-12T16:09:27.904-0700 DEBUG [publisher] pipeline/client.go:167 client: done unlink
2022-07-12T16:09:27.905-0700 DEBUG [publisher] pipeline/client.go:170 client: closing processors
2022-07-12T16:09:27.905-0700 DEBUG [harvester] log/harvester.go:612 Stopping harvester for file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex22071223_x.log
2022-07-12T16:09:27.905-0700 DEBUG [publisher] pipeline/client.go:175 client: done closing processors
2022-07-12T16:09:27.905-0700 DEBUG [harvester] log/harvester.go:622 Closing file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex22071223_x.log
2022-07-12T16:09:27.905-0700 DEBUG [harvester] log/harvester.go:488 Update state: C:\inetpub\logs\LogFiles\W3SVC1\u_ex22071223_x.log, offset: 4408143
2022-07-12T16:09:27.906-0700 DEBUG [harvester] log/harvester.go:633 harvester cleanup finished for file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex22071223_x.log
2022-07-12T16:09:27.906-0700 DEBUG [publisher] pipeline/client.go:158 client: closing acker
2022-07-12T16:09:27.906-0700 DEBUG [publisher] pipeline/client.go:163 client: done closing acker
2022-07-12T16:09:27.906-0700 DEBUG [publisher] pipeline/client.go:165 client: unlink from queue
2022-07-12T16:09:27.906-0700 DEBUG [publisher] pipeline/client.go:187 client: cancelled 0 events
2022-07-12T16:09:27.906-0700 DEBUG [publisher] pipeline/client.go:167 client: done unlink
2022-07-12T16:09:27.906-0700 DEBUG [publisher] pipeline/client.go:170 client: closing processors
2022-07-12T16:09:27.906-0700 DEBUG [publisher] pipeline/client.go:175 client: done closing processors
2022-07-12T16:09:27.906-0700 INFO beater/crawler.go:178 Crawler stopped
2022-07-12T16:09:27.906-0700 INFO [registrar] registrar/registrar.go:132 Stopping Registrar
2022-07-12T16:09:27.906-0700 INFO [registrar] registrar/registrar.go:166 Ending Registrar
2022-07-12T16:09:27.916-0700 DEBUG [registrar] registrar/registrar.go:167 Stopping Registrar
2022-07-12T16:09:27.916-0700 INFO [registrar] registrar/registrar.go:137 Registrar stopped
2022-07-12T16:09:27.921-0700 INFO [monitoring] log/log.go:152 Total non-zero metrics {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":1966781,"time":{"ms":1966781}},"total":{"ticks":2764812,"time":{"ms":2764812},"value":2764812},"user":{"ticks":798031,"time":{"ms":798031}}},"handles":{"open":190},"info":{"ephemeral_id":"ff97be90-9c50-42a7-b4f4-68a4862d390b","uptime":{"ms":119969889}},"memstats":{"gc_next":9458608,"memory_alloc":6029200,"memory_sys":25856928,"memory_total":77213790384,"rss":42139648},"runtime":{"goroutines":15}},"filebeat":{"events":{"active":49,"added":1982148,"done":1982099},"harvester":{"closed":35,"open_files":0,"running":0,"started":35}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":1981870,"active":0,"batches":13590,"total":1981870},"read":{"bytes":81540},"type":"logstash","write":{"bytes":254162631}},"pipeline":{"clients":0,"events":{"active":49,"filtered":229,"published":1981919,"retry":341,"total":1982148},"queue":{"acked":1981870}}},"registrar":{"states":{"cleanup":48,"current":98,"update":1982099},"writes":{"success":13817,"total":13817}},"system":{"cpu":{"cores":4}}}}}
2022-07-12T16:09:27.921-0700 INFO [monitoring] log/log.go:153 Uptime: 33h19m29.8912989s
2022-07-12T16:09:27.921-0700 INFO [monitoring] log/log.go:130 Stopping metrics logging.
2022-07-12T16:09:27.922-0700 INFO instance/beat.go:474 filebeat stopped.
The interesting part is the second log line: I can’t figure out why Filebeat receives a service stop/shutdown request.
Received svc stop/shutdown request
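From what I can tell, that message comes from Filebeat's Windows service integration, and it means the Service Control Manager (or a system shutdown) asked the process to stop; everything after it in the log is an orderly shutdown rather than a crash. Filebeat's service package wraps the standard Go Windows-service handler, so here is a minimal sketch of that pattern to show where such a message would be logged (this is not Filebeat's actual code; the service name and log text are placeholders):

```go
package main

import (
	"log"

	"golang.org/x/sys/windows/svc"
)

// handler implements svc.Handler. The Service Control Manager (SCM)
// delivers control requests (stop, shutdown, interrogate, ...) on the
// channel passed to Execute.
type handler struct{}

func (h *handler) Execute(args []string, req <-chan svc.ChangeRequest, status chan<- svc.Status) (bool, uint32) {
	status <- svc.Status{State: svc.StartPending}
	status <- svc.Status{State: svc.Running, Accepts: svc.AcceptStop | svc.AcceptShutdown}
	for c := range req {
		switch c.Cmd {
		case svc.Interrogate:
			status <- c.CurrentStatus
		case svc.Stop, svc.Shutdown:
			// This is the kind of place a service logs
			// "Received svc stop/shutdown request": the request
			// arrived from outside the process, via the SCM.
			log.Println("received svc stop/shutdown request")
			status <- svc.Status{State: svc.StopPending}
			return false, 0
		}
	}
	return false, 0
}

func main() {
	// "filebeat" is a placeholder service name for this sketch.
	if err := svc.Run("filebeat", &handler{}); err != nil {
		log.Fatal(err)
	}
}
```

If that reading is right, something outside Filebeat is stopping the service, so my next step is to check the Windows System event log for Service Control Manager entries around 16:09:27 (e.g. event ID 7036, "The &lt;name&gt; service entered the stopped state") to see what initiated the stop.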