Graylog monitoring its own logs


I'm 48 hours new to Graylog, and very excited to implement it in production. I'm still testing it and facing some issues. As a start, is it possible to create an input that sends Graylog's and Elasticsearch's own logs into Graylog, to make it easier for me to debug problems?


Yes, install the collector-sidecar


Hi @jochen, after installing the internal logs plugin how do you go about finding logs it is writing?
I looked around for new inputs, streams, etc… but didn’t find anything. I just want to make sure it’s working.

I do see it registered as a plugin under system/nodes/installed plugins.

Also, am I correct in assuming it is writing the logs into a Graylog index?

If the plugin has been loaded correctly, you can create an input on the System/Inputs page for ingesting the internal Graylog log messages.

Thanks. Was that in the directions some place? I looked in a few places and didn’t find anything.

Also, would it make sense to add a "Static Fields" option for this input? It has been a much-appreciated feature for us.

If you think it is, I can add a feature request.

Thanks for the help!

There's no step-by-step documentation for the plugin (it's a Graylog Labs project), but the very first sentence in the README says:

This plugin provides an input for recording Graylog’s internal log messages

This should already be possible, if I’m not completely mistaken. The static fields functionality is available for every type of input.
Have you tried clicking on “More actions” next to the input on the System / Inputs page?


@jochen, you are correct, I should have caught that. I guess that’s what happens when you make such a great product and I don’t have to touch it very often.

Thanks for all the help!

Thank you @jochen for this nice, simple plugin. I can see the logs now after creating the input.

Hello @jochen ,

after installing "graylog-labs/graylog-plugin-internal-logs" I'm getting this error:

Error Message:
Unable to perform search query
Search status code:
Search response:
cannot GET https://myserver/api/search/universal/relative?query=*&range=300&limit=150&sort=timestamp%3Adesc (500)

I'm correlating this error with the plugin because I got the same error when I used the OVA version, which is why I switched to a manual install. The same error happened the next day when I checked the newly installed system.

Any idea what went wrong?


Check the logs of your Graylog and Elasticsearch nodes.

Hi @jochen

It seems there is a problem with indexing, caused by non-numeric values being sent to a numeric field. The question is: how do I fix this mapping and make sure indexing works?

Graylog LOG

2018-05-10T15:42:35.639-04:00 WARN  [Messages] Failed to index message: index=<graylog_0> id=<498c1f60-548a-11e8-876c-000c299fd582> error=<{"type":"mapper_parsing_exception","reason":"failed to parse [response_time]","caused_by":{"type":"number_format_exception","reason":"For input string: \"Name:\t-\n\n\tCaller\""}}>
2018-05-10T15:42:35.640-04:00 ERROR [Messages] Failed to index [1] messages. Please check the index error log in your web interface for the reason. Error: One or more of the items in the Bulk request failed, check BulkResult.getItems() for more information.
2018-05-10T15:43:39.641-04:00 WARN  [Messages] Failed to index message: index=<graylog_0> id=<6fbb3541-548a-11e8-876c-000c299fd582> error=<{"type":"mapper_parsing_exception","reason":"failed to parse [response_time]","caused_by":{"type":"number_format_exception","reason":"For input string: \"Name:\t-\n\n\tCaller\""}}>
2018-05-10T15:43:39.647-04:00 ERROR [Messages] Failed to index [1] messages. Please check the index error log in your web interface for the reason. Error: One or more of the items in the Bulk request failed, check BulkResult.getItems() for more information.
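For reference, the error payload embedded in those WARN lines is plain JSON, so it can be parsed to see exactly which field Elasticsearch rejected. A quick standalone sketch (ordinary Python string parsing, not anything Graylog provides; the sample line is abbreviated from the log above):

```python
import json
import re

log_line = (
    '2018-05-10T15:42:35.639-04:00 WARN  [Messages] Failed to index message: '
    'index=<graylog_0> id=<498c1f60-548a-11e8-876c-000c299fd582> '
    'error=<{"type":"mapper_parsing_exception","reason":"failed to parse [response_time]",'
    '"caused_by":{"type":"number_format_exception","reason":"For input string: \\"Name:\\""}}>'
)

# The Elasticsearch error payload sits between "error=<" and the closing ">".
payload = re.search(r'error=<(\{.*\})>', log_line).group(1)
error = json.loads(payload)

# "failed to parse [field]" names the field whose value conflicts with the mapping.
field = re.search(r'failed to parse \[(\w+)\]', error["reason"]).group(1)
print(field)                        # response_time
print(error["caused_by"]["type"])   # number_format_exception
```

Here it confirms that the `response_time` field is mapped as a number but received a text value.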

Elasticsearch LOG

[2018-05-10T16:14:51,633][DEBUG][o.e.a.b.TransportShardBulkAction] [AN1ps8U] [graylog_0][1] failed to execute bulk item (index) BulkShardRequest [[graylog_0][1]] containing [index {[graylog_deflector][message][cbb95490-548e-11e8-876c-000c299fd582], source[{"winlogbeat_fields_gl2_source_collector":"33e1ca5b-a3c6-40d4-8f5a-9608a5e55e55","winlogbeat_record_number":"4388113","winlogbeat_user_domain":"NT AUTHORITY","collector_node_id":"SENA-GATEWAY","gl2_remote_ip":"","gl2_remote_port":2359,"winlogbeat_level":"Audit Success","winlogbeat_tags":["windows"],"source":"SENA-gateway","type":"eventlogging","gl2_source_input":"5aef333939664b064358f410","winlogbeat_fields_collector_node_id":"SENA-GATEWAY","winlogbeat_user_type":"Well Known Group","winlogbeat_event_data_param10":"-","winlogbeat_event_data_param11":"-","winlogbeat_event_data_param12":"-","winlogbeat_source_name":"Security","gl2_source_node":"d06bcdf8-313b-4c24-ac06-6ba40ed5f5fe","winlogbeat_user_name":"ANONYMOUS LOGON","timestamp":"2018-05-10 20:15:36.000","winlogbeat_log_name":"Security","winlogbeat_user_identifier":"S-1-5-7","gl2_source_collector":"33e1ca5b-a3c6-40d4-8f5a-9608a5e55e55","streams":["000000000000000000000001"],"winlogbeat_type":"eventlogging","message":"Successful Network Logon:\n\n\tUser Name:\t\n\n\tDomain:\t\t\n\n\tLogon ID:\t\t(0x0,0x13A79B9)\n\n\tLogon Type:\t3\n\n\tLogon Process:\tNtLmSsp \n\n\tAuthentication Package:\tNTLM\n\n\tWorkstation Name:\tPNMA\n\n\tLogon GUID:\t-\n\n\tCaller User Name:\t-\n\n\tCaller Domain:\t-\n\n\tCaller Logon ID:\t-\n\n\tCaller Process ID: -\n\n\tTransited Services: -\n\n\tSource Network Address:\t10.0.0.50\n\n\tSource 
Port:\t3040","winlogbeat_event_data_param13":"-","winlogbeat_event_data_param14":"","winlogbeat_event_data_param15":"3040","tags":["windows"],"winlogbeat_event_data_param3":"(0x0,0x13A79B9)","winlogbeat_event_data_param4":"3","winlogbeat_event_data_param5":"NtLmSsp","winlogbeat_event_id":540,"name":"SENA-gateway","response_time":"Name:\t-\n\n\tCaller","winlogbeat_event_data_param6":"NTLM","facility":"winlogbeat","winlogbeat_computer_name":"SENA-GATEWAY","winlogbeat_event_data_param7":"PNMA","winlogbeat_event_data_param8":"-","winlogbeat_event_data_param9":"-","http_response_code":"User"}]}]
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [response_time]
	at org.elasticsearch.index.mapper.FieldMapper.parse( ~[elasticsearch-5.6.9.jar:5.6.9]
	... (many more lines follow)
Caused by: java.lang.NumberFormatException: For input string: "Name:	-

at sun.misc.FloatingDecimal.readJavaFormatString( ~[?:?]

Thank you

You can either modify the messages in question with a pipeline rule (e.g. with rename_field()) or store these messages in a stream with a custom index set (with its own index mapping) and remove them from the "All messages" stream.
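A minimal sketch of the pipeline-rule approach, assuming the offending field is `response_time` as in the logs above. The rule name and the `response_time_text` target field are made up for illustration; `has_field()` and `rename_field()` are built-in pipeline functions:

```
rule "move non-numeric response_time out of the way"
when
  has_field("response_time")
then
  // Park the value in a differently named field so the numeric
  // mapping of response_time no longer rejects the message.
  rename_field("response_time", "response_time_text");
end
```

Attach the rule to a pipeline that is connected to the stream carrying these winlogbeat messages.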


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.