Failed to index message: failed to parse field [level] of type [long] in document

Hi.

I am running Graylog 3.1 on CentOS 7.7 with Elasticsearch 6.8. I am having an issue where Graylog is rejecting messages with this error:

2020-06-16T11:36:02.881Z WARN [Messages] Failed to index message: index=<grayloglod_20> id= error=<{"type":"mapper_parsing_exception","reason":"failed to parse field [level] of type [long] in document with id 'id'","caused_by":{"type":"illegal_argument_exception","reason":"For input string: \"INFO\""}}>

I do not have any configuration that specifies the "level" field must be an int.

I have other Graylog instances on the same versions where this does not occur, and on those installs the "level" field is successfully translated from INFO to 6. I do not have an Elasticsearch template or a Graylog pipeline to do this, so I am unsure where it happens.

I have tried deleting the indices to no avail.

Any suggestions would be appreciated. Thank you

You are trying to send a message where the field is a string ("INFO") and not a long (numeric), so Graylog can't save it because of the type mismatch.

Check:

  1. that your message's level field is only ever numeric
  2. that you are not using an extractor that converts level from numeric to string and writes it back to the same field
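The `caused_by` in the error above ("For input string: \"INFO\"") is Elasticsearch failing to parse a string as a number. A minimal Python analogue of what happens when the mapping says `[level]` is type `long` (illustrative only, not Elasticsearch code):

```python
def parse_level(value):
    """Coerce a level value to int, as a numeric field mapping requires."""
    return int(value)  # raises ValueError for non-numeric input

parse_level("6")  # fine: the field really is numeric
try:
    parse_level("INFO")  # fails, analogous to 'For input string: "INFO"'
except ValueError as e:
    print("rejected:", e)
```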

Hi shoothub.

Thank you for your response.

I have not configured any Elasticsearch templates, and none of the defaults specify that "level" must be an int.

The only mention of "level" in all the templates is:

```json
"mappings" : {
  "audit_message" : {
    "properties" : {
      "job_id" : {
        "type" : "keyword"
      },
      "level" : {
        "type" : "keyword"
      },
      "message" : {
        "type" : "text",
        "fields" : {
          "raw" : {
            "type" : "keyword"
          }
        }
      },
      "timestamp" : {
        "type" : "date"
      },
      "node_name" : {
        "type" : "keyword"
      }
    }
  }
},
```

I have one extractor on the input for kubernetes metadata:

Configuration

  • list_separator: ,
  • kv_separator: =
  • key_prefix: k8s_
  • key_separator: _
  • replace_key_whitespace:
  • key_whitespace_replacement: _

I have other installs with the same configuration, with Kubernetes projects sending the exact same logs, where this is not a problem.

Cheers


Pete

Graylog uses Elasticsearch dynamic mapping to define the schema, so if the level field of the first message indexed is numeric (float), the field will be created with that type. A later message whose level field is a string will then fail because of the type conflict. The best approach is to create separate indices for messages from different sources and not mix them all together…

Check also custom index mapping in docs:
https://docs.graylog.org/en/3.1/pages/configuration/elasticsearch.html#custom-index-mappings
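To make the dynamic-mapping behaviour concrete, here is a hedged Python sketch (not Elasticsearch internals): the first document to contain a field fixes that field's type, and later documents with an incompatible type for the same field are rejected.

```python
class DynamicIndex:
    """Toy model of Elasticsearch dynamic mapping for illustration."""

    def __init__(self):
        self.mapping = {}  # field name -> type inferred from first document

    def index(self, doc):
        for field, value in doc.items():
            # First occurrence of a field fixes its type in the mapping.
            inferred = self.mapping.setdefault(field, type(value))
            if not isinstance(value, inferred):
                raise TypeError(
                    f"failed to parse field [{field}] of type "
                    f"[{inferred.__name__}]"
                )

idx = DynamicIndex()
idx.index({"level": 6})           # first message: [level] becomes numeric
try:
    idx.index({"level": "INFO"})  # later string value is rejected
except TypeError as e:
    print(e)
```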


Ahah, that is very useful information. Thank you very much shoothub!

Hi shoothub.

Sorry to pester you, but in the event that all logs are coming in through the same input, is there any way I could add a pipeline that would rewrite any non-integer level fields?

Something like:

```
rule "level to long"
when
  has_field("level") && $message.level == "info"
then
  remove_field("level");
  set_field("level", 6);
end
```

Or is this a bad idea?

Thanks for help.

Not a bad idea… it should definitely work…

Thanks again shoothub, it does indeed appear to be working.

For posterity:

```
rule "info level to int"
when
  has_field("level") && lowercase(to_string($message.level)) == "info"
then
  remove_field("level");
  set_field("level", 6);
end

rule "warn level to int"
when
  has_field("level") && lowercase(to_string($message.level)) == "warn"
then
  remove_field("level");
  set_field("level", 4);
end
```

I added these to a pipeline connected to all messages.
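The two rules above cover only "info" and "warn"; the numbers they assign (6 and 4) come from the standard syslog severity table (RFC 5424), so the same approach could be extended to the other level names. A hedged Python lookup showing the full table (illustrative, not Graylog pipeline code; the alias names like "warning"/"error" are my assumption about what logs might send):

```python
# Syslog severity values per RFC 5424; "warning"/"error" aliases are
# an assumption about common log formats, not part of the standard names.
SYSLOG_LEVELS = {
    "emerg": 0, "alert": 1, "crit": 2, "err": 3, "error": 3,
    "warn": 4, "warning": 4, "notice": 5, "info": 6, "debug": 7,
}

def level_to_int(level):
    """Map a textual level to its syslog number; pass numbers through."""
    if isinstance(level, int):
        return level
    return SYSLOG_LEVELS[str(level).lower()]

print(level_to_int("INFO"))  # 6
print(level_to_int("warn"))  # 4
```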

Cheers!


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.