Gl2_processing_error


1. Describe your incident:
gl2_processing_error in incoming NGINX log message

2. Describe your environment:

  • OS Information:
    Ubuntu 18.04

  • Package Version:
    Graylog 4.2.5+59802bf

  • Service logs, configurations, and environment variables:

    log_format graylog2_json escape=json '{ "timestamp": "$time_iso8601", '
                         '"remote_addr": "$remote_addr", '
                         '"body_bytes_sent": $body_bytes_sent, '
                         '"request_time": $request_time, '
                         '"response_status": $status, '
                         '"request": "$request", '
                         '"request_method": "$request_method", '
                         '"host": "$host",'
                         '"upstream_cache_status": "$upstream_cache_status",'
                         '"upstream_addr": "$upstream_addr",'
                         '"http_x_forwarded_for": "$http_x_forwarded_for",'
                         '"http_referrer": "$http_referer", '
                         '"http_user_agent": "$http_user_agent" }';

3. What steps have you already taken to try and solve the problem?

None

4. How can the community help?

I am seeing a gl2_processing_error in incoming NGINX logs in graylog such as:

and I believe it is related to the following NGINX configuration, which I use to send its logs to Graylog:

log_format graylog2_json escape=json '{ "timestamp": "$time_iso8601", '
                     '"remote_addr": "$remote_addr", '
                     '"body_bytes_sent": $body_bytes_sent, '
                     '"request_time": $request_time, '
                     '"response_status": $status, '
                     '"request": "$request", '
                     '"request_method": "$request_method", '
                     '"host": "$host",'
                     '"upstream_cache_status": "$upstream_cache_status",'
                     '"upstream_addr": "$upstream_addr",'
                     '"http_x_forwarded_for": "$http_x_forwarded_for",'
                     '"http_referrer": "$http_referer", '
                     '"http_user_agent": "$http_user_agent" }';
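For context, a line rendered by this format is plain JSON (the sample values below are made up, not taken from my real logs), and it parses cleanly:

```python
import json

# A made-up sample line in the shape the log_format above produces
sample = ('{ "timestamp": "2022-01-12T17:41:20-05:00", '
          '"remote_addr": "203.0.113.7", '
          '"body_bytes_sent": 5331, '
          '"request_time": 0.253, '
          '"response_status": 200, '
          '"request": "GET / HTTP/2.0", '
          '"request_method": "GET", '
          '"host": "example.org",'
          '"upstream_cache_status": "MISS",'
          '"upstream_addr": "192.168.140.14:80",'
          '"http_x_forwarded_for": "-",'
          '"http_referrer": "-", '
          '"http_user_agent": "curl/7.68.0" }')

doc = json.loads(sample)   # raises ValueError if the line were not valid JSON
print(doc["timestamp"])    # -> 2022-01-12T17:41:20-05:00
```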

I can provide more information as needed but I believe the error is related to the Content Pack configuration above.


Hello,

Elasticsearch doesn’t like your timestamp field. This could be a couple of things: an input not compatible with the device sending the logs, an index template not configured correctly, etc…

Could you explain in greater detail how you’re ingesting logs into Graylog?

I had a feeling. I am using this content pack.

:open_mouth: Looks like it was made for version 3.0. There aren’t many more details on this plug-in.
Looking at GitHub, it’s using the syslog UDP input graylog2.inputs.syslog.udp.SyslogUDPInput. A lot of configuration changes have been made since Graylog version 3.

Was this running before or was this just installed?

EDIT: I also saw a ton of extractors on that input to create those fields. If you can find the TIMESTAMP field, maybe you can change it? Just an idea.

EDIT2: Still looking at this content pack. NGINX does not natively support sending its logs to a remote site; you write the NGINX log locally and use a collector. By chance, what collector are you using?

This has been running for quite some time, but I only noticed the error after a recent update to Graylog. If I find the timestamp field, what should I change it to?

I am having NGINX simply write its logs to the Ubuntu syslog location and extracting them from there using the content pack.

Hello,

I see, but the content pack doesn’t ship logs. NXLog, Winlogbeat, Filebeat, rsyslog, etc… are the software that ships logs. I assume NGINX is on the same server as your Graylog server? If so, then you may be using rsyslog, which is native to Linux.

Just so you know, I have worked with content packs before, but not the one you have. I’m having doubts whether it’s the content pack; it could be the way Graylog is ingesting logs, meaning what you’re actually using to ship logs locally or remotely.

EDIT: I get it now,

# replace the hostname with the IP or hostname of your Graylog2 server
access_log syslog:server=graylog.server.org:12304 graylog2_json;

Ok,
So I had to find out more about what’s going on with this plug-in. This is what I have so far.

Two Inputs.

The first one, nginx access log, has some extractors.

The second one, nginx error log, also has some extractors.

I’m not 100% sure, but I think the error is coming from the second input, from the extractor called Timestamp. I clicked the edit button next to it. If this is correct so far, maybe try adjusting the “convert to date” type as shown below.

EDIT: Or check these posts out,

Hope that helps

Thanks for all the work you put into this issue. I checked to see if my timestamp was correct in the second input and it is exactly as you have:

I looked into the other threads you linked but I wonder if at this point I am better off searching for something else since this content pack is so old.

That format string doesn’t match the timestamp you have coming in. Your timestamp:

2022-01-12T00:24:22+00:00

I don’t have a playground to try it but it seems to me it might be something similar to:

yyyy-MM-dd'T'HH:mm:ss+SS:SS

You can read more at: TLDR


I tried your suggestion but unfortunately got the same outcome. I don’t quite understand the error message, though. Is it saying that only this part, T17:41:20-05:00, is malformed?

gl2_processing_error
    Replaced invalid timestamp value in message <c2ecdf70-73f8-11ec-9ba3-000c29199921> with current time - Value <2022-01-12T17:41:20-05:00> caused exception: Invalid format: "2022-01-12T17:41:20-05:00" is malformed at "T17:41:20-05:00".
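For what it’s worth, the raw value itself looks like valid ISO 8601; for example, Python’s standard-library parser accepts it as-is, which makes me think it’s the pattern Graylog applies that is off:

```python
from datetime import datetime

value = "2022-01-12T17:41:20-05:00"

# Python 3.7+ parses ISO-8601 strings with a colon in the UTC offset
parsed = datetime.fromisoformat(value)
print(parsed.isoformat())  # -> 2022-01-12T17:41:20-05:00
```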

Hello,

Maybe I can break it down.
You have this in your Nginx config file.

log_format graylog2_json escape=json '{ "timestamp": "$time_iso8601", '

So NGINX applies that format when writing to the access_log target:

access_log syslog:server=graylog.server.org:12304 graylog2_json;

These are two different timestamp formats, shown below (from a browser console):

new Date().toISOString()
"2019-10-11T18:56:08.984Z"
new Date().toUTCString()
"Fri, 11 Oct 2019 18:56:08 GMT"

I think even if you remove your content pack you would still have the same issue.

You may need to convert the timestamp through a pipeline. This is pretty common for specific log types.
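A rough sketch of such a pipeline rule (untested; it assumes the field is literally called timestamp, and the Joda pattern below is a guess at matching your value):

```
rule "parse nginx iso8601 timestamp"
when
    has_field("timestamp")
then
    // Joda-Time "ZZ" matches a colon-separated offset such as -05:00
    let ts = parse_date(to_string($message.timestamp), "yyyy-MM-dd'T'HH:mm:ssZZ");
    set_field("timestamp", ts);
end
```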

Can you find the message c2ecdf70-73f8-11ec-9ba3-000c29199921?

You could use gl2_message_id: or _id: in your search bar to find the message…


I did find the message:

body_bytes_sent
    5331
facility
    local7
facility_num
    23
from_nginx
    true
gl2_processing_error
    Replaced invalid timestamp value in message <c2ecdf70-73f8-11ec-9ba3-000c29199921> with current time - Value <2022-01-12T17:41:20-05:00> caused exception: Invalid format: "2022-01-12T17:41:20-05:00" is malformed at "T17:41:20-05:00".
host
    REDACTED
http_user_agent
    Mozilla/5.0 (iPhone; CPU iPhone OS 15_1_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.1 Mobile/15E148 Safari/604.1
http_x_forwarded_for
    REDACTED
json
    -
level
    6
message
    GET / HTTP/2.0
nginx_access
    true
remote_addr
    REDACTED
remote_addr_city_name
    N/A
remote_addr_country_code
    US
remote_addr_geolocation
    REDACTED
request
    GET / HTTP/2.0
request_method
    GET
request_time
    0.253
response_status
    200
source
    u1804proxy
timestamp
    2022-01-12 17:41:20.104 -05:00
upstream_addr
    192.168.140.14:80

Even though I receive this error, the content pack still does what it is supposed to, so even if I don’t find a solution, is there any harm in using it like this?

Oh nice. For a minute I thought you didn’t get the message.

Not that I know of, but if you want it to go away, you’re probably going to need to use something to convert the timestamp so Elasticsearch will stop complaining about it. I would monitor it to make sure you’re not missing logs; perhaps random checks, etc… In the forum there are lots of examples of how to convert timestamp fields, and there are even tags on this issue.


Yea, luckily the messages still come through. I will do some research on converting the timestamp, but again, I think I may consider switching content packs soon since this one is so old.

Really appreciate your assistance with this issue. :+1:


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.