Grok extractor "Try" succeeds, but no fields in search

I've configured a single Grok extractor running against an input. When I load the sample message, both the condition regex and the "try against example" succeed. However, no fields are saved when new messages arrive, so searching on them returns nothing. The extractor's "Details" link shows thousands of hits, which leads me to believe it should be working. I'm very new to Graylog and haven't been able to dig up any reason for this behavior. Other test extractors that I set up using regex worked as expected (now removed to simplify things).

This is a stock Ubuntu 20.04 install running MongoDB 4.4, Elasticsearch 7.10.2, and Graylog Open 4.2.1-1.

Here’s a sample message:
hostname Eventreader: @@202,clientJoin,"apMac"="12:12:12:12:12:12","clientMac"="12:66:12:12:12:12","ssid"="userWLANname","bssid"="12:12:12:12:12:12","userId"="","wlanId"="100","iface"="wlan0","tenantUUID"="839f87c6-d116-497e-afce-438211rbd32c","apName"="AP209","apGps"="10.20,-135.9","userName"="","vlanId"="101","radio"="g/n","encryption"="WPA2-AES","Instantaneous rssi"="0","Xput"="0","fwVersion"="","model"="100","zoneUUID"="bf34718e-c3ff-4ec8-8d5e-75423ab81f4a","zoneName"="000000000001","timeZone"="MST+7","apLocation"="000000000001","apGps"="10.20,-135.9","apIpAddress"="","apIpv6Address"="","apGroupUUID"="f9c107c8-8ead-4899-bda7-b9145678890d","domainId"="666fa0e1-5786-43fe-9bea-b34567893ef1","serialNumber"="000002008341","domainName"="BOB - Desert","wlanGroupUUID"="11476630-30f7-11ec-86ae-34567890f052","idealEventVersion"="3.5.1","apDescription"="kitchen"

And the Grok pattern is as follows:
%{DATA:Host} %{DATA:Type}: @@%{INT:eventCode},%{DATA:eventType},"apMac"="%{MAC:apMAC}","clientMac"="%{MAC:clientmac}","ssid"="%{DATA:ssid}","bssid"="%{MAC:UNWANTED}","userId"="%{DATA}","wlanId"="%{INT:UNWANTED}","iface"="%{DATA:UNWANTED}","tenantUUID"="%{DATA:UNWANTED}","apName"="%{DATA:apname}","apGps"="%{DATA:UNWANTED}"

The pattern doesn't match every field (I'm assuming that's expected as long as it matches from the beginning; I'm new to Grok as well). "Named captures only" is selected, and the condition regex is simply clientJoin.

Everything succeeds in the extractor configuration, but nothing changes in the search. The search filter does seem to recognize the new keys (apMac, clientMac, etc.), but no results are stored.

I'm sure I must be missing something trivial at this point. I tried similar extractors with regex, which are working for some other syslog data, and got the same result. Is there an issue with the incoming message size, or something else?

Thanks in advance.


It seems the extractor is indeed fine. The logs in question do not have a timezone on them and are thus being adjusted by Graylog, which applies a GMT-7 offset, so the logs finally show up in Graylog seven hours later. From reading/researching, it sounds like I'll need to set up a separate stream/pipeline to solve this? Any help is appreciated.

Hello && Welcome
I might be able to help.
Correct me if I'm wrong: your extractors are working correctly, which means you have the correct fields shown while searching?

Now Graylog is setting your timestamp field to GMT-7 because there is no timezone in the messages?
What kind of logs are these may I ask?

I noticed the log messages show MST+7.

I need to ask a couple of questions first.
Is the server time/date correct?


On your Graylog server, is the date/time the same as on your devices sending logs?
If all is correct, then yes, I would suggest a pipeline to correct the issue with logs that have the wrong timestamp. Just an FYI, we now have tags for issues like this.

I'm quite sure if you did a search for timestamp issues there would be a lot :slight_smile:

hope that helps

The logs are from a Ruckus vSZ controller. I'm not sure why the timezone appears in the log entry; each wireless zone inside the vSZ controller can be set to a unique timezone, so it is likely specific to that. The entries coming from the controller definitely don't contain the timezone, but are definitely UTC.

Log entry: Nov 24 03:40:20
Server time is configured via NTP, and the timezone is set to local America/Denver.

You're correct: Graylog is likely setting the timezone to -7 because no timezone is present in the message, which makes it fall back to root_timezone in the config (America/Denver).
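For reference, this is the relevant setting in Graylog's server config (a sketch; the file path is the typical one for Ubuntu package installs and may differ on other setups):

```ini
# /etc/graylog/server/server.conf
# Graylog applies this zone to incoming timestamps that carry no offset.
root_timezone = America/Denver
```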

I’ll dig further into the pipeline/streams tomorrow. Thanks for the tag suggestion, I’m sure that will get me where I need to be.


Here is a starter if need be.

I was able to get this working as desired with the following pipeline rule:

rule "Add UTC timezone to vSZ Syslog Messages"
when
    // Applies to every message routed through the connected pipeline stage.
    true
then
    let cur_date = to_string($message.timestamp);
    // Keep everything up to the ".000" milliseconds, dropping the offset.
    let mid_date = regex_replace("(.*000).*", cur_date, "$1");
    let new_date = parse_date(
        value: to_string(mid_date),
        pattern: "yyyy-MM-dd'T'HH:mm:ss.SSS",
        timezone: "UTC");
    set_field("timestamp", new_date);
end
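For anyone curious about the regex_replace step, here is a small Python sketch of what it does. The sample timestamp string is an assumption, based on Graylog rendering the timestamp as ISO-8601 with the -07:00 offset that root_timezone applied:

```python
import re

# Assumed example of how Graylog renders $message.timestamp as a string:
# ISO-8601 with milliseconds and the -07:00 offset from root_timezone.
cur_date = "2021-11-24T03:40:20.000-07:00"

# Same pattern as in the rule: capture everything up to the ".000"
# milliseconds ("$1" in the rule is "\1" here) and discard the offset.
mid_date = re.sub(r"(.*000).*", r"\1", cur_date)

print(mid_date)  # 2021-11-24T03:40:20.000
```

The trimmed string then parses cleanly with the "yyyy-MM-dd'T'HH:mm:ss.SSS" pattern, and parse_date reinterprets it as UTC.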

Thanks again for the help!

Awesome, and thanks for posting your solution here. I know it will help others.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.