Meraki Events - not extracting using GROK

Hi all,
I’m having issues extracting fields from my Meraki Events log and hope someone can help. Flows works fine.
Input is set up as Raw/Plaintext UDP.

I’d ideally like to use grok for this.

Here are a couple of example lines:

<134>1 1516115870.190511599 MyMeraki events type=vpn_connectivity_change vpn_type='site-to-site' peer_contact='1.2.3.4:57267' peer_ident='d1eebb67c390f3f110853b70533e975f' connectivity='true'

<134>1 1516115970.357522446 MyMeraki events client_vpn_disconnect user id 'mort' local ip 192.168.1.14 connected from 4.3.2.1

So, I’d like to extract different fields depending on which type of line it is.
From the first line: the peer contact and the connectivity status.
From the second line: the username, connect/disconnect, the local IP and the connected-from IP.
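Roughly, the patterns I have in mind look like this (field names are just my own placeholders, and so far these only match in the grok debugger, not in Graylog):

```
# site-to-site VPN connectivity line
type=vpn_connectivity_change vpn_type='%{DATA:vpn_type}' peer_contact='%{IPORHOST:peer}:%{POSINT:port}' peer_ident='%{WORD:peer_ident}' connectivity='%{WORD:connectivity}'

# client VPN connect/disconnect line
client_vpn_%{WORD:vpn_action} user id '%{USERNAME:vpn_user}' local ip %{IPV4:local_ip} connected from %{IPV4:remote_ip}
```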

I can write the correct grok patterns and match them in the grok debugger, but I always get errors in my indexer log and an unprocessed message.

{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"illegal_argument_exception","reason":"Can't parse [index] value [not_analyzed] for field [Peer], expected [true] or [false]"}}

I’ve tried using %{WORD:UNWANTED} on all fields except one, and separately tried %{GREEDYDATA:something} just to test grabbing almost the whole line and putting it in one field, but with no joy.

What am I doing wrong?

Suggestions welcome!

Mort

What grok patterns are you using exactly?

peer_contact='%{IPV4:peer}:%{INT:port}'

is what I’m using now for testing, and I get the following error:

	{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"illegal_argument_exception","reason":"Can't parse [index] value [not_analyzed] for field [port], expected [true] or [false]"}}

Thanks!

Instead of

peer_contact='%{IPV4:peer}:%{INT:port}'

try using:

peer_contact='%{IPORHOST:peer}:%{POSINT:port}'

Hope this helps :thinking:

Create a custom index mapping for your custom fields (i.e. “peer” and “port”) and rotate the indices afterwards (System / Indices / Index Set / Maintenance in the Graylog web interface).
http://docs.graylog.org/en/2.4/pages/configuration/elasticsearch.html#custom-index-mappings
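For example, a mapping template along these lines (adjust the template name and index prefix to your setup; this sketch assumes the default graylog_* index prefix and Elasticsearch 5.x, and the “keyword”/“long” types are just reasonable choices for those two fields):

```
{
  "template": "graylog_*",
  "mappings": {
    "message": {
      "properties": {
        "peer": { "type": "keyword" },
        "port": { "type": "long" }
      }
    }
  }
}
```

Load it into Elasticsearch with something like:

```
curl -X PUT -H 'Content-Type: application/json' \
  -d @graylog-custom-mapping.json \
  'http://localhost:9200/_template/graylog-custom-mapping?pretty'
```

The template only applies to newly created indices, which is why you need to rotate the active write index afterwards.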

Thanks,

OK, so before doing that I removed an old index and also used “Rotate active write index”, just to see the impact.

That appears to have fixed it, without putting in the custom mappings.

So I can learn: why would this happen? And why did it work fine with the Meraki Flows messages but not the Events messages?
Is it because the messages can have a different syntax (as per the snippets in my first post)?

Thank you so much for your help, and your speedy replies!

Marked your reply as “Answered” as it got me to the end solution.

Mort

Elasticsearch tries to guess the type of a message field if you don’t provide a custom mapping, based on the first value it sees for that field in each new index.
Sometimes the type is guessed incorrectly, and sometimes the values of a field change type between messages.
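You can check which types Elasticsearch guessed for your fields in the current index, e.g. (assuming Elasticsearch runs on localhost and you use the default index prefix):

```
curl -X GET 'http://localhost:9200/graylog_deflector/_mapping?pretty'
```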

See the Elasticsearch documentation on Dynamic Mapping (Elasticsearch Reference [5.6] and Elasticsearch: The Definitive Guide [2.x]) for details.

In any case, I’d recommend creating a custom index mapping for your messages to prevent future errors like the one you’ve described.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.