IPFIX from Sophos UTM

Trying out the new IPFIX input in 3.2 by sending data from a Sophos UTM. Seeing this in server.log:

2020-02-05T21:06:05.013-05:00 ERROR [DecodingProcessor] Unable to decode raw message RawMessage{id=3b402050-4885-11ea-808a-000c29404c16, journalOffset=393119342, codec=ipfix, payloadSize=745, timestamp=2020-02-06T02:06:05.013Z, remoteAddress=/192.168.0.1:44408} on input <5e38c5e829ccde06888b7552>.
2020-02-05T21:06:05.014-05:00 ERROR [DecodingProcessor] Error processing message RawMessage{id=3b402050-4885-11ea-808a-000c29404c16, journalOffset=393119342, codec=ipfix, payloadSize=745, timestamp=2020-02-06T02:06:05.013Z, remoteAddress=/192.168.0.1:44408}
org.graylog.integrations.ipfix.IpfixException: Missing information element definitions for private enterprise number 21373
at org.graylog.integrations.ipfix.InformationElementDefinitions.getDefinition(InformationElementDefinitions.java:86) ~[?:?]
at org.graylog.integrations.ipfix.IpfixParser.parseDataSet(IpfixParser.java:337) ~[?:?]
at org.graylog.integrations.ipfix.codecs.IpfixCodec.lambda$decodeMessages$3(IpfixCodec.java:206) ~[?:?]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[?:1.8.0_242]
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382) ~[?:1.8.0_242]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_242]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_242]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_242]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_242]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_242]
at org.graylog.integrations.ipfix.codecs.IpfixCodec.decodeMessages(IpfixCodec.java:212) ~[?:?]
at org.graylog2.shared.buffers.processors.DecodingProcessor.processMessage(DecodingProcessor.java:148) ~[graylog.jar:?]
at org.graylog2.shared.buffers.processors.DecodingProcessor.onEvent(DecodingProcessor.java:91) [graylog.jar:?]
at org.graylog2.shared.buffers.processors.ProcessBufferProcessor.onEvent(ProcessBufferProcessor.java:90) [graylog.jar:?]
at org.graylog2.shared.buffers.processors.ProcessBufferProcessor.onEvent(ProcessBufferProcessor.java:47) [graylog.jar:?]
at com.lmax.disruptor.WorkProcessor.run(WorkProcessor.java:143) [graylog.jar:?]
at com.codahale.metrics.InstrumentedThreadFactory$InstrumentedRunnable.run(InstrumentedThreadFactory.java:66) [graylog.jar:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_242]

Hey @SteveU,

could you provide a PCAP of your messages?

Missing information element definitions for private enterprise number 21373

I guess something in that kind of IPFIX data is different, and we need to see whether this is a Sophos-specific issue or an error in the input.

Thanks!

I’m getting similar errors with a Stormshield UTM:

2020-02-05 14:01:23,585 ERROR: org.graylog2.shared.buffers.processors.DecodingProcessor - Error processing message RawMessage{id=fe4bada4-481f-11ea-93fc-0242ac120007, journalOffset=3668958933, codec=ipfix, payloadSize=550, timestamp=2020-02-05T14:01:23.578Z, remoteAddress=/192.168.0.254:12487}
java.lang.IndexOutOfBoundsException: readerIndex(182) + length(8) exceeds writerIndex(184): UnpooledHeapByteBuf(ridx: 182, widx: 184, cap: 184/184)
        at io.netty.buffer.AbstractByteBuf.checkReadableBytes0(AbstractByteBuf.java:1477) ~[graylog.jar:?]
        at io.netty.buffer.AbstractByteBuf.checkReadableBytes(AbstractByteBuf.java:1463) ~[graylog.jar:?]
        at io.netty.buffer.AbstractByteBuf.readBytes(AbstractByteBuf.java:896) ~[graylog.jar:?]
        at org.graylog.integrations.ipfix.IpfixParser.parseDataSet(IpfixParser.java:364) ~[?:?]
        at org.graylog.integrations.ipfix.codecs.IpfixCodec.lambda$decodeMessages$3(IpfixCodec.java:206) ~[?:?]
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[?:1.8.0_242]
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382) ~[?:1.8.0_242]
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_242]
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_242]
        at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_242]
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_242]
        at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_242]
        at org.graylog.integrations.ipfix.codecs.IpfixCodec.decodeMessages(IpfixCodec.java:212) ~[?:?]
        at org.graylog2.shared.buffers.processors.DecodingProcessor.processMessage(DecodingProcessor.java:148) ~[graylog.jar:?]
        at org.graylog2.shared.buffers.processors.DecodingProcessor.onEvent(DecodingProcessor.java:91) [graylog.jar:?]
        at org.graylog2.shared.buffers.processors.ProcessBufferProcessor.onEvent(ProcessBufferProcessor.java:90) [graylog.jar:?]
        at org.graylog2.shared.buffers.processors.ProcessBufferProcessor.onEvent(ProcessBufferProcessor.java:47) [graylog.jar:?]
        at com.lmax.disruptor.WorkProcessor.run(WorkProcessor.java:143) [graylog.jar:?]
        at com.codahale.metrics.InstrumentedThreadFactory$InstrumentedRunnable.run(InstrumentedThreadFactory.java:66) [graylog.jar:?]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_242]

@maniel

would you mind capturing a pcap and opening an issue?

@SteveU, could you add your pcap to that issue as well, please?

Thanks


Just out of curiosity, wouldn’t it include some sensitive information? It seems our UTM includes user names in IPFIX, and in our case it’s surname_name, so could I just send you a pcap without posting it publicly on the issue tracker?

I can send you a company/confidential link where you can upload the pcap, if you like. That way our developers can take a look and you can be sure no confidential data is exposed to the public.

Sure, please send the link ;-)

Opened the issue: https://github.com/Graylog2/graylog-plugin-integrations/issues/395

I’ve got a pcap, give me someplace safe to put it.

Hello @SteveU! Thanks for the feedback. Looking at the log errors, it seems you are collecting additional element definitions specific to private enterprise number 21373 (the netfilter/iptables project), and those additional field definitions need to be provided to Graylog via a JSON file in order for the logs to be parsed properly.

We have just recently updated our documentation and provided a template for how to do this. Please take a look at that and let us know if that works. https://docs.graylog.org/en/3.2/pages/integrations/inputs/ipfix_input.html

Thanks!


just sent you a pcap

Hello @maniel, I looked at the pcap you provided, and it looks like you are collecting 6 additional elements that are specific to private enterprise number 11256 (Stormshield). These element definitions need to be defined in a JSON file and provided when adding the input.

An example of how this file should be formatted can be found in the documentation: https://docs.graylog.org/en/3.2/pages/integrations/inputs/ipfix_input.html

Please create this file and provide it to the input in the IPFIX field definition option.
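If the Stormshield element definitions aren’t publicly documented, the file still only needs the same shape as the template in the docs. A skeleton to start from (the element id, name, and data type below are placeholders, not actual Stormshield definitions — fill them in from what your own templates carry):

```json
{
        "enterprise_number": 11256,
        "information_elements": [
                {
                        "element_id": 1,
                        "name": "stormshieldPlaceholderField",
                        "data_type": "unsigned32"
                }
        ]
}
```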

Thanks!


Created the files below based on what I’m seeing in the pcap, using this (https://github.com/elastic/beats/blob/cd4907841d0f5e740cbb08eacb759dda3469911d/x-pack/filebeat/input/netflow/decoder/fields/zfields_assorted.go) from the Elastic Beats repo for reference:

pen21373.json:

{
        "enterprise_number": 21373,
        "information_elements": [
                {
                        "element_id": 4,
                        "name": "mark",
                        "data_type": "unsigned32"
                },
                {
                        "element_id": 6,
                        "name": "conntrack_id",
                        "data_type": "unsigned32"
                }
        ]
}

pen9789.json:

{
        "enterprise_number": 9789,
        "information_elements": [
                {
                        "element_id": 1,
                        "name": "afcProtocol",
                        "data_type": "unsigned16"
                },
                {
                        "element_id": 2,
                        "name": "afcProtocolName",
                        "data_type": "string"
                },
                {
                        "element_id": 4,
                        "name": "flowDirection",
                        "data_type": "unsigned8"
                }
        ]
}
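Side note: these files must be strictly valid JSON (a single missing comma between objects will break parsing), so it’s worth sanity-checking them before adding them to the input. A minimal sketch in Python — the expected keys are taken from the files above, and `SAMPLE` is just an inline copy of pen21373.json for illustration:

```python
import json

SAMPLE = """
{
  "enterprise_number": 21373,
  "information_elements": [
    {"element_id": 4, "name": "mark", "data_type": "unsigned32"},
    {"element_id": 6, "name": "conntrack_id", "data_type": "unsigned32"}
  ]
}
"""

def check_definition(text):
    """Parse a definition file and verify the keys the IPFIX input expects."""
    doc = json.loads(text)  # raises JSONDecodeError on a syntax slip
    assert isinstance(doc["enterprise_number"], int)
    for ie in doc["information_elements"]:
        assert {"element_id", "name", "data_type"} <= ie.keys()
    return doc

doc = check_definition(SAMPLE)
print(doc["enterprise_number"])  # 21373
```

Running this against each file (e.g. `check_definition(open("pen9789.json").read())`) points at the exact line of any syntax error.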

Added both of those to the input (figured out the text box could take multiple entries):


And after several minutes, I started seeing messages. However, some appear to be getting dropped due to this error in server.log:

2020-02-07T18:03:11.648-05:00 ERROR [DecodingProcessor] Unable to decode raw message RawMessage{id=0370daf0-49fe-11ea-808a-000c29404c16, journalOffset=394098681, codec=ipfix, payloadSize=807, timestamp=2020-02-07T23:03:11.647Z, remoteAddress=/192.168.0.1:37828} on input <5e38c5e829ccde06888b7552>.
2020-02-07T18:03:11.649-05:00 ERROR [DecodingProcessor] Error processing message RawMessage{id=0370daf0-49fe-11ea-808a-000c29404c16, journalOffset=394098681, codec=ipfix, payloadSize=807, timestamp=2020-02-07T23:03:11.647Z, remoteAddress=/192.168.0.1:37828}
java.lang.IndexOutOfBoundsException: readerIndex(126) + length(4) exceeds writerIndex(128): UnpooledHeapByteBuf(ridx: 126, widx: 128, cap: 128/128)
        at io.netty.buffer.AbstractByteBuf.checkReadableBytes0(AbstractByteBuf.java:1477) ~[graylog.jar:?]
        at io.netty.buffer.AbstractByteBuf.checkReadableBytes(AbstractByteBuf.java:1463) ~[graylog.jar:?]
        at io.netty.buffer.AbstractByteBuf.readBytes(AbstractByteBuf.java:896) ~[graylog.jar:?]
        at io.netty.buffer.AbstractByteBuf.readBytes(AbstractByteBuf.java:904) ~[graylog.jar:?]
        at org.graylog.integrations.ipfix.IpfixParser.parseDataSet(IpfixParser.java:430) ~[?:?]
        at org.graylog.integrations.ipfix.codecs.IpfixCodec.lambda$decodeMessages$3(IpfixCodec.java:206) ~[?:?]
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[?:1.8.0_242]
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382) ~[?:1.8.0_242]
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_242]
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_242]
        at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_242]
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_242]
        at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_242]
        at org.graylog.integrations.ipfix.codecs.IpfixCodec.decodeMessages(IpfixCodec.java:212) ~[?:?]
        at org.graylog2.shared.buffers.processors.DecodingProcessor.processMessage(DecodingProcessor.java:148) ~[graylog.jar:?]
        at org.graylog2.shared.buffers.processors.DecodingProcessor.onEvent(DecodingProcessor.java:91) [graylog.jar:?]
        at org.graylog2.shared.buffers.processors.ProcessBufferProcessor.onEvent(ProcessBufferProcessor.java:90) [graylog.jar:?]
        at org.graylog2.shared.buffers.processors.ProcessBufferProcessor.onEvent(ProcessBufferProcessor.java:47) [graylog.jar:?]
        at com.lmax.disruptor.WorkProcessor.run(WorkProcessor.java:143) [graylog.jar:?]
        at com.codahale.metrics.InstrumentedThreadFactory$InstrumentedRunnable.run(InstrumentedThreadFactory.java:66) [graylog.jar:?]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_242]

Thanks for all the help. -Steve

I’m not as lucky as @SteveU and can’t find field definitions for enterprise number 11256 to create the file from. Can I somehow create it just from reading a pcap in Wireshark?

That’s where I started, but I had no idea what the data was until I found that link. If you look for the Data-Template packets, you should find the element IDs, and from the data in the packets you can determine each data_type. Beware that there may be multiple different Data-Templates with different items defined, so make sure to run your capture for at least 10 minutes to catch them all.
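For reference, the template records Wireshark shows you are simple enough to decode by hand. A minimal sketch of the IPFIX Template Set wire format from RFC 7011 (the bytes at the bottom are synthetic, built just for illustration — not from a real UTM):

```python
import struct

def parse_template_set(buf):
    """Parse one IPFIX Template Set (set id 2, RFC 7011) and return
    (template_id, [(pen, element_id, field_length), ...]).
    pen is 0 for standard IANA information elements."""
    set_id, _set_len = struct.unpack_from("!HH", buf, 0)
    assert set_id == 2, "not a template set"
    tmpl_id, field_count = struct.unpack_from("!HH", buf, 4)
    fields, off = [], 8
    for _ in range(field_count):
        ie_id, length = struct.unpack_from("!HH", buf, off)
        off += 4
        pen = 0
        if ie_id & 0x8000:  # enterprise bit: a 4-byte PEN follows
            ie_id &= 0x7FFF
            (pen,) = struct.unpack_from("!I", buf, off)
            off += 4
        fields.append((pen, ie_id, length))
    return tmpl_id, fields

# Synthetic template 256: IANA IE 8 (sourceIPv4Address, 4 bytes)
# plus enterprise IE 4 of PEN 21373 with a 4-byte value.
raw = struct.pack("!HHHH", 2, 20, 256, 2)
raw += struct.pack("!HH", 8, 4)
raw += struct.pack("!HHI", 4 | 0x8000, 4, 21373)

print(parse_template_set(raw))
# (256, [(0, 8, 4), (21373, 4, 4)])
```

The field lengths you recover this way at least constrain which data_type can be correct (e.g. a 4-byte field can be unsigned32 but not unsigned64), even when the names have to be guessed.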

Yeah, I looked in the pcap; the only useful thing I can see is the size of each field. I could just try different matching data types and random field names, but frankly I don’t need IPFIX badly enough to justify the time spent — the syslog output from our UTM is robust enough.

@maniel - I’m glad you were able to weigh the pros and cons of whether you need an IPFIX input vs. syslog.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.