Extractor kills outgoing traffic

Hello Guys,
A few months ago I created extractors for my Sophos firewall, and they worked properly.
Now a new firewall is sending to my Graylog. I created new extractors, but they killed my outgoing traffic. Only src_ip=([^\s]*) and log_component="([^"]*)" work. For example, when I add dst_ip=([^\s]*) it doesn't work :frowning:

On:
Debian 10
Graylog 5.1.2

Extractor configuration

Regular expression: dst_ip=([^\s]*) (Try → correct IP)
Condition: Only attempt extraction if field contains string
Field contains string: device="SFW" (Try → Matches! Extractor would run against this example.)
Store as field: fW_dst_ip
Extraction strategy: copy
Extractor title: Sophos Destination IP
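As a quick sanity check outside Graylog, the same pattern can be tried with grep against a sample line (the log line below is invented for illustration):

```shell
# Invented Sophos-style sample line
line='device="SFW" log_type="Firewall" src_ip=10.0.0.5 dst_ip=192.0.2.10 protocol="UDP"'

# Same idea as the extractor regex dst_ip=([^\s]*):
# grab everything after dst_ip= up to the next space
echo "$line" | grep -oE 'dst_ip=[^ ]*' | cut -d= -f2
# prints: 192.0.2.10
```

If this works on the shell but the extractor still stalls message processing, the regex itself is probably not the bottleneck.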

I tried to give graylog and elastic more cores and RAM.
Use cut instead of copy.
Rename everything.
Use β€œAlways try to extract” and β€œOnly attempt extraction if field matches regular expression”

Hope some one had a similar problem and can help me
Greetings

Hey @Marvin1

I have something similar.

dstip=+?((?:\d+.){3}\d+).+
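A rough way to try that kind of pattern on the command line (rewritten in ERE syntax with \d as [0-9], and with the dots escaped, which the original pattern omits; the sample line is made up):

```shell
line='dstip=203.0.113.7 proto="TCP"'  # invented sample line

# Match a dotted-quad IP after dstip=
echo "$line" | grep -oE 'dstip=([0-9]+\.){3}[0-9]+' | cut -d= -f2
# prints: 203.0.113.7
```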

I would check your resources when enabling that extractor by using top or htop,
along with tailing your Graylog log file, something like this:

tail -f /var/log/graylog-server/server.log

Thanks for your response:
htop (before and after screenshots):

Logs:

Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryLimitedCompressingFifoRingBuffer.add(MemoryLimitedCompressingFifoRingBuffer.java:76)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryAppender.append(MemoryAppender.java:83)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:161)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:134)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:125)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:89)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:542)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:500)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:483)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:417)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AwaitCompletionReliabilityStrategy.log(AwaitCompletionReliabilityStrategy.java:82)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.Logger.log(Logger.java:161)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2205)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2159)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2142)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2040)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1907)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.slf4j.Log4jLogger.error(Log4jLogger.java:309)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.logFailures(MessagesAdapterES7.java:152)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.bulkIndexChunked(MessagesAdapterES7.java:136)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.ChunkedBulkIndexer.index(ChunkedBulkIndexer.java:43)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.bulkIndex(MessagesAdapterES7.java:108)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.lambda$runBulkRequest$6(Messages.java:225)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at com.github.rholder.retry.AttemptTimeLimiters$NoAttemptTimeLimit.call(AttemptTimeLimiters.java:78)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at com.github.rholder.retry.Retryer.call(Retryer.java:160)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.runBulkRequest(Messages.java:225)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndexRequests(Messages.java:148)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:140)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:120)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.outputs.ElasticSearchOutput.writeMessageEntries(ElasticSearchOutput.java:103)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.flush(BlockingBatchedESOutput.java:129)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.writeMessageEntry(BlockingBatchedESOutput.java:110)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.write(BlockingBatchedESOutput.java:92)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.buffers.processors.OutputBufferProcessor$1.run(OutputBufferProcessor.java:189)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:180)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.lang.Thread.run(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: 2023-06-20 09:01:14,041 outputbuffer-processor-executor-1 ERROR An exception occurred processing Appender graylog-internal-logs java.lang.NullPointerException: Cannot invoke "com.github.luben.zstd.ZstdOutputStream.close()" because "this.compressedStream" is null
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryLimitedCompressingFifoRingBuffer.flush(MemoryLimitedCompressingFifoRingBuffer.java:95)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryLimitedCompressingFifoRingBuffer.add(MemoryLimitedCompressingFifoRingBuffer.java:76)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryAppender.append(MemoryAppender.java:83)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:161)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:134)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:125)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:89)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:542)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:500)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:483)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:417)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AwaitCompletionReliabilityStrategy.log(AwaitCompletionReliabilityStrategy.java:82)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.Logger.log(Logger.java:161)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2205)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2159)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2142)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2040)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1907)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.slf4j.Log4jLogger.error(Log4jLogger.java:309)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.logFailures(MessagesAdapterES7.java:152)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.bulkIndexChunked(MessagesAdapterES7.java:136)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.ChunkedBulkIndexer.index(ChunkedBulkIndexer.java:43)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.bulkIndex(MessagesAdapterES7.java:108)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.lambda$runBulkRequest$6(Messages.java:225)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at com.github.rholder.retry.AttemptTimeLimiters$NoAttemptTimeLimit.call(AttemptTimeLimiters.java:78)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at com.github.rholder.retry.Retryer.call(Retryer.java:160)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.runBulkRequest(Messages.java:225)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndexRequests(Messages.java:148)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:140)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:120)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.outputs.ElasticSearchOutput.writeMessageEntries(ElasticSearchOutput.java:103)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.flush(BlockingBatchedESOutput.java:129)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.writeMessageEntry(BlockingBatchedESOutput.java:110)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.write(BlockingBatchedESOutput.java:92)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.buffers.processors.OutputBufferProcessor$1.run(OutputBufferProcessor.java:189)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:180)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.lang.Thread.run(Unknown Source)

I hope this gives more information.

Could you please post one of the log lines you want to parse? I'd suggest using a Grok pattern. Those run only once, and possibly only on the right messages. Running a regex on every message is quite expensive on the CPU.
Do you have error messages for unindexed logs?
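To give an idea of the shape, a Grok pattern for a key=value log like this one could look roughly as follows (a sketch only, not a finished pattern; fw_dst_ip is an example field name, and %{IP} is a standard base Grok pattern):

```
dst_ip=%{IP:fw_dst_ip}
```

A full pattern would chain one such fragment per field of interest, so only the matching part of the line is ever inspected.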


Yes, there are error logs, but this is a different story:
[466]: index [graylog_157], type [_doc], id [13915622-0f56-11ee-a4ab-00155d0fc30c], message [ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=Limit of total fields [1000] has been exceeded]]]

device="SFW" date= time=12:28:47 timezone="" device_name="" device_id= log_id=log_type="Firewall" log_component="" log_subtype="" status="Allow" priority=Information duration=0 fw_rule_id= fw_rule_name="" fw_rule_section="Local rule" nat_rule_id=0 nat_rule_name="" policy_type=1 sdwan_profile_id_request=0 sdwan_profile_name_request="" sdwan_profile_id_reply=0 sdwan_profile_name_reply="" gw_id_request=0 gw_name_request="" gw_id_reply=0 gw_name_reply="" sdwan_route_id_request=0 sdwan_route_name_request="" sdwan_route_id_reply=0 sdwan_route_name_reply="" user_name="" user_gp="" iap= ips_policy_id= appfilter_policy_id= application="" application_risk= application_technology="" application_category="" vlan_id="" ether_type=Unknown (0x0000) bridge_name="" bridge_display_name="" in_interface="" in_display_interface="" out_interface="" out_display_interface="" src_mac= dst_mac= src_ip= src_country_code= dst_ip= dst_country_code=1 protocol="UDP" src_port= dst_port= sent_pkts=0 recv_pkts=0 sent_bytes=0 recv_bytes=0 tran_src_ip= tran_src_port=0 tran_dst_ip= tran_dst_port=0 srczonetype="LAN" srczone="" dstzonetype="LAN" dstzone="" dir_disp="" connevent="" connid="" vconnid="" hb_health="" message="" appresolvedby="" app_is_cloud=0 log_occurrence= flags=

Sorry, but I had to cut some information out.
Thanks

How many different streams do you store on your stream's index? There are more than 1000 field names in use. Usually you can fix that either by splitting streams onto different index sets, or by fixing some field names which are redundant/obsolete.

Regarding your example log: could you provide it with bogus data? Please keep the type of each field, so an IP could be 10.0.0.0 in general, a display_name could be "John Doe", and so on. I'll try to provide you with a Grok pattern.


Hey @Marvin1

Adding on to @ihe

I noticed you are using Graylog 5.1, and in your screenshot it looks like you are using Elasticsearch. What version of ES are you using?


Hey @gsmith, I use elasticsearch-oss version 7.10.2 (amd64).

Hey @ihe, I have 4 streams, but only 2 are running: the default stream and one I built for one of our hospitals. That stream has 29 rules (each source must match exactly, e.g. DC00.example.intern) covering all sources used for this infrastructure.

You are using the default index set for all your logs.
My tips:

  1. Get yourself familiar with the index model. Read this page from the docs to the end: Index model
  2. Create separate streams for different topics, with separate index sets for your data: Windows logs into a Windows stream, firewall logs into a firewall stream, and so on.
  3. Even if this still does not help, it will only affect a small part of your logs - not all streams/index sets will be flooded with too many fields.

Thank you very much for your help. It's a valid solution for one of my problems. <3

The best solution is indeed what @ihe said, but you can also increase the 1000-field limit with an Elasticsearch template. For example, you can set it to 1500 or 2000, but do not go higher or you will have performance issues.
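A sketch of what that could look like against ES 7.10 (the template name, index pattern, host, and limit value here are all examples; adjust them to your setup, and note the setting only applies to indices created after the template exists):

```shell
# Example only: legacy index template raising the field limit to 1500
# for future graylog_* indices
curl -X PUT "http://localhost:9200/_template/graylog-custom" \
  -H 'Content-Type: application/json' \
  -d '{
        "index_patterns": ["graylog_*"],
        "order": 1,
        "settings": { "index.mapping.total_fields.limit": 1500 }
      }'
```

Since Graylog manages its own index templates, putting the extra setting in a separate template with a higher order is the usual way to layer it on top without fighting Graylog's template.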


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.