Hello guys,
a few months ago I created extractors for my Sophos firewall, and they worked properly.
Now a new firewall is sending to my Graylog. I created new extractors, but they killed my outgoing traffic. Only src_ip=([^\s]*) and log_component="([^"]*)" work. For example, when I add dst_ip=([^\s]*) it doesn't work.
On:
Debian 10
Graylog 5.1.2
Extractor configuration
Regular expression: dst_ip=([^\s]*) (try → correct IP)
Condition: Only attempt extraction if field contains string
Field contains string: device="SFW" (try → Matches! Extractor would run against this example.)
Store as field: fW_dst_ip
Extraction strategy: copy
Extractor title: Sophos Destination IP
Things I have already tried:
Giving Graylog and Elasticsearch more cores and RAM.
Using cut instead of copy.
Renaming everything.
Using "Always try to extract" and "Only attempt extraction if field matches regular expression".
Hope someone has had a similar problem and can help me.
Greetings
gsmith
(GSmith)
June 20, 2023, 1:35am
2
Hey @Marvin1
I have something similar:
dstip=+?((?:\d+.){3}\d+).+
I would check your resources when enabling that extractor by using top or htop,
along with tailing your Graylog log file, something like this:
tail -f /var/log/graylog-server/server.log
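That pattern can also be exercised on the command line (a sketch with a made-up line; note that the unescaped dots in the pattern above match any character, so escaping them as `\.` is safer):

```shell
# Made-up sample line (assumption: the firewall uses a 'dstip=' key as in the pattern above)
line='srcip=192.168.1.10 dstip=10.0.0.5 proto=udp'

# \K keeps only the IP; dots escaped so they match literal dots only
echo "$line" | grep -oP 'dstip=\K(?:\d+\.){3}\d+'
```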
1 Like
Thanks for your response:
htop screenshots (before and after)
logs:
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryLimitedCompressingFifoRingBuffer.add(MemoryLimitedCompressingFifoRingBuffer.java:76)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryAppender.append(MemoryAppender.java:83)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:161)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:134)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:125)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:89)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:542)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:500)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:483)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:417)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AwaitCompletionReliabilityStrategy.log(AwaitCompletionReliabilityStrategy.java:82)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.Logger.log(Logger.java:161)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2205)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2159)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2142)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2040)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1907)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.apache.logging.slf4j.Log4jLogger.error(Log4jLogger.java:309)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.logFailures(MessagesAdapterES7.java:152)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.bulkIndexChunked(MessagesAdapterES7.java:136)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.ChunkedBulkIndexer.index(ChunkedBulkIndexer.java:43)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.bulkIndex(MessagesAdapterES7.java:108)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.lambda$runBulkRequest$6(Messages.java:225)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at com.github.rholder.retry.AttemptTimeLimiters$NoAttemptTimeLimit.call(AttemptTimeLimiters.java:78)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at com.github.rholder.retry.Retryer.call(Retryer.java:160)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.runBulkRequest(Messages.java:225)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndexRequests(Messages.java:148)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:140)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:120)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.outputs.ElasticSearchOutput.writeMessageEntries(ElasticSearchOutput.java:103)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.flush(BlockingBatchedESOutput.java:129)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.writeMessageEntry(BlockingBatchedESOutput.java:110)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.write(BlockingBatchedESOutput.java:92)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at org.graylog2.buffers.processors.OutputBufferProcessor$1.run(OutputBufferProcessor.java:189)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:180)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
Jun 20 08:56:00 mon1 graylog-server[60972]: #011at java.base/java.lang.Thread.run(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: 2023-06-20 09:01:14,041 outputbuffer-processor-executor-1 ERROR An exception occurred processing Appender graylog-internal-logs java.lang.NullPointerException: Cannot invoke "com.github.luben.zstd.ZstdOutputStream.close()" because "this.compressedStream" is null
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryLimitedCompressingFifoRingBuffer.flush(MemoryLimitedCompressingFifoRingBuffer.java:95)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryLimitedCompressingFifoRingBuffer.add(MemoryLimitedCompressingFifoRingBuffer.java:76)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.log4j.MemoryAppender.append(MemoryAppender.java:83)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:161)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:134)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:125)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:89)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:542)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:500)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:483)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:417)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.config.AwaitCompletionReliabilityStrategy.log(AwaitCompletionReliabilityStrategy.java:82)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.core.Logger.log(Logger.java:161)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2205)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2159)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2142)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2040)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1907)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.apache.logging.slf4j.Log4jLogger.error(Log4jLogger.java:309)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.logFailures(MessagesAdapterES7.java:152)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.bulkIndexChunked(MessagesAdapterES7.java:136)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.ChunkedBulkIndexer.index(ChunkedBulkIndexer.java:43)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog.storage.elasticsearch7.MessagesAdapterES7.bulkIndex(MessagesAdapterES7.java:108)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.lambda$runBulkRequest$6(Messages.java:225)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at com.github.rholder.retry.AttemptTimeLimiters$NoAttemptTimeLimit.call(AttemptTimeLimiters.java:78)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at com.github.rholder.retry.Retryer.call(Retryer.java:160)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.runBulkRequest(Messages.java:225)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndexRequests(Messages.java:148)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:140)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.indexer.messages.Messages.bulkIndex(Messages.java:120)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.outputs.ElasticSearchOutput.writeMessageEntries(ElasticSearchOutput.java:103)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.flush(BlockingBatchedESOutput.java:129)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.writeMessageEntry(BlockingBatchedESOutput.java:110)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.outputs.BlockingBatchedESOutput.write(BlockingBatchedESOutput.java:92)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at org.graylog2.buffers.processors.OutputBufferProcessor$1.run(OutputBufferProcessor.java:189)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:180)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
Jun 20 09:01:14 mon1 graylog-server[60972]: #011at java.base/java.lang.Thread.run(Unknown Source)
Hope this gives more information.
ihe
June 20, 2023, 7:38am
4
Could you please post one of the lines of log you want to parse? I'd suggest using a Grok pattern. Those run only once, and possibly only on the right messages. Running a regex on every message is quite expensive on the CPU.
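For the key=value format Sophos emits, a Grok pattern can target just the field of interest (a sketch; `fW_dst_ip` reuses the field name from the extractor above, and `%{IPV4}` is a stock Grok pattern):

```
dst_ip=%{IPV4:fW_dst_ip}
```

Combined with a condition on the message, this only pays the matching cost on firewall messages rather than everything in the stream.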
Do you have error messages for unindexed logs?
2 Likes
Yes, there are error logs, but this is a different story:
[466]: index [graylog_157], type [_doc], id [13915622-0f56-11ee-a4ab-00155d0fc30c], message [ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=Limit of total fields [1000] has been exceeded]]]
device="SFW" date= time=12:28:47 timezone="" device_name="" device_id= log_id=log_type="Firewall" log_component="" log_subtype="" status="Allow" priority=Information duration=0 fw_rule_id= fw_rule_name="" fw_rule_section="Local rule" nat_rule_id=0 nat_rule_name="" policy_type=1 sdwan_profile_id_request=0 sdwan_profile_name_request="" sdwan_profile_id_reply=0 sdwan_profile_name_reply="" gw_id_request=0 gw_name_request="" gw_id_reply=0 gw_name_reply="" sdwan_route_id_request=0 sdwan_route_name_request="" sdwan_route_id_reply=0 sdwan_route_name_reply="" user_name="" user_gp="" iap= ips_policy_id= appfilter_policy_id= application="" application_risk= application_technology="" application_category="" vlan_id="" ether_type=Unknown (0x0000) bridge_name="" bridge_display_name="" in_interface="" in_display_interface="" out_interface="" out_display_interface="" src_mac= dst_mac= src_ip= src_country_code= dst_ip= dst_country_code=1 protocol="UDP" src_port= dst_port= sent_pkts=0 recv_pkts=0 sent_bytes=0 recv_bytes=0 tran_src_ip= tran_src_port=0 tran_dst_ip= tran_dst_port=0 srczonetype="LAN" srczone="" dstzonetype="LAN" dstzone="" dir_disp="" connevent="" connid="" vconnid="" hb_health="" message="" appresolvedby="" app_is_cloud=0 log_occurrence= flags=
Sorry, but I had to cut some information out.
thanks
ihe
June 20, 2023, 11:30am
6
How many different streams do you store on the index set of your stream? There are more than 1000 field names in use. Usually you can fix that either by splitting streams across different index sets, or by fixing some field names which are redundant/obsolete.
Regarding your example log: could you provide it with bogus data? Please keep your type of log, so an IP could be 10.0.0.0 in general, or a display_name could be "John Doe", and so on. I'll try to provide you with a Grok pattern.
1 Like
gsmith
(GSmith)
June 20, 2023, 9:29pm
7
Hey @Marvin1
Adding on to @ihe
I noticed you are using Graylog 5.1, and in your screenshot it looks like you are using Elasticsearch. What version of ES are you using?
1 Like
Hey @gsmith, I use elasticsearch-oss version 7.10.2 amd64.
Hey @ihe, I have 4 streams, but only 2 are running: the default stream and one I built for one of our hospitals. This stream has 29 rules (one per source, each must match exactly, e.g. DC00.example.intern) for all sources used for this infrastructure.
ihe
June 21, 2023, 1:07pm
10
You are using the default index set for all your logs.
My tips:
Get yourself familiar with the index model. Read this page from the docs to the end: Index model
Create separate streams for different topics, with separate index sets for your data: Windows logs into a Windows stream, firewall logs into a firewall stream, and so on.
If this still does not help, it will only affect a small part of your logs, since not all streams/index sets will be flooded with too many fields.
2 Likes
Thank you very much for your help. It's a valid solution for one of my problems. <3
frantz
June 22, 2023, 8:04am
12
The best solution is indeed what @ihe said, but you can also increase the limit of 1000 fields with an Elasticsearch template. For example, you can set it to 1500 or 2000, but do not go higher or you will have performance issues.
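For example (a sketch only; the template name, the `graylog_*` index pattern, and the limit value are assumptions to adjust for your deployment), the limit can be raised for future indices with the legacy ES 7 template API:

```shell
# Raise the total-fields limit for newly created Graylog indices (ES 7 legacy template API)
curl -X PUT "http://localhost:9200/_template/graylog-custom" \
  -H 'Content-Type: application/json' \
  -d '{
        "index_patterns": ["graylog_*"],
        "order": 10,
        "settings": {
          "index.mapping.total_fields.limit": 1500
        }
      }'
```

Note this only applies when the next index is created, e.g. after an index rotation.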
1 Like
system
(system)
Closed
July 6, 2023, 8:05am
13
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.