Pipeline for firewall logs

Hi,

I want to create a pipeline in Graylog, like the ones we create in Logstash, in order to separate fields.
Can someone help me, please?

I have logs from firewalls (Cisco ASA, FortiGate, Palo Alto).

Take a look at the documentation for the processing pipeline:
http://docs.graylog.org/en/2.4/pages/pipelines.html

There are also a few blog posts on the Graylog blog which demonstrate various use cases.

The links are not helping me, because I already have them.

I don't think I need pipelines just to parse logs. Can I give you my Logstash configuration and have you tell me how to do the same with Graylog?

Maybe using grok patterns, as you do in Logstash, will help:
https://grokdebug.herokuapp.com/
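
In Graylog itself you can also apply a grok pattern from a pipeline rule with the grok() function. A minimal sketch, assuming the pattern has already been imported under System -> Grok Patterns (CISCOFW106023 is one of the Cisco ASA patterns from the standard Logstash grok set, used here only as an example) and the rule is attached to a pipeline connected to your stream:

rule "parse Cisco ASA with grok"
when
  has_field("message")
then
  // apply the grok pattern and copy all named captures as message fields
  set_fields(
    grok(
      pattern: "%{CISCOFW106023}",
      value: to_string($message.message),
      only_named_captures: true
    )
  );
end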

I parsed the Cisco ASA logs with grok. Can you help me with the Palo Alto logs? Here is a sample:

1,2018/06/21 14:16:40,0009C101283,TRAFFIC,start,0,2018/06/21 14:16:40,193.240.221.122,69.172.216.55,0.0.0.0,0.0.0.0,KEOLIS OUT PUBLIC IP,,,ssl,vsys1,DMZ,Internet,ae2.1801,ae1.849,frghcslnetv03-04,2018/06/21 14:16:40,34302707,1,12173,443,0,0,0x0,tcp,allow,483,405,78,4,2018/06/21 14:16:41,0,any,0,97437831801,0x0,FR,CA,0,3,1,n/a,0,0,0,0,CTX_PROD,frghcfwdmz01m,from-policy

In Logstash I parse them with the CSV filter.

You have to use a grok pattern in Logstash and output it in JSON, and then you will have the fields in JSON format.

https://discuss.elastic.co/t/logstash-csv-to-json/31826

Also, you might want to post on the Elastic forum.
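
If you want to stay in Graylog, the rough equivalent of the Logstash CSV filter is a pipeline rule that splits the message on commas and assigns fields by position. A minimal sketch based on the sample line above (only a few positions shown, the field names are illustrative, and the rule still has to be connected to a pipeline on the stream that receives the Palo Alto logs):

rule "parse Palo Alto TRAFFIC log"
when
  contains(to_string($message.message), "TRAFFIC")
then
  // split the raw CSV line on commas
  let parts = split(",", to_string($message.message));
  // assign fields by zero-based position, following the sample above
  set_field("receive_time", parts[1]);
  set_field("log_type", parts[3]);
  set_field("src_ip", parts[7]);
  set_field("dst_ip", parts[8]);
  set_field("application", parts[14]);
  set_field("protocol", parts[29]);
  set_field("action", parts[30]);
end

After that, the fields should show up on the message much like they do with a grok extractor.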
