I’m new to the ELK and Graylog world, and I’m trying to understand the basics of this SIEM.
So far I’ve done a manual install of:
Elasticsearch for indexing
Kibana for direct search and visualization of the indexed data
Logstash for parsing / stashing (??)
Graylog seems to be doing the same work as Kibana, with search, visualization …
Collector-sidecar seems to contain all the listeners for clients like NXLog and Filebeat.
The NXLog install failed for now; I’ll solve this later.
However, I can’t figure out how to send extracted log files to Graylog the way Splunk does. Must I use NXLog and Filebeat? I wish to send any log files (no matter the type of logs) and sort them out myself with regex filters and standard delimiters. Isn’t there an easier method to add my own logs to Elasticsearch or Graylog?
So I have now installed the above parts, and Graylog can connect to the Elasticsearch index / cluster named graylog2 or graylog. However, Kibana can’t see the Elasticsearch index while Graylog can see it and is in a green state … very strange.
Is Kibana needed? Graylog seems to do the same kind of work.
Everything is installed on the same Debian server, since I want to run it as a single offline analysis platform for education/testing purposes.
So does that mean Kibana does the same work as Graylog? Can they both be connected to the same Elasticsearch server?
By the way, is there an easy way to send in my own log files to be parsed?
I was thinking about mounting a remote folder (the host being the Elasticsearch server) on a Windows client, and then getting logs into Elasticsearch automatically by simply copying a log file into the remotely accessible folder. There would be no syslog involved, nor any network communication needed.
But it seems there is no such local-storage option and I have to use NXLog or Filebeat, right?
Kibana is mostly used for querying and visualization, correct. So yes, it would duplicate functionality you already have in Graylog. On the other hand, some people prefer Kibana, as it supposedly offers more in the way of visualization. And yes, you can let Kibana query the Graylog indices, as long as the relevant user account is given access.
With regards to sending in your log files: that’s the whole point of Graylog and also what I’ve described above. You can use the Sidecar Collector (or another implementation of NXLog or BEATS) to track log files and event logs and forward them into Graylog.
There is no way of simply uploading an existing file, because all data lives inside the Elasticsearch data lake; there are no files accessible through the file system. Also, as Jan has pointed out in other threads: in order for Graylog to be able to work with log data, it needs to have gone through Graylog. You cannot simply chuck info into Elasticsearch in the hope that Graylog finds it. Graylog does a bit of massaging and parsing on all data that comes in, adding some required metadata along with it.
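If you do want to push individual log lines in yourself without running a full collector, one option (not discussed in this thread, so treat it as a hedged sketch) is to send them to a GELF input that you create in Graylog under System → Inputs. GELF is Graylog’s documented message format; the host, port, and the `_file_name` custom field below are assumptions for illustration:

```python
import json


def gelf_message(host_name, message, **extra):
    """Build a minimal GELF 1.1 payload; custom fields need a leading underscore."""
    payload = {"version": "1.1", "host": host_name, "short_message": message}
    payload.update({f"_{key}": value for key, value in extra.items()})
    # GELF over plain TCP frames each JSON message with a trailing null byte.
    return json.dumps(payload).encode("utf-8") + b"\x00"


msg = gelf_message("manual-upload", "a line from my extracted log file",
                   file_name="example.log")

# To actually send it (address and port are assumed example values,
# matching a "GELF TCP" input created on the Graylog server):
# import socket
# with socket.create_connection(("192.168.1.103", 12201)) as conn:
#     conn.sendall(msg)
```

This only covers getting raw lines in; any field extraction would still have to happen in Graylog itself.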
It’s OK now, I see graylog_0 as the index. Solved.
I’m now using NXLog and I see data in Graylog; however, I don’t see my log file’s data when searching * across all timestamps. I’m putting a file inside “C:\Program Files (x86)\nxlog\Files_Logs\*.log” but I’m only getting logs from my own computer, which pollute the results and aren’t relevant.
So I think I’m doing something wrong here. Below is the example from the NXLog manual:
This configuration will read from a file and forward messages via TCP.
No additional processing is done.
nxlog.conf
<Input messages>
    Module  im_file
    File    "/var/log/messages"
</Input>

<Output tcp>
    Module  om_tcp
    Host    192.168.1.1
    Port    514
</Output>

<Route messages_to_tcp>
    Path    messages => tcp
</Route>
Below is my NXLog configuration file:
<Extension gelfExt>
    Module  xm_gelf
</Extension>

<Input file>
    Module        im_file
    File          'C:\Program Files (x86)\nxlog\Files_Logs\\*.log'
    PollInterval  1
    SavePos       True
    ReadFromLast  True
    Recursive     False
    RenameCheck   False
    Exec          $FileName = file_name(); # Send file name with each message
</Input>

<Output gelf>
    Module      om_tcp
    Host        192.168.1.103
    Port        12201
    OutputType  GELF_TCP
    # These fields are needed for Graylog
    Exec        $gl2_source_collector = '${sidecar.nodeId}';
    Exec        $collector_node_id = '${sidecar.nodeName}';
</Output>

<Route file_to_gelf>
    Path  file => gelf
</Route>
Unfortunately I don’t have prior experience with NXLog, so I wouldn’t know whether it’s enough to just chuck log files into a directory that NXLog watches. I dunno ¯\(°_o)/¯
The example from the NXLog manual that you quoted does not seem relevant because it:
Is for Linux
Seemingly points towards a Syslog input on a receiver host (port 514)
Personally, I have only worked with the Sidecar (for Linux files, Windows files and Windows Eventlogs).
When working with the Sidecar, there’s a stack of configs that you need to make.
Install the Sidecar on the sending-side.
Configure the Sidecar with the right URI for your Graylog API.
Configure the Sidecar with the right tags (see below).
In Graylog, under Collector Configuration, define tags. Each tag can be used to define any number of files or eventlogs that need to be monitored. For each of these tags, define the desired files and the desired output-to-input.
In Graylog, under Inputs, define the required input.
In my case I mostly use the FileBEAT / WinLogBEAT parts of the Sidecar, thus I’ve made a BEATS input on the Graylog hosts. Each of the tags is configured to trace a bunch of files/logs and to forward them through FileBEAT / WinLogBEAT to the input on the Graylog side.
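To make the above a bit more concrete, here is a hedged sketch of roughly what a Sidecar-rendered FileBEAT snippet can look like. The paths, host and port are made-up examples, not values from this thread; Graylog’s Beats input accepts the same protocol Filebeat normally uses to talk to Logstash, which is why the output section is called `output.logstash`:

```yaml
# Hypothetical Filebeat fragment, as a tag in the Collector
# Configuration might render it; adjust paths to your own files.
filebeat.inputs:
  - type: log
    paths:
      - C:\logs\*.log
output.logstash:
  # Points at a Beats input created in Graylog under System -> Inputs
  hosts: ["192.168.1.103:5044"]
```

The tag mechanism means you edit this once in Graylog and every Sidecar carrying that tag picks it up.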
No, Kibana is not needed, nor is Logstash since Graylog can handle both the tasks. Kibana does offer a higher level of visualisation support, but Graylog isn’t bad at it either.
I’ve been trying to wrap my head around NXLog. From what I understand it’s YALM: Yet Another Log Manager, i.e. another competitor to Syslog/RSyslog/Syslog-NG/BEATS/etc. Correct? So if you’re already installing the Sidecar, you technically don’t need NXLog either, right?
I’m trying to do that right now. Any advice on configuration files for sending log files in directly?
I want to install the client on the Graylog server itself and have an empty “log folder” that I can send my extracted log files to.
Uhm… I’m having a hard time picturing that, because what you describe is utterly not how Graylog works. Could you please provide some more details on what you think you’re building?
Alright, so I’m checking the Sidecar configuration for this!
Install the Sidecar on the sending-side.
Configure the Sidecar with the right URI for your Graylog API.
Configure the Sidecar with the right tags (see below).
In Graylog, under Collector Configuration, define tags. Each tag can be used to define any number of files or eventlogs that need to be monitored. For each of these tags, define the desired files and the desired output-to-input.
In Graylog, under Inputs, define the required input.
FileBEAT:
So Graylog will recognize the type of logs I send? I’ll be throwing a lot of various log files in the folder; they can be anything from Windows logs to application or Linux logs.
There is a configuration file to edit, I guess … I’ll have to look into it this afternoon.
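On the “sort out by regex filters and standard delimiters myself” idea from earlier in the thread: in Graylog that job is usually done server-side with extractors or pipeline rules once the raw lines arrive, but to illustrate the principle, here is a hedged Python sketch. The log line format, pattern, and field names are made-up examples, not anything from this thread:

```python
import re

# Hypothetical log line format; real logs will need their own pattern.
LINE_PATTERN = re.compile(
    r"(?P<timestamp>\S+ \S+) (?P<level>[A-Z]+) (?P<message>.*)"
)


def parse_line(line):
    """Split one log line into named fields, or return None on no match."""
    match = LINE_PATTERN.match(line)
    return match.groupdict() if match else None


fields = parse_line("2018-06-01 12:00:00 ERROR disk almost full")
```

Whether you do this in a script before shipping, or as an extractor in Graylog, the idea is the same: one regex with named groups per log format, and anything that does not match gets flagged for a new pattern.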