Sourcing messages from a MySQL table

I have a few (closed source) applications that write log entries into some MySQL tables.

The tables have an id, timestamp, message, and sometimes a few more columns.

The goal is to pull in new logs live, but also to be able to catch up on existing logs.

Does anyone have experience with sourcing logs from a similar setup?

I saw the community Elastic mysqlbeat, but it's very out of date and unmaintained, and I also don't think it's meant for this type of use case.

Reading the docs, I have not come across any input that interacts directly with MySQL.
As far as I know, you need to use a script to extract the information and send it to Graylog.

Yes, I like the idea of pulling the data directly from the Graylog interface (maybe a feature request).

Hey @ramindia && welcome.

@ramindia is correct: you will need a script or something similar to get the data out of MySQL and into a log file.

Example using MongoDB:

#!/usr/bin/expect -f

set timeout 20

set date [clock format [clock seconds] -format {%d-%m-%Y,%H:%M:%S}]

spawn mongoexport -u mongo_admin -p primalFear --collection=traffic --db=graylog --out=/var/log/streams/traffic.json
expect eof

Log shipper: NXLog

<Input streams>
    Module        im_file
    File          "/var/log/streams/*.json"
    SavePos       TRUE
    ReadFromLast  TRUE
    PollInterval  1
    #Exec  $Message = $raw_event;
</Input>

Result: a list of streams attached to my input called "linux Servers Secure".

You are a STAR!

This was amazing. I was looking for this but could not get the idea, so I was writing to MySQL and raw logs to get my solution to work.

Can you please clarify a couple of things here:

  1. (I assume most Linux installs already have the `expect` utility; if not, we need to install it.)
     How frequently do we run this, and do we need a cron job or some other method to get data in near real time? (How can we get only the latest data and push it to JSON?)
     I would like to do this with MySQL (later PostgreSQL). I found the command syntax below (nice, and you planted the idea in my brain last night).
     I also found a nice script (need to test it); again, as mentioned, I need to figure out how frequently this should run and how to get only the latest data into the JSON.
     mysql_to_json_csv/ at master · devbabar/mysql_to_json_csv · GitHub

  2. NXLog needs to be installed on the device to ship the logs to Graylog (as I understand it).
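For question 1, here is the kind of incremental export I have in mind as an untested sketch: keep the highest exported `id` in a state file and only select newer rows on each run. The table name `app_logs`, the columns, the MySQL user, and the file paths are all placeholders I would need to adapt to my setup.

```shell
#!/bin/bash
# Sketch only: "app_logs", its columns (id, timestamp, message), the
# user "loguser", and the paths below are placeholders, not a tested setup.
STATE_FILE="${STATE_FILE:-/var/tmp/app_logs.lastid}"
LOG_FILE="${LOG_FILE:-/var/log/streams/app_logs.json}"

# Highest id we have already exported (0 on the very first run).
last_id() { cat "$STATE_FILE" 2>/dev/null || echo 0; }

# Convert tab-separated rows (id<TAB>timestamp<TAB>message) to JSON lines.
rows_to_json() {
  awk -F'\t' '{
    gsub(/"/, "\\\\\"", $3)   # escape double quotes inside the message
    printf "{\"id\":%s,\"timestamp\":\"%s\",\"message\":\"%s\"}\n", $1, $2, $3
  }'
}

# Only select rows newer than the last exported id.
fetch_new_rows() {
  mysql -N -B -u loguser appdb \
    -e "SELECT id, timestamp, message FROM app_logs WHERE id > $(last_id) ORDER BY id"
}

# Uncomment to actually run the export and remember the new high-water mark:
# fetch_new_rows | rows_to_json >> "$LOG_FILE"
# tail -n 1 "$LOG_FILE" | sed 's/.*"id":\([0-9]*\).*/\1/' > "$STATE_FILE"
```

The idea is that NXLog then tails the JSON file as in your example, so each cron run only appends rows it has not shipped before.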

Apologies for cross-posting my question from the other thread here.



apt-get install expect -y

Any log shipper will do; once the log file is created, you are good to go.

I only use expect for things that have > or # or something funky in the CLI. It makes it a lot easier to execute remote commands on switches or firewalls, and even to get data from SQL. For the rest I just use bash.

Thank you, one thing I was missing here: how do we run this in real time? The bash script as a cron job, or some other method to create this JSON?

I just use cron, but whatever you like would be good. Depending on how many times you execute it, be careful with databases; you don't want to query too often.

EDIT: What I mean by "often" is every second. This also depends on the other queries being made. I personally query stuff like totals, sums, etc., so you might want to keep an eye on it.
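As a sketch, a crontab entry like this would run an export script once a minute (the script path is just a placeholder), which is a much safer floor than querying every second:

```crontab
# Run the MySQL-to-JSON export once a minute; discard stdout/stderr.
* * * * * /usr/local/bin/export_app_logs.sh >/dev/null 2>&1
```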

Sure, I understand. I was thinking of DB replication, one server for reading and one for writing, for a better outcome.

So when you pull from the read-only DB server, it will not have any impact on the writer.

Yes, agreed, monitoring is the prime thing here; then tune based on the results.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.