As I was reading the docs, I did not come across any input that interacts directly with MySQL.
As far as I know, you need to use a script to extract the information and send it to Graylog.
Yes, I like the idea of pulling the data directly from the Graylog interface (maybe a feature request).
This was amazing. I was looking for exactly this but could not get the idea, so I had been writing to MySQL and a raw log to get my solution to work.
Can you please clarify a couple of points here:
I assume most Linux installs already ship the expect utility; if not, we need to install expect.
How frequently do we run this, and do we need to use a cron job or some other method to get the data in near real time? (How can we grab only the latest data and push it to JSON?)
I would like to do it with MySQL (later PostgreSQL). I found the command syntax below (nice, you planted the idea in my brain last night): https://dev.mysql.com/doc/mysql-shell/8.0/en/mysql-shell-json-output.html
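Based on that page, something like this should dump a query result as newline-delimited JSON via MySQL Shell (user, host, database and table names here are placeholders I made up):

```bash
# Sketch only: run a query in SQL mode and emit one JSON object per row.
# Assumes mysqlsh is installed and credentials are already stored (no prompt).
mysqlsh --sql --result-format=ndjson \
  --uri exporter@localhost:3306/mydb \
  -e "SELECT id, created_at, message FROM events;" \
  > /var/log/mysql_export/events.json
```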
I found a nice script (still need to test it). Again, as I mentioned before, I need to figure out how frequently this should run and how to get only the latest data into JSON by amending it: mysql_to_json_csv/mysql_to_json.py at master · devbabar/mysql_to_json_csv · GitHub
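My rough idea for the "latest data only" part is a high-water-mark state file, along these lines (untested sketch; database, table and column names are made up, and it assumes an auto-increment id column):

```bash
#!/usr/bin/env bash
# Sketch: export only rows added since the last run as newline-delimited JSON,
# tracking the last exported id in a state file for the next run.
STATE_FILE="/var/lib/mysql_export/last_id"
OUT_FILE="/var/log/mysql_export/events.json"
URI="exporter@localhost:3306/mydb"

LAST_ID=$(cat "$STATE_FILE" 2>/dev/null || echo 0)

# Pin the upper bound first so rows inserted mid-run are not skipped next time.
NEW_ID=$(mysqlsh --sql --result-format=tabbed --uri "$URI" \
  -e "SELECT COALESCE(MAX(id),0) FROM events;" | tail -n 1)

# Append only the new rows, one JSON object per line, for a log shipper to pick up.
mysqlsh --sql --result-format=ndjson --uri "$URI" \
  -e "SELECT * FROM events WHERE id > ${LAST_ID} AND id <= ${NEW_ID} ORDER BY id;" \
  >> "$OUT_FILE"

echo "$NEW_ID" > "$STATE_FILE"
```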
NXLog needs to be installed on the device to ship the logs to Graylog (as I understand it).
Apologies for cross-posting my question from the other thread here.
Any log shipper will do; once the log file is created, you're good to go.
I only use expect for devices that have >, #, or something funky in the CLI. It makes it a lot easier to execute remote commands on switches or firewalls, and even to get data from SQL. For the rest I just use bash.
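A rough sketch of the expect part, in case it helps (host, user, password and the command are placeholders; install expect if your distro does not ship it):

```bash
#!/usr/bin/env bash
# Sketch: log in to a device whose prompt ends in '#', run one command, and
# capture the session to a file that a log shipper can then pick up.
HOST="switch01.example.com"
RUSER="admin"
PASS="changeme"   # in practice use SSH keys or a secrets store, not a literal password

expect <<EOF > /var/log/switch01_session.log
set timeout 30
spawn ssh -o StrictHostKeyChecking=no $RUSER@$HOST
expect "assword:"
send "$PASS\r"
expect "#"
send "show version\r"
expect "#"
send "exit\r"
expect eof
EOF
```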
I just use cron, but whatever you like would be good. Depending on how many times you execute it, be careful with databases; you don't want to hit them too often.
EDIT: What I mean by too often is every second. This also depends on the other queries being made. I personally query stuff like totals, sums, etc. You might want to keep an eye on it.
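Something like this in crontab would do it; every 5 to 15 minutes is usually plenty for this kind of export (the script path and log path are just placeholders):

```bash
# Hypothetical crontab entry: run the export every 5 minutes and log its output.
*/5 * * * * /usr/local/bin/mysql_to_json.sh >> /var/log/mysql_export/cron.log 2>&1
```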