Once the data is ingested, you can run an extractor, such as a grok pattern, on it to split the message into fields.
One tip: ingest just a single line first, build the extractor from that line, and only then ingest the rest of the messages. Extractors are applied at ingest time, so they will not work retroactively on messages that were already ingested.
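To illustrate, assuming a hypothetical CSV status line like `2024-05-01T12:00:00,web01,OK` (the line format and field names here are made up for the example), a grok pattern for the extractor could look like this:

```
%{TIMESTAMP_ISO8601:timestamp},%{HOSTNAME:source_host},%{WORD:status}
```

Each `%{PATTERN:field}` pair matches one CSV column and stores it under the given field name in the message.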
I know people who use a script like the following, run from cron, to ingest data from systems that only provide a CSV status page.
#!/bin/bash
DOWNURL=http://This.Is.the.url/status_csv.ph
GRAYLOGSERVER=192.168.X.XX
GRAYLOGINPUTPORT=5555
# This can be nc or netcat depending on the system
NETCATCOMMAND=nc
wget -O- -q "${DOWNURL}" | while read -r LINE
do
    # Skip empty lines; everything else goes to the Graylog raw/plaintext input
    if [[ ${#LINE} -gt 1 ]]; then
        echo "${LINE}" | ${NETCATCOMMAND} "${GRAYLOGSERVER}" "${GRAYLOGINPUTPORT}"
    fi
done
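The empty-line filter in that loop can be tried out on its own without any network involved. This is just a sketch: the `nc` call is replaced by a plain `echo`, and the sample lines are invented for the example:

```shell
#!/bin/bash
# Feed a few sample lines, including a blank one, through the same filter
# the ingest script uses. Blank lines are skipped; the rest would be sent.
printf 'host1,OK\n\nhost2,FAIL\n' | while read -r LINE
do
    # ${#LINE} is the length of the line; blank lines fail this check
    if [[ ${#LINE} -gt 1 ]]; then
        echo "would send: ${LINE}"
    fi
done
```

Running this prints only the two non-empty lines, which is exactly why the blank line between records never reaches Graylog.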