Escape json-related chars in a GELF TCP Input


(Altin Karaulli) #1

Dear All,

I am facing the following issue when trying to bring data into Graylog via a GELF TCP Input.
If the data contains JSON-related characters like { " } : , (call these X),
the record fails to load.

For example, a field named SQL_TEXT containing text like the one below fails:
AAA { "bbb":"cccc cccc" dd } ee , ff AAA

How can I escape a given X character before sending it to the input?

I tried with:
a slash, a double slash, and a slash plus "n", before and after X, but the record still doesn't load.

For example, for the field above, the JSON X characters were escaped with a preceding double backslash, as below:

"_SQL_TEXT":"AAA \\{ \\"bbb\\"\\:\\"cccc cccc\\" dd \\} ee \\, ff AAA"

Can you please advise?

Best regards,
Altin

PS. The escaping was done inside the JSON field _MYFIELD, where JSON-related characters are expected.
The JSON syntax then brings all the fields together to build the record string to be sent.
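(For reference: GELF messages are plain JSON, and a JSON string value only requires escaping of the double quote, as \", and the backslash, as \\; braces, colons, and commas inside a string value need no escaping at all. Building the record with a JSON library instead of by hand produces exactly that. A minimal Python sketch, with hypothetical host and field values:)

```python
import json

# Hypothetical field value containing the problematic characters.
sql_text = 'AAA { "bbb":"cccc cccc" dd } ee , ff AAA'

# json.dumps escapes only what JSON requires: " becomes \" and \ becomes \\;
# the braces, colon, and comma inside the string are left untouched because
# they are legal inside a JSON string value.
record = json.dumps({
    "version": "1.1",
    "host": "db-server-01",        # hypothetical source host
    "short_message": "SQL audit",  # hypothetical message
    "_SQL_TEXT": sql_text,
})
print(record)
```

With this approach the field arrives intact and Graylog displays the original text, double quotes included.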


(Altin Karaulli) #2

I tested with a field containing a single double quote, and it fails.
I also tried replacing the double quote with

 &quot:

before building the field's string for JSON.

This is the way I escape special characters for XML when I send data to Splunk the same way
(via a TCP input). With the double quote replaced, the record is inserted into Graylog successfully.
The problem is that Graylog then displays the replacement characters rather than the double quote itself. Looks like I have the wrong escape again.
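(Editor's note: an HTML/XML entity such as &amp;quot; is stored and displayed literally by Graylog, because a GELF payload is JSON, which uses backslash escaping instead. A hedged sketch of framing and sending a GELF TCP message, assuming a hypothetical Graylog input at graylog.example.org:12201; note the null-byte delimiter that GELF TCP requires between messages:)

```python
import json
import socket

def frame_gelf(message: dict) -> bytes:
    # GELF TCP frames are delimited by a null byte, so the JSON payload
    # itself must contain no raw \0 character.
    return json.dumps(message).encode("utf-8") + b"\x00"

def send_gelf_tcp(host: str, port: int, message: dict) -> None:
    # Send one framed GELF message over a plain TCP connection.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(frame_gelf(message))

# Hypothetical usage: the double quote in the value is escaped by
# json.dumps as \" rather than replaced by an HTML entity.
msg = {
    "version": "1.1",
    "host": "db-server-01",  # hypothetical source host
    "short_message": 'value with a " inside',
}
# send_gelf_tcp("graylog.example.org", 12201, msg)  # hypothetical endpoint
```

Because the escape is undone by the JSON parser on Graylog's side, the stored field contains the real double quote, not a replacement sequence.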

Can someone help?

Best regards,
Altin


(system) closed #3

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.