Graylog pipeline

I am building a pipeline for the following log messages. I ship the logs to Graylog using Filebeat.

I created a pipeline rule, but when I check it in the Rule Simulator, nothing shows up. I would appreciate your help in pointing out the issue.

The log messages:

**2024-07-02 13:10:49.381**
Jul 2 13:10:41 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443982 internal-process-manager: (created)
**2024-07-02 13:10:49.381**
Jul 2 13:10:41 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443981 internal-scheduler: (created)
**2024-07-02 13:10:49.381**
Jul 2 13:10:41 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443980 internal-purge-reporting-data: (created)
**2024-07-02 13:10:49.380**
Jul 2 13:10:41 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443978 internal-purge-jobs: (created)
**2024-07-02 13:09:47.366**
Jul 2 13:09:46 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443977 internal-process-manager: (created)
**2024-07-02 13:09:47.366**
Jul 2 13:09:46 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443976 internal-scheduler: (created)
**2024-07-02 13:08:57.353**
Jul 2 13:08:50 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443975 internal-process-manager: (created)
**2024-07-02 13:08:57.353**
Jul 2 13:08:50 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443974 internal-scheduler: (created)
**2024-07-02 13:07:57.329**
Jul 2 13:07:55 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443973 internal-process-manager: (created)
**2024-07-02 13:07:57.329**
Jul 2 13:07:54 ukm journal[1803]: sshmgr-backend[1803] INFO: [system] Job 443972 internal-scheduler: (created)

My pipeline:

rule "UKM"
when
    has_field("message")
then
    let message_field = to_string($message.message);
    
    // Extract syslog timestamp. Note: regex() capture-group indices start
    // at "0". Also, Graylog's built-in "timestamp" field must be a date,
    // so store the raw string under a different name to avoid indexing errors.
    let ts = regex("^(\\w+ \\d+ \\d+:\\d+:\\d+)", message_field);
    set_field("log_timestamp", ts["0"]);
    
    // Extract Hostname ("ukm")
    let hostname = regex("^\\w+ \\d+ \\d+:\\d+:\\d+ (\\w+)", message_field);
    set_field("hostname", hostname["0"]);
    
    // Extract Process Info ("journal[1803]")
    let process_info = regex("^\\w+ \\d+ \\d+:\\d+:\\d+ \\w+ ([^:]+):", message_field);
    set_field("process_info", process_info["0"]);
    
    // Extract Log Level ("INFO" in "[1803] INFO: [system]"). The original
    // pattern required a space before "[", which never occurs in these
    // messages, so it could never match.
    let log_level = regex("\\[\\d+\\] (\\w+): \\[\\w+\\]", message_field);
    set_field("log_level", log_level["0"]);
    
    // Extract Job ID, Job Type and Job Status (groups "0", "1", "2";
    // the original indices "1"-"3" were off by one)
    let job_info = regex("Job (\\d+) ([^:]+): \\((\\w+)\\)", message_field);
    set_field("job_id", job_info["0"]);
    set_field("job_type", job_info["1"]);
    set_field("job_status", job_info["2"]);
end
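
As a side note, the capture patterns themselves can be sanity-checked outside Graylog. This is a small Python sketch (Java and Python regex syntax agree for these patterns) run against one of the sample lines above:

```python
import re

line = ("Jul 2 13:10:41 ukm journal[1803]: sshmgr-backend[1803] "
        "INFO: [system] Job 443982 internal-process-manager: (created)")

# Same single-group patterns as in the pipeline rule
patterns = {
    "timestamp":    r"^(\w+ \d+ \d+:\d+:\d+)",
    "hostname":     r"^\w+ \d+ \d+:\d+:\d+ (\w+)",
    "process_info": r"^\w+ \d+ \d+:\d+:\d+ \w+ ([^:]+):",
    "log_level":    r"\[\d+\] (\w+): \[\w+\]",
}

# Keep only the patterns that actually match the line
fields = {name: m.group(1) for name, pat in patterns.items()
          if (m := re.search(pat, line))}

# The job pattern has three groups: id, type, status
m = re.search(r"Job (\d+) ([^:]+): \((\w+)\)", line)
if m:
    fields["job_id"], fields["job_type"], fields["job_status"] = m.groups()

print(fields)
```

If one of the keys is missing from the printed dict, the corresponding regex never matched and `set_field` in the rule would be setting a null value.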

We’ll need some more information.
How are you ingesting those logs? Do you see any messages if you disable that pipeline? Please share a sanitized version of one of those processed messages.
What are you trying to accomplish? Looks like you are trying to build a GELF message? That’s backwards … it has to be GELF already to be ingested by a GELF input. Otherwise use a different input that matches your log format.
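
For reference: if you stay with Filebeat, the usual pairing is a Graylog **Beats** input, since Filebeat speaks the Beats/Logstash protocol rather than GELF. A minimal `filebeat.yml` sketch might look like this; the host, port, and log path are assumptions you would adapt:

```yaml
filebeat.inputs:
  - type: filestream              # "log" on older Filebeat versions
    paths:
      - /var/log/sshmgr/*.log     # hypothetical path to the sshmgr logs

output.logstash:                  # Graylog Beats inputs use the Beats/Logstash protocol
  hosts: ["graylog.example.org:5044"]   # host/port of your Beats input (assumed)
```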

Hey @patrickmann,

I am using Filebeat to send the logs over to Graylog.
After creating the pipeline, nothing is processed.

I will try sending it over in GELF format and see if my pipeline works.

Thanks

I was able to create the pipeline and it is working perfectly.

Thanks for the help!
