Send logs to ES with Bunyan as logging framework - node.js

In my Node.js app I am using Bunyan as the logging library, together with the bunyan-elasticsearch library to send the Bunyan-generated logs to ES.
Here’s my implementation:

const bunyan = require('bunyan');
const ElasticSearch = require('bunyan-elasticsearch');

const esStream = new ElasticSearch({
    indexPattern: 'YYYY.MM.DD',
    type: 'logs',
    host: 'http://graylog-server:9000'
  });

const logger = bunyan.createLogger({
    name: 'my-service',
    streams: [
      {
        level: 'info',
        stream: esStream,
        formatter: 'pretty'
      }
    ]
});

The problem is that upon initialization I get an error from the elasticsearch library:

{ Error: Not Found
   status: 404,
   displayName: 'NotFound',
   message: 'Not Found',
   path: '/2017.08.10/logs'
}

which I assume means that it does connect to the Graylog ES server successfully, but cannot find an index at the path /YYYY.MM.DD/logs.
So I am wondering: has anyone worked with Bunyan, Elasticsearch, and Graylog, and knows how to set this up?
Where and how are the logs stored in Graylog’s ES?

What types of inputs did you create in Graylog and what’s their configuration?

There’s only one input which can receive logs via HTTP out of the box, and that’s the GELF HTTP input which obviously requires the requests to use the GELF format.

See http://docs.graylog.org/en/2.3/pages/gelf.html#gelf-payload-specification and http://docs.graylog.org/en/2.3/pages/gelf.html#sending-gelf-messages-via-tcp-using-curl for details.

The GELF HTTP input does not process arbitrary JSON payloads.

I have updated my OP with the actual problem. After digging into it a bit, I found that the problem is in the source. Can you re-read and look into that?

The messages are stored in the Elasticsearch indices managed by Graylog.

If you want to use Graylog for querying your messages, you have to ingest them with Graylog. If you write them directly into Elasticsearch, funny things™ will happen.
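In practice that means replacing the Elasticsearch stream with one that sends records to a Graylog input. A sketch of the Bunyan side, assuming a forwarder object like the hypothetical `gelfForwarder` below (here it only buffers records; a real one would POST them to the GELF HTTP input):

```javascript
// With type: 'raw', Bunyan passes the record to write() as an object
// rather than a serialized JSON string, which is what a GELF
// converter needs.
const records = [];

const gelfForwarder = {
  write(record) {
    records.push(record); // stand-in for an HTTP POST to /gelf
  }
};

const loggerStreams = [
  { level: 'info', type: 'raw', stream: gelfForwarder }
];

// With bunyan installed, the logger would then be created as:
// const logger = bunyan.createLogger({ name: 'my-service', streams: loggerStreams });
```

The point is that nothing in the app talks to Elasticsearch directly; Graylog owns its indices and handles the writes itself.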
