
winston-elasticsearch's Issues

TypeError: callback is not a function at bulk.forEach (.../node_modules/winston-elasticsearch/bulk_writer.js:85:10) at process._tickCallback (internal/process/next_tick.js:68:7)

Hello,

Trying to use winston-elasticsearch against Elasticsearch instances 6.1.4 or 6.2.4 results in huge numbers of these error messages:

TypeError: callback is not a function
    at bulk.forEach (.../node_modules/winston-elasticsearch/bulk_writer.js:85:36)
    at Array.forEach (<anonymous>)
    at client.bulk.then (.../node_modules/winston-elasticsearch/bulk_writer.js:85:10)
    at process._tickCallback (internal/process/next_tick.js:68:7)

Packaging basics

There seem to be some basic packaging issues with the library, mainly:

  • Lack of tags and proper versioning
  • Extraneous files like .eslintrc are included in the npm bundle
  • The Git repository contains your personal .vscode IDE settings, which is typically frowned upon

I can make a pull request to fix these things once I get it to work; right now I'm bumping into the ever-popular rejected-mapping bug, as in #49.
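For the npm bundling point, one conventional fix is a "files" whitelist in package.json so development artifacts like .eslintrc and .vscode never reach the published tarball; a minimal sketch (the exact file list is an assumption about this package's layout):

    {
      "name": "winston-elasticsearch",
      "version": "0.7.0",
      "files": [
        "index.js",
        "bulk_writer.js",
        "transformer.js",
        "index-template-mapping.json"
      ]
    }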

Elasticsearch 6.1.4: Document mapping type name can't start with '_', found: [_doc]

Hello,

I am using the following Elasticsearch instance:

{
    "name": "DxOVdqU",
    "cluster_name": "docker-cluster",
    "cluster_uuid": "IfOBOgZNQxaAz9U7ZajDEA",
    "version": {
        "number": "6.1.4",
        "build_hash": "d838f2d",
        "build_date": "2018-03-14T08:28:22.470Z",
        "build_snapshot": false,
        "lucene_version": "7.1.0",
        "minimum_wire_compatibility_version": "5.6.0",
        "minimum_index_compatibility_version": "5.0.0"
    },
    "tagline": "You Know, for Search"
}

This is running on localhost; port 9200 is mapped and works great via Postman etc. But when I attach winston to this instance and log something, I get no documents in my index and the following error message:

Elasticsearch index error { _index: '3jvsbfjd7isxyurccnze',
  _type: '_doc',
  _id: 'ZxBwGmUBCZ46FrMrGSQX',
  status: 400,
  error:
   { type: 'invalid_type_name_exception',
     reason:
      'Document mapping type name can\'t start with \'_\', found: [_doc]' } }
TypeError: callback is not a function
    at bulk.forEach (/Users/simon/Projects/APPICS/src/appics-backend-common/node_modules/winston-elasticsearch/bulk_writer.js:85:36)
    at Array.forEach (<anonymous>)
    at client.bulk.then (/Users/simon/Projects/APPICS/src/appics-backend-common/node_modules/winston-elasticsearch/bulk_writer.js:85:10)
    at process._tickCallback (internal/process/next_tick.js:68:7)

This is completely blocking my efforts to use winston for logging in our app. Any idea how I can fix this?
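A possible workaround, assuming the transport's documented messageType option: ES 6.x before 6.2 rejects mapping type names that start with an underscore, so pinning a type that doesn't start with '_' may avoid the invalid_type_name_exception. A sketch:

    const winston = require('winston');
    const Elasticsearch = require('winston-elasticsearch');

    const logger = winston.createLogger({
      transports: [
        new Elasticsearch({
          messageType: 'log', // any type name not starting with '_' for ES 6.1.x
          clientOpts: { host: 'http://localhost:9200' }
        })
      ]
    });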

this.client.bulk is not a function: client issue with 0.7.0 for Elastic 6.3 and Winston 3.1.0

I am unable to use this transport; it fails with the following error stack:

"message":"uncaughtException: this.client.bulk is not a function\nTypeError: this.client.bulk is not a function\n at BulkWriter.flush (/Users/arpaul/Work/ALLISWELL/VAG/alliswell_nodeapp/node_modules/winston-elasticsearch/bulk_writer.js:67:22)\n

...
Elasticsearch 6.3 (with AWS using aws-elasticsearch-client)
Winston 3.1.0
Winston-elasticsearch 0.7.0
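Since the client here comes from aws-elasticsearch-client rather than the official elasticsearch package, one hedged thing to check is whether the injected client actually exposes the bulk() method that BulkWriter.flush calls:

    // `client` is whatever object you hand to the transport options
    if (typeof client.bulk !== 'function') {
      console.warn('this client has no bulk() method; BulkWriter.flush will throw');
    }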

Can't start with log error

node_modules/winston-elasticsearch/index.js:94
message,
^
SyntaxError: Unexpected token ,
at exports.runInThisContext (vm.js:73:16)

(Presumably the Node.js runtime here predates ES2015 shorthand object properties, which the `message,` on index.js line 94 relies on.)

Cannot write to Elasticsearch 6.2.4 except using messageType: '_doc'

When I leave messageType at its default, I get this error:

Elasticsearch index error { type: 'illegal_argument_exception',
reason: 'Rejecting mapping update to [logs-2018.05.21] as the final mapping would have more than 1 type: [_doc, log]' }

So I suspect this change (https://www.elastic.co/guide/en/elasticsearch/reference/6.x/removal-of-types.html) is the cause: you can't index to logs-*/log any more and have to index to logs-*/_doc instead.
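Assuming the transport's documented messageType option, explicitly setting it to '_doc' (as this issue's workaround suggests) satisfies the single-type restriction on indices created by ES 6.2+. A sketch:

    new Elasticsearch({
      messageType: '_doc', // indices created by ES 6.2+ accept only the _doc type
      clientOpts: { host: 'http://localhost:9200' }
    });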

Allocation failed - JavaScript heap out of memory

Hi,
I received the error below when using winston and winston-elasticsearch.
Any idea?
Node 6.10
winston 2.31
winston-elasticsearch 0.5

==== JS stack trace =========================================

Security context: 000002A973FCFB61
1: log [C:\project\dist\server\node_modules\winston\lib\winston\logger.js:~109] [pc=000000A6B0078D1A] (this=0000037A4E023269, level=000001472E47B4B9 <String[4]: info>)
2: arguments adaptor frame: 2->1
3: info [C:\project\dist\server\node_modules\winston\lib\winston\common.js:~51] [pc=000000A6B006D32B] (this=0000037A4E023269, ms...)

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory

winston 3

Consider upgrading/adding support for winston 3

TypeError: Cannot read property 'timestamp' of undefined

I have

TypeError: Cannot read property 'timestamp' of undefined
at new Elasticsearch (.../node_modules/winston-elasticsearch/index.js:18:14)

when I use my custom logger.
logger.js (copied from the README usage section):

var winston = require("winston");
var Elasticsearch = require("winston-elasticsearch");
 
var esTransportOpts = {
  level: "info"
};
winston.add(new winston.transports.Elasticsearch(), esTransportOpts);
module.exports = winston;

logger usage example:

var logger = require("./logger");
...
logger.log("info", "some message");

The problem is with the opts variable: it is used without checking whether it is undefined.

winston-elasticsearch version: "0.7.0"
node version: v8.6.0

Try replacing index.js line 17:

this.opts = opts || {};

with

if(!opts){
    opts = {};
}
this.opts = opts;
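Separately, the usage snippet above hands esTransportOpts to winston.add as a second argument, so the transport constructor is invoked with no options at all; passing them into the constructor sidesteps the undefined opts path. A sketch, assuming winston 3.x:

    var winston = require("winston");
    var Elasticsearch = require("winston-elasticsearch");

    var esTransportOpts = {
      level: "info"
    };
    // pass the options to the transport constructor, not to winston.add
    winston.add(new Elasticsearch(esTransportOpts));
    module.exports = winston;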

Inclusion of winston-elasticsearch breaks custom http_parser

I have a project with a custom http_parser binding, based on http-parser-js. It's set up like this:

process.binding('http_parser').HTTPParser = require('http-parser-js').HTTPParser;

All is fine until I require('winston-elasticsearch'). As soon as I do that, any custom methods I'm handling with the new parser are no longer valid, and the HTTP server starts emitting clientError.

I assume there's some dependency, perhaps a dependency of a dependency, that is also messing with the process bindings, but I have been unable to locate it. I'm still digging for the specific cause, but any pointers you might have as to where to look would be helpful. Thank you!

Memory limit for buffering of messages?

From the documentation:

Buffering of messages in case of unavailability of ES. The limit is the memory, as all unwritten messages are kept in memory.

Can this be disabled? Or, can a reasonable limit be configurable?

I'd hate to have my process crashing due to an Elasticsearch failure or connectivity issue.

CPU usage when Elasticsearch is down

We have serious issues when our Elasticsearch server goes down: CPU usage starts growing rapidly.

It would be really nice if this module first tried to reconnect when a request fails with a timeout, and only once the connection looks OK again started sending all the logs it couldn't send before.

Growing memory usage can also kill the whole app if it grows too much. An option for how many logs to keep while Elasticsearch is unreachable would be nice: once the limit is reached, start discarding. Our usage is sometimes non-critical (like monitoring), yet that monitoring is able to crash the whole app; better to lose some logs in this case. A sketch of such a bounded buffer follows.
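A minimal sketch of the requested bounded buffer (illustrative only, not the module's actual internal queue): a capped array that discards the oldest entries once the limit is reached:

    // Hypothetical drop-oldest buffer for undeliverable log entries.
    class BoundedLogBuffer {
      constructor(maxEntries) {
        this.maxEntries = maxEntries;
        this.entries = [];
        this.dropped = 0;
      }
      push(entry) {
        if (this.entries.length >= this.maxEntries) {
          this.entries.shift(); // discard the oldest entry
          this.dropped += 1;
        }
        this.entries.push(entry);
      }
      drain() {
        const batch = this.entries;
        this.entries = [];
        return batch; // hand everything back once ES is reachable again
      }
    }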

support for elasticsearch 6.X

Hello,

I'm trying to use this lib with Elasticsearch 6.0 and I get problems with the template. When I changed to a custom template I got no error, but I don't see anything being indexed into Elasticsearch.

Does this lib support ES 6.x? If not, is there any plan to support it soon?

Request Timeout after 30000ms

I am using winston and winston-elasticsearch, logging into Kibana, and I am getting the error below.

{ Error: Request Timeout after 30000ms
    at /var/app/current/node_modules/elasticsearch/src/lib/transport.js:354:15
    at Timeout.<anonymous> (/var/app/current/node_modules/elasticsearch/src/lib/transport.js:383:7)
    at Timeout.<anonymous> (/var/app/current/node_modules/async-listener/glue.js:188:31)
    at ontimeout (timers.js:498:11)
    at tryOnTimeout (timers.js:323:5)
    at Timer.listOnTimeout (timers.js:290:5)
  status: undefined,
  displayName: 'RequestTimeout',
  message: 'Request Timeout after 30000ms',
  body: undefined }

I have set winston.exitOnError = false so the process does not exit, and added the handleExceptions: true option on the transport, but I am still getting this error.

Please release 0.5.0 in npm

The current version on npm is 0.4.2, which per your docs is not fully compatible with ES 5.

I would love to just npm install winston-elasticsearch instead of using your master commit from here :)

Thanks @vanthome

doesn't work with elasticsearch 5.0

It worked with ES 2.3.5, but after upgrading to ES 5.0 it doesn't work any more. Please check! (Using the latest package versions, winston 2.3 and winston-elasticsearch.)

How to add some extra fields to the mapping.

Hi Everybody,

I'm looking for a way to include some additional fields in the basic mapping template so the information makes more sense for us, but I've had no luck finding a good example. Can you please give me a hand with that?

I tried modifying the transformer.js and index-template-mapping.json files, but it doesn't work.

We want to add information like a module name and device ID to the winston-elasticsearch data structure and be able to index it in Elasticsearch. A sketch is below.
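A sketch of one way to do this via the documented transformer option, wrapping the module's default transformer (requiring it from 'winston-elasticsearch/transformer' and the extra field names are assumptions):

    const Elasticsearch = require('winston-elasticsearch');
    const defaultTransformer = require('winston-elasticsearch/transformer');

    const esTransport = new Elasticsearch({
      transformer: (logData) => {
        const doc = defaultTransformer(logData);
        doc.fields = doc.fields || {};
        doc.fields.moduleName = 'billing';  // hypothetical extra fields
        doc.fields.deviceId = 'device-42';
        return doc;
      }
    });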

Thanks in advance

Application doesn't exit.

In a basic index.js, I added winston-elasticsearch pointing to a server. The winston-elasticsearch trace shows a successful connection, but after logging, the application doesn't exit.

Any suggestions?

Space in Debug log name

This plugin uses the string 'bulk writer' as its debug name. Unfortunately, because of the space in the name, it is not possible to silence this output specifically.

This does not work

# note the "-bulk writer" below
$ DEBUG='*,-express:router*,-nodemon:watch,-bulk writer' nodemon app/server.js 

This does work but is not ideal

# note the "-bulk*" below
$ DEBUG='*,-express:router*,-nodemon:watch,-bulk*' nodemon app/server.js 

Feature Request
Give the debug log a name without the space character, and consider a more specific, identifiable name for it; it's standard to use a colon (:) instead of spaces in debug names, e.g. winston-elasticsearch:bulk-writer.

Elasticsearch index error in winston-elasticsearch v0.7.0

CREATE TEMPLATE------------------
Elasticsearch index error { type: 'mapper_parsing_exception',
  reason: 'object mapping for [fields] tried to parse field [fields] as object, but found a concrete value' }
(the same error is repeated for every log entry)

"status" field name not written in elasticsearch

This is not working: no errors are thrown, and the log is displayed in the console but not in Elasticsearch:

    logger.info("something here", {
        status: "something",
    });

This is working (note that the status field has been renamed to sstatus, with two s's):

    logger.info("something here", {
        sstatus: "something",
    });

It was working until last night....

Any ideas?

Problem with typing

Hi. I have a problem with the TypeScript typings. On GitHub this bug was fixed, but when will it land in the npm version?

unable to add transport

Winston version 2.2.0

winston.add(winston.transports.Elasticsearch, esTransportOpts);
TypeError: transport is not a constructor
    at Logger.add (/elasticsearch/apps/synchrony/node_modules/winston/lib/winston/logger.js:475:41)
    at Object.winston.(anonymous function) [as add] (/elasticsearch/apps/synchrony/node_modules/winston/lib/winston.
js:87:34)
    at repl:1:9
    at defaultEval (repl.js:272:27)
    at bound (domain.js:280:14)
    at runBound (domain.js:293:12)
    at REPLServer.repl.eval (/usr/lib/node_modules/nesh/lib/plugins/doc.js:147:16)
    at REPLServer.<anonymous> (repl.js:441:10)
    at emitOne (events.js:101:20)
    at REPLServer.emit (events.js:188:7)
    at REPLServer.Interface._onLine (readline.js:224:10)
    at REPLServer.Interface._line (readline.js:566:8)
    at REPLServer.Interface._ttyWrite (readline.js:843:14)
    at ReadStream.onkeypress (readline.js:115:10)
    at emitTwo (events.js:111:20)
    at ReadStream.emit (events.js:191:7)
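For what it's worth, with winston 2.x the usual pattern is to hand winston.add the constructor exported by the module itself rather than looking it up under winston.transports; a sketch:

    var winston = require('winston');
    var Elasticsearch = require('winston-elasticsearch');

    // pass the transport constructor exported by the module
    winston.add(Elasticsearch, esTransportOpts);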

how to terminate winston-elasticsearch

I'm trying to let a process that uses winston-elasticsearch finish, but it never ends; something always holds execution open.

const winston = require('winston');
const Elasticsearch = require('winston-elasticsearch');

let logger = winston.createLogger({
    transports: [
        new Elasticsearch({
            level: 'info',
            flushInterval: 1
        })
    ]
});

function ass() {
    logger.info('foo');
    return null;
}

ass();
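A sketch of one way to let the process exit, assuming winston 3.x (where the logger is a stream, end() flushes it, and close() asks transports to clean up); whether the transport's flush timer is actually cleared is exactly this issue's open question:

    ass();

    logger.on('finish', () => {
      logger.close();
      // if the transport's interval timer still keeps the event loop alive,
      // an explicit process.exit(0) here is a blunt but effective fallback
    });
    logger.end();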

Question about buffer size & time

Hey there,

Thank you for all you've been doing here.

Looking at the code, I couldn't tell how long messages are buffered, or what the size limit of the buffer is, when the connection to Elasticsearch is lost.

I tested this with a trivial setup, and it seems that for at least a couple of minutes logs will not be lost even if ES is down/stopped; when it comes back up, all the logs are available as if they had been successfully sent at the right timestamps.

  • Is this handled in winston-elasticsearch, or perhaps in winston itself?
  • For how long will logs be buffered before they're lost forever, in case Elasticsearch is down for a considerable time?
  • Is there a size limit (either in number of logs or in bytes) on how much data will be buffered before entries are lost?

Could you please give some feedback?

Invalid transport, must be an object with a log method.

winston v3.1.0
winston-elasticsearch v0.7.5

Hello, I hope someone can help me with this issue. Is this a valid way to add the Elasticsearch transport?

winston.add(winston.transports.Elasticsearch, {
    clientOpts: {
        host: 'http://uks-elk-01-p:9200'
    },
    indexPrefix: 'rhino-node-server'
});

The effect I am trying to achieve is for all winston logs to implicitly log to Elasticsearch. I don't want to add Elasticsearch transport to every new logger I create, so instead I have a WinstonConfigurator.js where I specify all winston transports and then I use that configuration for every new logger created.

Unfortunately I get:

winston-transport\legacy.js:18
    throw new Error('Invalid transport, must be an object with a log method.');

I think there was something remotely related on winstonjs/winston-loggly#53
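With winston 3.x, winston.add expects an instantiated transport (an object with a log method) rather than a constructor and options pair, which appears to be exactly what this error is complaining about. A sketch:

    const Elasticsearch = require('winston-elasticsearch');

    winston.add(new Elasticsearch({
      clientOpts: {
        host: 'http://uks-elk-01-p:9200'
      },
      indexPrefix: 'rhino-node-server'
    }));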

Specify document Id

@vanthome Is there a way to specify the document ID sent to Elasticsearch for each log we send through winston-elasticsearch? This has become really important in my scenario, since I want to overwrite/update a document in Elasticsearch but can't find a way to do it without this feature. Something like the document_id we have in Logstash.

Cannot find module 'index-template-mapping.json'

If I don't set ensureMappingTemplate: false in the transport options object, I get the following error:
Cannot find module 'index-template-mapping.json'
If I "force it" to find the file by adding __dirname to the path (const rawdata = fs.readFileSync(__dirname + '/index-template-mapping.json');), then I get an error at the end of my ES trace:
........
} } <- 400
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "unknown setting [index.cache.field.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "unknown setting [index.cache.field.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
  },
  "status": 400
}

Arbitrary winston3 fields?

With Winston3, if I run:

logger.info({message: "Test", foo: "bar"});

There doesn't seem to be a way to get foo: "bar" into Elasticsearch. The transformer isn't passed the entire info object from Winston; it only gets a {level, message, meta} object?
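A hedged workaround: extra data passed as winston's meta argument does reach the transformer (as logData.meta), and the default transformer copies meta into the document's fields object, so the first line below should land foo in Elasticsearch; a custom transformer can also lift arbitrary keys out of meta:

    // meta survives into the transformer as logData.meta
    logger.info('Test', { foo: 'bar' });

    // a custom transformer that maps meta into the indexed document
    const transformer = (logData) => ({
      '@timestamp': new Date().toISOString(),
      message: logData.message,
      severity: logData.level,
      fields: logData.meta || {}
    });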

Invalid import statements in index.d.ts

In the index.d.ts file, the import statements are invalid when used in my TypeScript project. I get an error message saying that the module "has no default export".

It seems to work if it's instead changed to this style of importing:

  import * as TransportStream from 'winston-transport';
  import * as elasticsearch from 'elasticsearch';

The same approach is used in many other places, e.g. winston: https://github.com/winstonjs/winston/blob/master/index.d.ts

indexPrefix does not appear to work

Following the examples, the below works:

var esTransportOpts = {
  level: 'info',
  clientOpts: {
      host: constants.LOGGING_SERVER
  }
};

I can see logs in Elasticsearch under logs-<date>.

However, I would like to change the index to MyAppName-<date>. I tried doing so with the following:

var esTransportOpts = {
  level: 'info',
  indexPrefix: constants.LOGGING_PREFIX,
  indexSuffixPattern: 'YYYY.MM.DD',
  clientOpts: {
      host: constants.LOGGING_SERVER
  }
};

However, these logs do not appear in Elasticsearch and I cannot see any error messages explaining why. Have the esTransportOpts changed?
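One hedged debugging step: keep a reference to the transport and listen for its error events, since index-level failures (for example, a template not matching the new index name) can otherwise be swallowed. A sketch, assuming winston 3.x:

    const esTransport = new Elasticsearch(esTransportOpts);
    // transports are EventEmitters; surface errors winston may not re-throw
    esTransport.on('error', (err) => {
      console.error('Elasticsearch transport error:', err);
    });

    const logger = winston.createLogger({ transports: [esTransport] });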

Error during bulk writer flush is not recoverable

If there is an error while writing to Elasticsearch, the next bulk write is never scheduled. This can easily be reproduced by running a local Elasticsearch node and cycling it off and on during logging. After coming back online the code never resumes sending to ES. Your promise chain does not call schedule() when there is an error, so how is it supposed to resume?

BulkWriter.prototype.tick = function tick() {
  debug('tick');
  const thiz = this;
  if (!this.running) { return; }
  this.flush()
    .catch((e) => {
      throw e;
    })
    .then(() => {
      thiz.schedule();
    });
};
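A sketch of one possible fix (not a confirmed patch): log the flush error instead of re-throwing it, so the chain always reaches schedule() and the writer keeps ticking:

    BulkWriter.prototype.tick = function tick() {
      debug('tick');
      const thiz = this;
      if (!this.running) { return; }
      this.flush()
        .catch((e) => {
          debug('flush failed, will retry on next tick', e); // don't break the chain
        })
        .then(() => {
          thiz.schedule(); // always reschedule, even after an error
        });
    };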

HTTP response status 200 but body declares error status 400

I was logging some data to Elasticsearch that looked like the following.

A simple message, an evt name, an array of objects, and some rewrites for general app information:

logger.info(`Tableau marks selected for ${_workbookName}/${_sheetName}` , {
                                    evt: 'tableauMarksSelection',
                                    marks: [ {..}, {..} ]
                                })

After using this in Elasticsearch for a while, it seemed that the marks field didn't really need to be an object; I could just stringify the array and that would be sufficient for us.

Now marks gets stringified like the following

logger.info(`Tableau marks selected for ${_workbookName}/${_sheetName}` , {
                                    evt: 'tableauMarksSelection',
                                    marks: JSON.stringify([ {..}, {..} ])
                                })

After doing so I noticed that this particular log would NEVER send. I started wrapping try/catch blocks around it: nothing. Wired up logger.on('error', ...): nothing.

I knew it was something in this library because my console logs were working fine, as was logging to file, so I decided to debug the winston-elasticsearch module. Stepping through, I finally crept up on the following code:

elasticsearch/src/lib/transport.js

function respond(err, body, status, headers) {
    if (aborted) {
      return;
    }

    self._timeout(requestTimeoutId);
    var parsedBody;
    var isJson = !headers || (headers['content-type'] && ~headers['content-type'].indexOf('application/json'));

    if (!err && body) {
      if (isJson) {
        parsedBody = self.serializer.deserialize(body);
        if (parsedBody == null) {
          err = new errors.Serialization();
          parsedBody = body;
        }
      } else {
        parsedBody = body;
      }
    }

    // does the response represent an error?
    if (
      (!err || err instanceof errors.Serialization)
      && (status < 200 || status >= 300)
      && (!params.ignore || !_.include(params.ignore, status))
    ) {

      var errorMetadata = _.pick(params.req, ['path', 'query', 'body']);
      errorMetadata.statusCode = status;
      errorMetadata.response = body;

      if (status === 401 && headers && headers['www-authenticate']) {
        errorMetadata.wwwAuthenticateDirective = headers['www-authenticate'];
      }

      if (errors[status]) {
        err = new errors[status](parsedBody && parsedBody.error, errorMetadata);
      } else {
        err = new errors.Generic('unknown error', errorMetadata);
      }
    }

This code handles many potential non-200 statuses, but in this case Elasticsearch sends back a 200 in the HTTP response while the body declares errors: true, with a status of 400 inside the individual index item response:

{
    "took": 3,
    "errors": true,
    "items": [
        {
            "index": {
                "_index": "logs-2017.11.21",
                "_type": "log",
                "_id": "AV_gQwfT3jrEIURbei0L",
                "status": 400,
                "error": {
                    "type": "mapper_parsing_exception",
                    "reason": "object mapping for [fields.marks] tried to parse field [marks] as object, but found a concrete value"
                }
            }
        }
    ]
}

Once I saw this body response, I was able to test simply by changing the marks field to tableauMarks:

logger.info(`Tableau marks selected for ${_workbookName}/${_sheetName}` , {
    evt: 'tableauMarksSelection',
    tableauMarks: JSON.stringify(pairs)
})

And then the logs started flowing into Elasticsearch again, and my hunt was over =)

I wanted to put this out here in case anyone else runs across it, as this error is hidden deep within and does not easily expose itself.

ElasticSearch: 5.6.3 (docker container)
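For anyone hitting the same wall, a sketch of surfacing these item-level failures directly with the elasticsearch.js client, since a bulk call can return HTTP 200 while individual items fail:

    client.bulk({ body }).then((response) => {
      if (response.errors) {
        response.items
          .filter((item) => item.index && item.index.error)
          .forEach((item) => {
            console.error(item.index.status, item.index.error.reason);
          });
      }
    });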

file path broken.

const rawdata = fs.readFileSync('index-template-mapping.json');

This line should be changed to:

const rawdata = fs.readFileSync(__dirname + '/index-template-mapping.json');

In order to actually reference the json file.

The call also provides no error handling, which can cause memory leaks and other nasties under heavy load, and will fail in later versions of Node.js.

Assign me if you want me to fix it.
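A sketch of the suggested change with basic error handling added (the fallback behavior is an assumption, not the module's current code):

    const fs = require('fs');
    const path = require('path');

    let template = null;
    try {
      const rawdata = fs.readFileSync(path.join(__dirname, 'index-template-mapping.json'));
      template = JSON.parse(rawdata);
    } catch (err) {
      // surface the problem instead of crashing on load; callers can fall back
      console.error('could not load index-template-mapping.json:', err.message);
    }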

Update element

Hello, first of all thanks for this useful open-source project.

In my project we use winston-elasticsearch for inserts, and we now want to update some documents. Is this possible with winston-elasticsearch?

Thanks

Real buffering of messages in case of unavailable ES

I've seen this listed as currently unsupported, but was hoping to discuss some ideas on how it could be implemented. I know we have lost messages due to network connectivity dropping and the lack of any kind of local queue to drain once connectivity resumes.

@vanthome, I was hoping to hear any thoughts you might have on implementing this type of feature.

Index name must have a prefix and suffix

I'd prefer my index to be just 'foo-bar' and not 'foo-bar-2015-10-30', which is consistent with how this worked before it was rewritten. Perhaps an exposed getIndexName hook is in order, so that one can do whatever they want? A sketch of what that could look like is below.
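A sketch of the hypothetical hook being requested here; this option does not exist in the module today:

    new Elasticsearch({
      // hypothetical: let the caller fully control the index name
      getIndexName: () => 'foo-bar'
    });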

unknown setting - index.cache.field.type

Hello,

I have set up Elasticsearch (6.2.4) and Kibana (6.2.4) in my local Docker installation; the Docker images for both were taken from the official Docker Hub site.

I have a Node.js application in which I am using winston (2.4.2), elasticsearch (14.2.2) and winston-elasticsearch (0.5.9) to try to log to my Elasticsearch setup.

Below is my code:
Initialize logger -------------------------------

const es = require("elasticsearch");
const Elasticsearch = require("winston-elasticsearch");
const winston = require("winston");

const esClient = new es.Client({
            host: "<ip address>:9200",
            log: "trace",
            index: "oversight",
            sniffOnStart: true
          });

        this.elasticLogger = new (winston.Logger)({
            levels: this.logLevels.levels,
            transports: [
                new Elasticsearch({ client: esClient, colorize: true  })
            ],
            exceptionHandlers: [
                new Elasticsearch({ client: esClient, colorize: true })
            ]
            exitOnError: false,
            handleExceptions: true
        });

        this.elasticLogger.on("error", (err) => {
            console.log(`Elastic Logger failed at ${new Date()} with error ${err} !`);
        });

I also have a transport for console logging, which is working fine, and the connection between the Elasticsearch and Kibana containers is also up and running.

When my application's Docker container is up and starts to log, the following appears on stdout:

Elasticsearch TRACE: 2018-04-25T14:43:36Z
  -> GET http://<ip address>:9200/_nodes/_all/http?filter_path=nodes.*.http.publish_address%2Cnodes.*.name%2Cnodes.*.hostname%2Cnodes.*.host%2Cnodes.*.version
  
  <- 200
  {
    "nodes": {
      "2knkp79ZS0W_iCTWiHaGxQ": {
        "name": "oversight_elasticsearch",
        "host": "<ip address>",
        "version": "6.2.4",
        "http": {
          "publish_address": "<ip address>:9200"
        }
      }
    }
  }
Elasticsearch TRACE: 2018-04-25T14:43:36Z
  -> HEAD http://<ip address>:9200/
  
  <- 200

But after some attempts to create the template_logs template, I get the following error:

Elasticsearch TRACE: 2018-04-25T14:43:36Z
  -> PUT http://<ip address>:9200/_template/template_logs?create=true
 { ... a whole bunch of default settings...}

  <- 400
  {
    "error": {
      "root_cause": [
        {
          "type": "illegal_argument_exception",
          "reason": "unknown setting [index.cache.field.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
        }
      ],
      "type": "illegal_argument_exception",
      "reason": "unknown setting [index.cache.field.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
    },
    "status": 400
  }

Searching the current GitHub issues and other pages, I found that index.cache.field.type has not been a valid setting since version 0.26.0.

Please guide me on how to proceed with the setup or this issue. Is there any additional setup or setting that I might have missed?

Any help is greatly appreciated!
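A hedged workaround: supply your own index template, with the removed index.cache.field.type setting deleted, via the module's documented mappingTemplate option (the file name below is illustrative):

    // my-index-template.json: a copy of the default template with the
    // obsolete index.cache.field.type setting removed
    const template = require('./my-index-template.json');

    new Elasticsearch({
      client: esClient,
      ensureMappingTemplate: true,
      mappingTemplate: template
    });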

TypeError: callback is not a function

I am getting this error when logging a message:

TypeError: callback is not a function

My code:

const winston = require("winston");
const logger = new winston.Logger();
...
if( process.env.ELASTIC_SEARCH_LOGGING_URL ){

  var elasticsearch = require('elasticsearch');
  var client = new elasticsearch.Client({
    host: process.env.ELASTIC_SEARCH_LOGGING_URL,
    log: 'info'
  }); 

  logger.add( 
    require('winston-elasticsearch'),
    {
      client
    }        
  );
}
//this causes the error
logger.info("hi")

I am seeing this:

clock_1     | TypeError: callback is not a function
clock_1     |     at Elasticsearch.log (/usr/app/node_modules/winston-elasticsearch/index.js:105:5)
clock_1     |     at transportLog (/usr/app/node_modules/winston/lib/winston/logger.js:234:15)
clock_1     |     at /usr/app/node_modules/winston/node_modules/async/lib/async.js:157:13

I use winston, winston-elasticsearch and elasticsearch. The ELASTIC_SEARCH_LOGGING_URL env variable is accurate.

Also posted it here:
https://stackoverflow.com/questions/52823107/winston-elasticsearch-creates-callback-error-when-logging

reconnect/retry

I wanted to verify if this is expected behavior. If my ES instance is off-line, logging stops as expected. However, once the ES instance is back on-line, logging does not resume. I have to restart the app to force a reconnection to ES and resume logging. Is this by design?

callback is not a function

I have a basic setup with:

new Elasticsearch({
    level: 'debug',
    index: 'api',
    clientOpts: {
      host: '192.168.254.25:9200'
    }
  })

Logs are coming into ES, but I'm seeing an infinite loop of:

api_1      | TypeError: callback is not a function
api_1       |     at bulk.forEach (/api/node_modules/winston-elasticsearch/bulk_writer.js:85:36)
api_1       |     at Array.forEach (<anonymous>)
api_1       |     at client.bulk.then (/api/node_modules/winston-elasticsearch/bulk_writer.js:85:10)
api_1       |     at <anonymous>
api_1       |     at process._tickDomainCallback (internal/process/next_tick.js:228:7)
api_1       | TypeError: callback is not a function
api_1       |     at bulk.forEach (/api/node_modules/winston-elasticsearch/bulk_writer.js:85:36)
api_1       |     at Array.forEach (<anonymous>)
api_1       |     at client.bulk.then (/api/node_modules/winston-elasticsearch/bulk_writer.js:85:10)
api_1       |     at <anonymous>
api_1       |     at process._tickDomainCallback (internal/process/next_tick.js:228:7)

Feels like I'm making a dumb newbie mistake here. Any pointers?
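The stack points at an unconditional callback invocation inside bulk_writer.js; a generic guard for such a spot (a sketch against reconstructed code, not the actual source) looks like:

    // only invoke the completion callback when one was actually supplied
    if (typeof callback === 'function') {
      callback(null);
    }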

Clarification for `buffering: false`

Hello!

First of all, I love the package and it has been very useful in my projects.
That said, I think the way buffering: false works is incorrect, or at least should be made clearer in the documentation.
In my opinion, when buffering is false the package should send every message, but what really happens is that the buffer is limited to a single message and the others that arrive at the same time are discarded.
So I hope it can either be modified to send every message, or documented more clearly.

Besides that, great package, great work and thank you.

unable to transform a field to ip or geo_point type

For a project I want to log an IP to Elasticsearch. I manage this by adding an ip field and a geo_point field at the root of the document, and I add a transformer to populate these fields, but when I look at the mapping reported by Elasticsearch, nothing is being applied.

Here's my mapping:

mapping.js:

export default {
  index_patterns: 'log-*',
  settings: {
    number_of_shards: 1,
    number_of_replicas: 0,
    index: {
      refresh_interval: '5s',
      cache: { 'field.type': 'soft' },
      query: { default_field: 'message' },
      store: { compress: { stored: true, tv: true } },
      routing: { 'allocation.total_shards_per_node': 1 },
    },
  },
  mappings: {
    _default_: {
      _all: { enabled: false, omit_norms: true },
      _source: { enabled: true },
      _ttl: { enabled: true, default: '900d' },
      dynamic_templates: [
        {
          string_fields: {
            match: '*',
            match_mapping_type: 'keyword',
            mapping: {
              type: 'text',
              index: true,
              omit_norms: true,
              fields: {
                raw: { type: 'keyword', index: true, ignore_above: 256 },
              },
            },
          },
        },
      ],
      properties: {
        '@timestamp': { type: 'date' },
        '@version': { type: 'keyword' },
        message: { type: 'text', index: true },
        severity: { type: 'keyword', index: true },
        ip: { type: 'ip', index: true },
        geocoding: { type: 'geo_point', index: true },
        fields: {
          type: 'object',
          dynamic: true,
        },
      },
    },
  },
};

Here are my transport opts:

import mapping from './elastic/mapping';

// ...

const config = {
  mappingTemplate: mapping,
  transformer: (logData) => {
    const transformed = transformer(logData);

    if (transformed.fields.latlng) {
      transformed.geocoding = transformed.fields.latlng; // already formatted
    }

    if (transformed.fields.ip) {
      transformed.ip = transformed.fields.ip; // already formatted
    }

    return transformed;
  },
};

And finally, the mapping reported by Elasticsearch:

{
  "logs-2018.04.26": {
    "mappings": {
      "log": {
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "fields": {
            "properties": {
              "ip": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "latlng": {
                "properties": {
                  "lat": {
                    "type": "float"
                  },
                  "lon": {
                    "type": "float"
                  }
                }
              },
              "level": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "message": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "method": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "response-time": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "socket": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "status": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "url": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "user-agent": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              }
            }
          },
          "geocoding": {
            "properties": {
              "lat": {
                "type": "float"
              },
              "lon": {
                "type": "float"
              }
            }
          },
          "ip": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "message": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "severity": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      }
    }
  }
}
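One hedged observation from the snippets above: the template declares index_patterns: 'log-*', while the index actually written is named logs-2018.04.26, so the template never matches and the dynamic string mappings win. A sketch of the fix, if that reading is right:

    export default {
      index_patterns: 'logs-*', // was 'log-*', which never matches 'logs-2018.04.26'
      // ...rest of the template unchanged
    };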
