
fluent-plugin-mongo's Introduction

MongoDB plugin for Fluentd

fluent-plugin-mongo provides input and output plugins for Fluentd.

Requirements

|fluent-plugin-mongo|   fluentd  |  ruby  |
|-------------------|------------|--------|
|     >= 1.0.0      | >= 0.14.12 | >= 2.1 |
|     <  1.0.0      | >= 0.12.0  | >= 1.9 |

Installation

Gems

The gem is hosted at RubyGems.org. You can install it as follows:

$ fluent-gem install fluent-plugin-mongo

Plugins

Output plugin

mongo

Stores Fluentd events in a MongoDB database.

Configuration

Use the mongo type in a match section.

<match mongo.**>
  @type mongo

  # You can choose one of two approaches: connection_string, or individual parameters
  # 1. connection_string for MongoDB URI
  connection_string mongodb://fluenter:10000/fluent

  # 2. specify each parameter
  database fluent
  host fluenter
  port 10000

  # collection name to insert
  collection test

  # Set 'user' and 'password' for authentication.
  # These options are ignored when the connection_string parameter is used.
  user handa
  password shinobu

  # Set 'capped' if you want to use capped collection
  capped
  capped_size 100m

  # Specify date fields in record to use MongoDB's Date object (Optional) default: nil
  # Supported data types are String/Integer/Float/Fluentd EventTime.
  # For Integer type, milliseconds epoch and seconds epoch are supported.
  # eg: updated_at: "2020-02-01T08:22:23.780Z" or updated_at: 1580546457010
  date_keys updated_at

  # Specify id fields in record to use MongoDB's BSON ObjectID (Optional) default: nil
  # eg: my_id: "507f1f77bcf86cd799439011"
  object_id_keys my_id

  # Other buffer configurations here
</match>

For the connection_string parameter, see docs.mongodb.com/manual/reference/connection-string/ for more detail.
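For illustration, the value conversion that date_keys performs can be sketched in Ruby. This is a hypothetical helper, not the plugin's internal code, and the milliseconds-vs-seconds heuristic is an assumption:

```ruby
require 'time'

# Hypothetical sketch of what date_keys does to a field value:
# turn a String, seconds epoch, or milliseconds epoch into a Time,
# which the MongoDB driver stores as a BSON Date.
def to_mongo_time(value)
  case value
  when String
    Time.parse(value)
  when Integer
    # Assumption: values wider than 10 digits are milliseconds epochs.
    value > 9_999_999_999 ? Time.at(value / 1000.0) : Time.at(value)
  when Float
    Time.at(value)
  else
    value
  end
end

record = { "updated_at" => "2020-02-01T08:22:23.780Z" }
record["updated_at"] = to_mongo_time(record["updated_at"])
```

Fluentd EventTime values, which the real plugin also supports, carry sub-second precision directly.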

built-in placeholders

fluent-plugin-mongo supports built-in placeholders. The database and collection parameters can handle them.

Here is an example to use built-in placeholders:

<match mongo.**>
  @type mongo

  database ${tag[0]}

  # collection name to insert
  collection ${tag[1]}-%Y%m%d

  # Other buffer configurations here
  <buffer tag, time>
    @type memory
    timekey 3600
  </buffer>
</match>

For more detail, please refer to the official documentation for built-in placeholders: docs.fluentd.org/v1.0/articles/buffer-section#placeholders

mongo(tag mapped mode)

Tags are mapped to MongoDB collections automatically.

Configuration

Use the tag_mapped parameter in a match section of the mongo type.

If the tag name is "foo.bar", the collection "foo.bar" is created automatically and data is inserted into it.

<match forward.*>
  @type mongo
  database fluent

  # Set 'tag_mapped' if you want to use tag mapped mode.
  tag_mapped

  # If tag is "forward.foo.bar", then prefix "forward." is removed.
  # Collection name to insert is "foo.bar".
  remove_tag_prefix forward.

  # This collection is used if the tag is not found. Default is 'untagged'.
  collection misc

  # Other configurations here
</match>
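The tag-to-collection mapping above can be sketched as follows (a hypothetical helper mirroring this configuration, not the plugin's code):

```ruby
# Mimic tag_mapped with remove_tag_prefix "forward." and the
# fallback collection "misc" from the configuration above.
def collection_for(tag, prefix: /\Aforward\./, fallback: "misc")
  name = tag.sub(prefix, "")
  name.empty? ? fallback : name
end

collection_for("forward.foo.bar")  # "foo.bar"
collection_for("other.tag")        # "other.tag" (prefix not present)
```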

mongo_replset

Replica Set version of mongo.

Configuration

v0.8 or later

<match mongo.**>
  @type mongo_replset
  database fluent
  collection logs

  nodes localhost:27017,localhost:27018

  # The replica set name
  replica_set myapp

  # num_retries is the retry threshold at failover; default is 60.
  # If the retry count reaches this threshold, the mongo plugin raises an exception.
  num_retries 30

  # The following optional parameters are passed to mongo-ruby-driver.
  # See mongo-ruby-driver docs for more detail: https://docs.mongodb.com/ruby-driver/master/tutorials/ruby-driver-create-client/
  # Specifies the read preference mode
  #read secondary
</match>
v0.7 or earlier

Use mongo_replset type in match.

<match mongo.**>
  @type mongo_replset
  database fluent
  collection logs

  # each node separated by ','
  nodes localhost:27017,localhost:27018,localhost:27019

  # The following optional parameters are passed to mongo-ruby-driver.
  #name replset_name
  #read secondary
  #refresh_mode sync
  #refresh_interval 60
  #num_retries 60
</match>

Input plugin

mongo_tail

Tails a capped collection to input data.

Configuration

Use the mongo_tail type in a source section.

<source>
  @type mongo_tail
  database fluent
  collection capped_log

  tag app.mongo_log

  # Waiting time when there is no next document. Default is 1s.
  wait_time 5

  # Convert 'time'(BSON's time) to fluent time(Unix time).
  time_key time

  # Convert ObjectId to string
  object_id_keys ["id_key"]
</source>

You can also use url to specify the database to connect to.

<source>
  @type mongo_tail
  url mongodb://user:password@192.168.0.13:10249,192.168.0.14:10249/database
  collection capped_log
  ...
</source>

This allows the plugin to read data from a replica set.

You can save the last ObjectId to a file, so that tailing can resume after the server shuts down.

<source>
  ...

  id_store_file /Users/repeatedly/devel/fluent-plugin-mongo/last_id
</source>

Alternatively, a Mongo collection can be used to keep the last ObjectId.

<source>
  ...

  id_store_collection last_id
</source>

Make sure the collection is capped. The plugin inserts records but never removes them, so an uncapped collection would grow without bound.
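If the collection does not exist yet, you can create it as capped up front, for example in the MongoDB shell (size is in bytes; 104857600 is 100MB):

```javascript
db.createCollection("capped_log", { capped: true, size: 104857600 })
```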

NOTE

replace_dot_in_key_with and replace_dollar_in_key_with

BSON records whose keys include '.' or start with '$' are invalid, and they will be stored as broken data in MongoDB. If you want to sanitize keys, you can use replace_dot_in_key_with and replace_dollar_in_key_with.

<match forward.*>
  ...
  # replace '.' in keys with '__dot__'
  replace_dot_in_key_with __dot__

  # replace '$' in keys with '__dollar__'
  # Note: This replaces '$' only on first character
  replace_dollar_in_key_with __dollar__
  ...
</match>
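The replacement can be sketched in Ruby (a hypothetical helper, not the plugin's implementation; note that the '$' rule applies only to the first character):

```ruby
# Sanitize a key the way the options above describe:
# every '.' is replaced, '$' only when it is the first character.
def sanitize_key(key, dot: "__dot__", dollar: "__dollar__")
  key.gsub(".", dot).sub(/\A\$/, dollar)
end

record = { "a.b" => 1, "$set" => 2, "plain" => 3 }
record.transform_keys { |k| sanitize_key(k) }
# => {"a__dot__b"=>1, "__dollar__set"=>2, "plain"=>3}
```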

Broken data as a BSON

NOTE: This feature will be removed in v0.8

A Fluentd event sometimes has a record that is invalid as BSON. In such a case, the Mongo plugin marshals the invalid record using Marshal.dump and re-inserts it into the same collection as binary data.

If the following invalid record is passed:

{"key1": "invalid value", "key2": "valid value", "time": ISODate("2012-01-15T21:09:53Z") }

then the Mongo plugin converts this record to the following format:

{"__broken_data": BinData(0, Marshal.dump result of {"key1": "invalid value", "key2": "valid value"}), "time": ISODate("2012-01-15T21:09:53Z") }

Mongo-Ruby-Driver cannot detect which attribute is invalid, so the Mongo plugin marshals all attributes except the Fluentd keys ("tag_key" and "time_key").

You can deserialize broken data using Mongo and Marshal.load. Sample code is below:

# _collection_ is an instance of Mongo::Collection
collection.find({'__broken_data' => {'$exists' => true}}).each do |doc|
  p Marshal.load(doc['__broken_data'].to_s) #=> {"key1": "invalid value", "key2": "valid value"}
end

ignore_invalid_record

If you want to ignore invalid records, set the ignore_invalid_record parameter to true in the match section.

<match forward.*>
  ...

  # ignore invalid documents at write operation
  ignore_invalid_record true

  ...
</match>

exclude_broken_fields

If you want to exclude some fields from broken data marshaling, use exclude_broken_fields to specify the keys.

<match forward.*>
  ...

  # key2 is excluded from __broken_data.
  # e.g. {"__broken_data": BinData(0, Marshal.dump result of {"key1": "invalid value"}), "key2": "valid value", "time": ISODate("2012-01-15T21:09:53Z")
  exclude_broken_fields key2

  ...
</match>

The value is a comma-separated list of keys (e.g. key1,key2,key3). This parameter is useful for excluding shard keys in a sharded environment.

Buffer size limitation

The Mongo plugin limits its buffer size, because MongoDB and mongo-ruby-driver check the total object size at each insertion. If the total object size exceeds the limit, MongoDB returns an error or mongo-ruby-driver raises an exception.

So the Mongo plugin resets buffer_chunk_limit if the configured value is larger than this limitation:

  • Before v1.8, max of buffer_chunk_limit is 2MB

  • After v1.8, max of buffer_chunk_limit is 8MB
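With the v1 buffer section syntax, you can also keep chunks under the limit explicitly, e.g. (the chunk_limit_size parameter name is from current Fluentd; adjust to your version):

```
<match mongo.**>
  @type mongo
  database fluent
  collection test

  <buffer>
    chunk_limit_size 8m
  </buffer>
</match>
```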

Tool

You can tail a mongo capped collection:

$ mongo-tail -f

Test

Run the following command:

$ bundle exec rake test

You can use the 'mongod' environment variable to specify the mongod binary:

$ mongod=/path/to/mongod bundle exec rake test

Note that the source code in test/tools is from mongo-ruby-driver.

Copyright

Copyright © 2011- Masahiro Nakagawa

License

Apache License, Version 2.0

fluent-plugin-mongo's People

Contributors

aferreira, ashie, castor91, cherrot, cosmo0920, dependabot[bot], fetaro, hotchpotch, kailashyogeshwar85, kiyoto, mcvulin, okkez, repeatedly, ryotarai, shunwen, soutaro, tagomoris

fluent-plugin-mongo's Issues

Capped collection not created

When using v0.8.0 the capped collection config is not applied.
Here's the config we use:

  <match docker.*>
    type mongo_replset
    tag_mapped
    remove_tag_prefix docker.
    flush_interval 10s
    max_retry_wait 60s
    host myhost.example.com
    replica_set rs0
    user logs
    password xxx
    database logs
    capped
    capped_size 2097152
  </match>

The resulting collection is not capped.

shard_key option is needed for broken data in shard environment.

out_mongo now works fine on mongod and mongos.
But invalid record handling still has a problem.

In shard environment, converting all fields into __broken_data is wrong.
Because mongos requires the shard key to insert a document.

So we should add shard_key option to mongo plugins.
If shard_key is specified, mongo plugins exclude the shard key from __broken_data.

fluent-plugin-mongo crash issue

In my project, I use Fluentd to aggregate log data. After a few hours it dies.
here is my configuration file:

pos_file /home/emacle/tmp/pos_fil.conf
tag platform
type mytail
time_format %Y-%m-%D %H:%M:%s,%L
path /home/emacle/services/user_platform/tongbupan/logs/tongbupan.log

pos_file /home/emacle/tmp/pos_fil.conf
tag apps
type mytail
time_format %Y-%m-%D %H:%M:%s,%L
path /mnt/logs/backend/infolog.log

pos_file /home/emacle/tmp/pos_fil_front.conf
tag platform_front
type logtail
time_format %Y-%m-%D %H:%M:%s,%L
path /home/emacle/services/user_platform/tongbupan/logs/tongbupan.log

pos_file /home/emacle/tmp/pos_fil_front.conf
tag apps_front
type logtail
time_format %Y-%m-%D %H:%M:%s,%L
path /mnt/logs/backend/infolog.log

type mongo
database emacledatabeta
collection logs
ignore_invalid_record true
host 10.200.2.208
port 11811
utc false

type mongo
database emacledatabeta
collection logs
ignore_invalid_record true
host 10.200.2.208
port 11811
utc false

type mongo
database emaclelogbeta
collection infolog
ignore_invalid_record true
host 10.200.2.208
port 11811
utc false

type mongo
database emaclelogbeta
collection infolog
ignore_invalid_record true
host 10.200.2.208
port 11811
utc false

after about 4 or 5 hours it will die
exception stack is following:
2012-12-18 18:29:05 +0800: restarting
2012-12-18 18:29:05 +0800: shutting down fluentd
2012-12-18 18:29:05 +0800: process finished code=0
2012-12-18 18:29:05 +0800: starting fluentd-0.10.27
2012-12-18 18:29:05 +0800: reading config file path="/etc/fluent/fluent.conf"
2012-12-18 18:29:05 +0800: adding source type="mytail"
2012-12-18 18:29:05 +0800: adding source type="mytail"
2012-12-18 18:29:05 +0800: adding source type="logtail"
2012-12-18 18:29:05 +0800: adding source type="logtail"
2012-12-18 18:29:05 +0800: adding source type="login"
2012-12-18 18:29:05 +0800: adding source type="login"
2012-12-18 18:29:05 +0800: adding match pattern="platform" type="mongo"
2012-12-18 18:29:05 +0800: unexpected error error="Input/output error - "
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/bson-1.6.4/lib/bson.rb:82:in `write'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/bson-1.6.4/lib/bson.rb:82:in `warn'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/bson-1.6.4/lib/bson.rb:82:in `rescue in <top (required)>'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/bson-1.6.4/lib/bson.rb:62:in `<top (required)>'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/1.9.1/rubygems/custom_require.rb:36:in `require'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/1.9.1/rubygems/custom_require.rb:36:in `require'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/mongo-1.6.4/lib/mongo.rb:53:in `<top (required)>'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/1.9.1/rubygems/custom_require.rb:36:in `require'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/1.9.1/rubygems/custom_require.rb:36:in `require'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.6.9/lib/fluent/plugin/out_mongo.rb:32:in `initialize'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/plugin.rb:99:in `new'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/plugin.rb:99:in `new_impl'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/plugin.rb:45:in `new_output'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/engine.rb:83:in `block in configure'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/engine.rb:73:in `each'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/engine.rb:73:in `configure'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/engine.rb:51:in `parse_config'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/supervisor.rb:247:in `run_configure'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/supervisor.rb:88:in `block in start'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/supervisor.rb:180:in `call'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/supervisor.rb:180:in `main_process'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/supervisor.rb:155:in `block in supervise'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/supervisor.rb:154:in `fork'
2012-12-18 18:29:05 +0800: /usr/local/lib/ruby/gems/1.9.1/gems/fluentd-0.10.27/lib/fluent/supervisor.rb:154:in `supervise'

2012-12-18 18:29:05 +0800: process finished code=256
2012-12-18 18:29:05 +0800: process died within 1 second. exit.

I have checked the source file of the mongo plugin. Line 32 is "require 'mongo'".
I have no idea why this happened.

my fluentd mongo plugin is 0.6.7
fluentd is 0.10.25
my ruby is 1.9.3

need your help

Data is not coming into MongoDb

Hi!
I am trying to insert logs, which are in JSON format, into MongoDB.
I am running Fluentd with the configuration file below. The DB is created in MongoDB but no data arrives.
Could you please help me with this.

Thanks,
Praveen.

td-agent.conf.txt

error="Unknown output plugin 'mongo'."

OS : windows 10
version : td-agent-3.0.1

command

fluentd -c C:/opt/td-agent/etc/td-agent/td-agent.conf

error : error_class=Fluent::ConfigError error="Unknown output plugin 'mongo'. Run 'gem search -rd fluent-plugin' to find plugins"

command

fluent-gem install fluent-plugin-mongo

C:\opt\td-agent\embedded\bin>fluent-gem install fluent-plugin-mongo
WARN: Unresolved specs during Gem::Specification.reset:
      msgpack (< 2.0.0, >= 0.7.0)
      serverengine (< 3.0.0, >= 2.0.4)
      tzinfo (~> 1.0)
      tzinfo-data (~> 1.0)
      ffi (>= 0)
      win32-api (>= 1.4.5)
WARN: Clearing out unresolved specs.
Please report a bug if this causes problems.
Building native extensions.  This could take a while...
ERROR:  Error installing fluent-plugin-mongo:
        ERROR: Failed to build gem native extension.

    current directory: C:/opt/td-agent/embedded/lib/ruby/gems/2.3.0/gems/bson-4.
3.0/ext/bson
C:/opt/td-agent/embedded/bin/ruby.exe -r ./siteconf20180405-4432-lg8nfr.rb extco
nf.rb
creating Makefile

current directory: C:/opt/td-agent/embedded/lib/ruby/gems/2.3.0/gems/bson-4.3.0/
ext/bson
make "DESTDIR=" clean
'make' is not recognized as an internal or external command,
operable program or batch file.

current directory: C:/opt/td-agent/embedded/lib/ruby/gems/2.3.0/gems/bson-4.3.0/
ext/bson
make "DESTDIR="
'make' is not recognized as an internal or external command,
operable program or batch file.

make failed, exit code 1

Gem files will remain installed in C:/opt/td-agent/embedded/lib/ruby/gems/2.3.0/
gems/bson-4.3.0 for inspection.
Results logged to C:/opt/td-agent/embedded/lib/ruby/gems/2.3.0/extensions/x64-mi
ngw32/2.3.0/bson-4.3.0/gem_make.out

C:\opt\td-agent\embedded\bin>

I installed the devkit (rubyinstaller-devkit-2.4.4) for this, but it is still not working.

BSON or JSON

Hey guys,

I found an issue with fluent-plugin-mongo, when I send a BSON format hash, this one would not get to MongoDB:
{10 => "blah"}

However if I change the key to a string, then it works:
{"10" => "blah"}

If fluent-plugin-mongo should accept BSON, then there is probably a bug; if it should not accept BSON, then it should raise an error.

Let me know if you need more info.

Cheers!
Sylvain

mongo_tail error on slow log collection system.profiling

2014-01-10 11:53:37 +0700 [warn]: emit transaction failed error_class=NoMethodError error=#<NoMethodError: undefined method `to_msgpack' for 2014-01-10 03:14:13 UTC:Time>
2014-01-10 11:53:37 +0700 [warn]: /etc/td-agent/plugin/out_elasticsearch.rb:33:in `to_msgpack'
2014-01-10 11:53:37 +0700 [warn]: /etc/td-agent/plugin/out_elasticsearch.rb:33:in `to_msgpack'
2014-01-10 11:53:37 +0700 [warn]: /etc/td-agent/plugin/out_elasticsearch.rb:33:in `format'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/mixin.rb:96:in `block in format_stream'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/event.rb:54:in `call'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/event.rb:54:in `each'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/mixin.rb:93:in `format_stream'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/output.rb:235:in `emit'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/match.rb:36:in `emit'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/engine.rb:151:in `emit_stream'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.41/lib/fluent/engine.rb:131:in `emit'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.7.1/lib/fluent/plugin/in_mongo_tail.rb:105:in `block in tailoop'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.7.1/lib/fluent/plugin/in_mongo_tail.rb:83:in `loop'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.7.1/lib/fluent/plugin/in_mongo_tail.rb:83:in `tailoop'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.7.1/lib/fluent/plugin/in_mongo_tail.rb:60:in `block in run'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.7.1/lib/fluent/plugin/in_mongo_tail.rb:59:in `loop'
2014-01-10 11:53:37 +0700 [warn]: /usr/lib64/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.7.1/lib/fluent/plugin/in_mongo_tail.rb:59:in `run'

Dynamic Collection Name From Record Field

Hi,
I think it would be great if the tag_mapped property could use any field of the record. For example, I would name my collections by the AppId on my event.
There are some plugins which allow me to rename the tag, but those plugins don't seem to expose Ruby capabilities, so I couldn't access the record like record["AppId"].
Writing a hard-coded tag name wouldn't help me, because new applications are added to the system daily.
Is this an issue, or could you direct me to another approach?
Any help will be appreciated.

Compatible with fluentd 0.12.5

I'm using fluentd 0.12.5 but fluent-plugin-mongo requires fluentd 0.10.
I have manually modified the gemspec from "fluentd", "~> 0.10.9" to "fluentd", ">= 0.10.9". It seems that the plugin works just fine.
Are there any compatibility problems with the newer fluentd that I don't know about?

error_class="Mongo::ConnectionTimeoutError"

Hi, my conf file is like:

<match fx*>
  # plugin type
  type mongo

  # mongodb db + collection
  database bigdata
  collection log_store

  # mongodb host + port
  host 127.0.0.1
  port 27017

  # interval
  flush_interval 10s

  # buffer_queue_limit
  buffer_queue_limit 256mb
  buffer_chunk_limit 8388608
  num_threads 3

  # make sure to include the time key
  include_time_key true
</match>

And when I check the logs, it gives me this error:

2015-08-31 09:16:33 +0800 [warn]: temporarily failed to flush the buffer. next_retry=2015-08-31 09:16:29 +0800 error_class="Mongo::ConnectionTimeoutError" error="could not obtain connection within 5.0 seconds. The max pool size is currently 1; consider increasing the pool size or timeout." plugin_id="object:3f9ca6a16080"
  2015-08-31 09:16:33 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.2/lib/mongo/connection/pool.rb:261:in `block in checkout'
  2015-08-31 09:16:33 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.2/lib/mongo/connection/pool.rb:259:in `loop'
  2015-08-31 09:16:33 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.2/lib/mongo/connection/pool.rb:259:in `checkout'

Can you give me some suggestions??
Thank you very much!

time => timestamp

How can I change time to timestamp when writing to MongoDB?

from:
"time": ISODate("2014-04-21T03:06:57.0Z")

to:
"timestamp": ISODate("2014-04-21T03:06:57.0Z")

thanks

failed to flush the buffer - Mongo_replset does not reconnect to replica set

We're using td-agent 2.3.1 and we are getting the following error messages from time to time.
After the error message the output plugin out_mongo_replset does not reconnect to the replica set and stays in this stuck state until it's restarted. No new messages are saved in MongoDB.

td-agent[18558]: 2016-09-24 22:02:21 +0000 [warn]: Failed to connect to Replica Set. Try to retry: retry number = 1
td-agent[18558]: 2016-09-24 22:02:21 +0000 [warn]: retry succeeded. plugin_id="object:153dc74"
td-agent[18558]: 2016-09-24 22:09:52 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2016-09-24 22:09:33 +0000 error_class="NameError" error="uninitialized constant IRB::Abort" plugin_id="object:153dc74"
td-agent[18558]: 2016-09-24 22:09:52 +0000 [warn]: suppressed same stacktrace
td-agent[18558]: 2016-09-24 22:10:20 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2016-09-24 22:09:35 +0000 error_class="NameError" error="uninitialized constant IRB::Abort" plugin_id="object:153dc74"
td-agent[18558]: 2016-09-24 22:10:20 +0000 [warn]: suppressed same stacktrace
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2016-09-24 22:09:39 +0000 error_class="Mongo::OperationTimeout" error="Timed out waiting on socket read." plugin_id="object:153dc74"
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/networking.rb:336:in `rescue in receive_message_on_socket'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/networking.rb:330:in `receive_message_on_socket'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/networking.rb:191:in `receive_header'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/networking.rb:182:in `receive'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/functional/authentication.rb:446:in `auth_command'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/functional/authentication.rb:409:in `issue_scram'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/functional/authentication.rb:225:in `issue_authentication'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/connection/pool.rb:321:in `block in check_auths'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/2.1.0/set.rb:263:in `each_key'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/2.1.0/set.rb:263:in `each'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/connection/pool.rb:320:in `check_auths'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/connection/pool.rb:285:in `block (2 levels) in checkout'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/connection/pool.rb:267:in `synchronize'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/connection/pool.rb:267:in `block in checkout'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/connection/pool.rb:260:in `loop'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/connection/pool.rb:260:in `checkout'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/mongo_replica_set_client.rb:413:in `get_socket_from_pool'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/mongo_replica_set_client.rb:371:in `block in checkout_reader'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/mongo_replica_set_client.rb:354:in `checkout'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/mongo_replica_set_client.rb:369:in `checkout_reader'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/cursor.rb:623:in `checkout_socket_from_connection'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/cursor.rb:550:in `block in send_initial_query'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/functional/logging.rb:55:in `block in instrument'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/functional/logging.rb:20:in `instrument'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/functional/logging.rb:54:in `instrument'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/cursor.rb:547:in `send_initial_query'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/cursor.rb:532:in `refresh'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/cursor.rb:139:in `next'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/db.rb:607:in `command'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/collection_writer.rb:353:in `block in batch_message_send'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/functional/logging.rb:55:in `block in instrument'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/functional/logging.rb:20:in `instrument'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/functional/logging.rb:54:in `instrument'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/collection_writer.rb:352:in `batch_message_send'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/collection_writer.rb:82:in `block in batch_write_incremental'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/collection_writer.rb:57:in `catch'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/collection_writer.rb:57:in `batch_write_incremental'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/collection.rb:1184:in `batch_write'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/mongo-1.12.5/lib/mongo/collection.rb:411:in `insert'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-mongo-0.7.12/lib/fluent/plugin/out_mongo.rb:154:in `operate'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-mongo-0.7.12/lib/fluent/plugin/out_mongo_replset.rb:45:in `block in operate'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-mongo-0.7.12/lib/fluent/plugin/out_mongo_replset.rb:61:in `rescue_connection_failure'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-mongo-0.7.12/lib/fluent/plugin/out_mongo_replset.rb:44:in `operate'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-mongo-0.7.12/lib/fluent/plugin/out_mongo.rb:133:in `write'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.20/lib/fluent/buffer.rb:345:in `write_chunk'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.20/lib/fluent/buffer.rb:324:in `pop'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.20/lib/fluent/output.rb:329:in `try_flush'
td-agent[18558]: 2016-09-24 22:10:40 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.20/lib/fluent/output.rb:140:in `run'
systemd[1]: Stopping td-agent...

Convert numbers in hash keys to strings

Sometimes we see a "keys must be strings or symbols" error in the fluentd logs. It is caused by numeric keys in the hashes we need to log.

It would be great to have an option that converts any numeric keys in hashes to strings, to avoid problems with BSON serialization.
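A minimal sketch of such a conversion as a standalone Ruby helper (hypothetical; not part of the plugin):

```ruby
# Recursively convert non-string hash keys to strings so BSON
# serialization does not reject them. Hypothetical helper, not
# part of fluent-plugin-mongo.
def stringify_keys(obj)
  case obj
  when Hash
    obj.each_with_object({}) { |(k, v), h| h[k.to_s] = stringify_keys(v) }
  when Array
    obj.map { |v| stringify_keys(v) }
  else
    obj
  end
end

stringify_keys(1 => { 2 => "a" })  # => {"1" => {"2" => "a"}}
```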

Problem installing mongo plugin on a Windows machine

Hi!
I want to use fluent-plugin-mongo to push events to MongoDB.
I have installed fluentd on a Windows machine. It works fine.

I want to use the mongo plugin, but it is not installed.
The website says to run "fluent-gem install fluent-plugin-mongo" to install it.
When I run this I get an error:
'make' is not recognized as an internal or external command,
operable program or batch file.

Neither fluentd nor the Ruby DevKit includes a make command.
I installed GnuWin32 to get make.

I then get this error:

make "DESTDIR="
generating bson_native-x64-mingw32.def
make: *** No rule to make target `/C/opt/td-agent/embedded/include/ruby-2.4.0/ruby.h', needed by 
`bson_native.o'.  Stop.

What can I do to install and use the mongo plugin in fluentd?

Thanks
Andreas

Can't save time field with sub-second

I can't save the time field with sub-second precision (milliseconds).

I found that the MongoOutput#format method has a problem.

    def format(tag, time, record)
      [time, record].to_msgpack
    end

The type of the 'time' argument is Fluent::EventTime, which carries a sub-second field.
But to_msgpack can't handle this correctly; it seems to be encoded as a Fixnum, so the sub-second part is lost.
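A sketch of one possible fix: convert the time to a Float before encoding, so the fractional part survives. EventTime is stubbed with a Struct here; the real class lives in fluentd.

```ruby
# Stand-in for Fluent::EventTime, which exposes seconds and nanoseconds.
EventTime = Struct.new(:sec, :nsec)

# Combine the two fields into a Float so sub-second precision is kept
# through serialization (at the cost of Float rounding for some values).
def to_float_time(time)
  time.sec + time.nsec / 1_000_000_000.0
end

t = EventTime.new(1_500_000_000, 500_000_000)
to_float_time(t)  # => 1500000000.5
```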

undefined method `time_key' for nil:NilClass

I cannot insert into our MongoDB with 1.0.0.rc2.
fluentd v0.14.21 printed the following:

failed to flush the buffer. retry_time=0 next_retry_seconds=2017-09-20 13:07:21 +0000 chunk="5599e28e651d49684934c52d2a6eec30" error_class=NoMethodError error="undefined method `time_key' for nil:NilClass"
/usr/lib/ruby/gems/2.3.0/gems/fluent-plugin-mongo-1.0.0.rc2/lib/fluent/plugin/out_mongo.rb:194:in `collect_records'
/usr/lib/ruby/gems/2.3.0/gems/fluent-plugin-mongo-1.0.0.rc2/lib/fluent/plugin/out_mongo.rb:176:in `write'
/usr/lib/ruby/gems/2.3.0/gems/fluentd-0.14.21/lib/fluent/plugin/output.rb:1061:in `try_flush'
/usr/lib/ruby/gems/2.3.0/gems/fluentd-0.14.21/lib/fluent/plugin/output.rb:1286:in `flush_thread_run'
/usr/lib/ruby/gems/2.3.0/gems/fluentd-0.14.21/lib/fluent/plugin/output.rb:438:in `block (2 levels) in start'
/usr/lib/ruby/gems/2.3.0/gems/fluentd-0.14.21/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'

Am I doing something wrong in my settings?

<match app_log>
    @type mongo
    connection_string "mongodb://#{ENV['FLUENT_MONGO_NODES']}/app"
    collection app_log

    capped
    capped_size 1024m

    <buffer>
      @type file
      path "#{ENV['FLUENT_LOG_DIR']}/buffer/mongo-app_log.*"
      flush_interval 10s
    </buffer>
  </match>

P.S.
Looking at out_mongo.rb:194, @inject_config may be intended only for testing, but it is not checked for existence.
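If the crash really comes from @inject_config being nil when no inject settings are given (an untested guess), declaring an explicit `<inject>` section in the match block might work around it until the plugin guards against nil. A hypothetical sketch:

```
<match app_log>
  @type mongo
  connection_string "mongodb://#{ENV['FLUENT_MONGO_NODES']}/app"
  collection app_log

  # hypothetical workaround: define the inject section explicitly
  # so @inject_config is populated
  <inject>
    time_key time
  </inject>
</match>
```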

MongoDB and the Ruby driver have a size limit on insert operations.

Fluentd has a buffer, so Fluentd stores some objects in its buffer.
MongoDB or the Ruby driver raises an "Exceded maximum insert size of 16,000,000 bytes" exception when
the buffered size exceeds the insert operation's size limit.

We can check the MongoDB version and cap Fluentd's buffer size at that version's maximum insert size.
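The capping idea can be sketched in plain Ruby; the 16 MB fallback and the notion of a server-reported maximum are assumptions about the server's limits, not the plugin's actual implementation:

```ruby
# Historical default maximum BSON document/insert size (16 MB).
DEFAULT_MAX_INSERT = 16 * 1024 * 1024

# Clamp the configured buffer chunk limit to the maximum insert size
# reported by the server (nil when the server did not report one).
def safe_chunk_limit(server_max_size, configured_limit)
  [configured_limit, server_max_size || DEFAULT_MAX_INSERT].min
end

safe_chunk_limit(16_000_000, 64 * 1024 * 1024)  # => 16000000
```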

fluentd stopped logging due to outdated position file

I just had a customer case where fluentd stopped sending MongoDB logs. Removing the mongodb.pos file solved the issue.

I believe fluentd (or this plugin?) somehow did not notice that the MongoDB log file was rotated. I do not know enough about fluentd or MongoDB to suggest a mitigation strategy.

I Wanna Use Mongo Driver 1.12

Hi, I have question.

I want to use MongoDB 3.0 because a customer requires it.

But this plugin depends on Mongo Driver 1.9:

@fluent-plugin-mongo.gemspec
gem.add_dependency "mongo", "~> 1.9"

If this plugin supported Mongo Driver 1.12, I could use MongoDB 3.0
(as specified on the official MongoDB site).

Would there be any problems if I used this plugin with Mongo Driver 1.12?

Can't get log data

Hey guys, could you please let me know what is wrong with the following configuration? Thank you in advance.

Use case

I want to use fluentd to output log data into MongoDB.

Configuration

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match log.*>
  @type stdout
</match>

# Single MongoDB
<match log.**>
  @type mongo
  database logger
  collection label

  # Set 'tag_mapped' if you want to use tag mapped mode.
  tag_mapped

  # for capped collection
  capped
  capped_size 256m

  # key name of timestamp
  time_key timestamp

  # flush
  flush_interval 10s
</match>

Test

I used a script provided by fluent-logger-node:

'use strict'
var log4js = require('log4js');
log4js.addAppender(require('fluent-logger').support.log4jsAppender('log', {
   host: 'localhost',
   port: 24224,
   timeout: 3.0,
   levelTag: true
}));

var logger = log4js.getLogger('foo');

function emit(){
    logger.info('this log record is sent to fluent daemon');
}


setInterval(emit, 1000)

Expected result

Having log records in collection label of database logger

Actual result

> use logger
switched to db logger
> db.label.find()

date is always rewritten with current timestamp

I have the following fluentd config:

<source>
    type tail
    path /tmp/alog.json
    pos_file /tmp/alog.json.pos
    tag alog
    format json
  </source>
  <match *>
    type copy
    <store>
      type stdout
    </store>
    <store>
      @type mongo
      host localhost
      port 27017
      database test
      collection audit_log
      time_key eventdate
      flush_interval 10s
    </store>
  </match>

I parse the following log file:
{"user": "user9","message": "Log in","domain": "customer","eventdate" : "2016-01-01T12:10:30Z"}

However, when I check MongoDB, the record always has the date of submission to MongoDB, not the date from the logfile:

{ "_id" : ObjectId("5745baca175ed22164000031"), "user" : "user9", "message" : "Log in", "domain" : "customer", "eventdate" : ISODate("2016-05-25T14:46:31Z") }
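For reference, newer plugin releases document a date_keys parameter (see the README above) that converts String date fields in the record into MongoDB Date objects. A sketch, assuming a release that supports it:

```
<store>
  @type mongo
  host localhost
  port 27017
  database test
  collection audit_log
  # convert the eventdate string field into a MongoDB Date object
  date_keys eventdate
  flush_interval 10s
</store>
```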

how to support multiprocess

Hi,
I tried to modify the code to support multiprocess for the mongo plugin. Below is the code:
def start
  get_or_create_collection(@collection) unless @tag_mapped
  @buffer.buffer_chunk_limit = available_buffer_chunk_limit
  detach_multi_process do
    super
  end
end
But it does not work. Below is the error information:
2015-07-09 12:45:45 +0800 [warn]: failed to emit error="undefined method `next' for nil:NilClass" pid=11030
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/buffer.rb:184:in `block in emit'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/2.1.0/monitor.rb:211:in `mon_synchronize'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/buffer.rb:179:in `emit'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/output.rb:251:in `emit'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-mongo-0.7.8/lib/fluent/plugin/out_mongo.rb:116:in `emit'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/process.rb:152:in `block in output_forward_main'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/process.rb:205:in `call'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/process.rb:205:in `block in read_event_stream'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/process.rb:202:in `each'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/process.rb:202:in `read_event_stream'
2015-07-09 12:45:45 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/process.rb:147:in `output_forward_main'

Ruby's Date objects are not stored properly

When you pass Ruby's date-related objects (such as Date, Time, DateTime) to Fluent::Logger#post, out_mongo stores them as plain strings in MongoDB.
I think they should be stored in ISODate("...") format, the way the usual timestamps are stored.


My code:
Fluent::Logger.post('mytag', {text: 'foo', created_at: DateTime.now})

Results in:

{
  "_id" : ObjectId("50b37f022ed6f3145700005d"),
  "text" : "foo",
  "created_at" : "2012-11-26T23:38:51+09:00",
  "time" : ISODate("2012-11-26T14:38:51Z")
}

Is this fluent-plugin-mongo's problem, or a to_msgpack-related problem? I'm not quite sure, but it must be related to the serialization process.
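A workaround sketch, assuming Time values serialize the way the plugin's own "time" field does: convert Date/DateTime values to Time before posting.

```ruby
require 'date'

# Workaround sketch: DateTime tends to be stored as a plain string,
# while Time values end up as ISODate, so convert before posting.
record = { text: 'foo', created_at: DateTime.now }
record[:created_at] = record[:created_at].to_time

record[:created_at].class  # => Time
```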

mongo_replset unexpectedly requires 'host' in configuration

According to README.rdoc, the configuration of "mongo_replset" is defined as follows:

<match mongo.**>
  type mongo_replset
  database fluent
  collection logs

  # each node separated by ','
  nodes localhost:27017,localhost:27018,localhost:27019
  ...

In reality, it doesn't work as I expect. Given the configuration like this:

<match mongo.**>
  type mongo_replset
  database fluent
  collection logs
  nodes mg1:27017,mg2:27017,mg3:27017
  ...

I got the following message:

# fluentd -v
2012-02-23 15:35:55 +0900: fluent/supervisor.rb:143:supervise: starting fluentd-0.10.13
...
2012-02-23 15:35:55 +0900: fluent/engine.rb:79:block in configure: adding match pattern="**" type="mongo_replset"
2012-02-23 15:35:55 +0900: plugin/out_mongo.rb:186:rescue in available_buffer_chunk_limit: Failed to connect to 'mongod'. Please restart 'fluentd' after 'mongod' started: Failed to connect to a master node at localhost:27017

I needed to add "host" in addition to "nodes":

<match mongo.**>
  type mongo_replset
  database fluent
  collection logs
  host mg1             # THIS LINE REQUIRED!
  nodes mg1:27017,mg2:27017,mg3:27017
  ...

I suppose mongo_replset should refer to "nodes" instead of "host".

Cheers,

Insert a record (ISODate) to MongoDB

What could I do differently to achieve the target?

data (csv)

Leonardo,2015-09-02 12:36:15:
Michelangelo,2015-09-03 06:23:11:

configure

<source>
  type tail
  tag mongo
  path /home/vagrant/fluentd/data/*
  format csv
  keys user,date
  types user:string, date:string
  pos_file /home/vagrant/fluentd/pos_file/temp.pos  
</source>

<match forward1.**>
...

  type mongo
  host localhost
  port 27017
  database test
  collection foo
</match>

MongoDB Query Document
current

db.foo.find()
{ "_id" : ..., "user" : "Leonardo", "date" : "2015-09-02 12:36:15:" , "time" : ISODate("2015-09-17T05:42:15Z")}
{ "_id" : ..., "user" : "Michelangelo", "date" : "2015-09-03 06:23:11:", "time" : ISODate("2015-09-17T05:42:15Z") }

target

db.foo.find()
{ "_id" : ..., "user" : "Leonardo", "date" : ISODate("2015-09-02T12:36:15Z") , "time" : ISODate("2015-09-17T05:42:15Z")}
{ "_id" : ..., "user" : "Michelangelo", "date" : ISODate("2015-09-03T06:23:11Z"), "time" : ISODate("2015-09-17T05:42:15Z") }

version mismatch with mongo-ruby-driver 2.0.x

It seems that the API calls in fluent-plugin-mongo do not match the latest mongo-ruby-driver (2.0.1).

For example, the client-creation code in out_mongo.rb is:

db = Mongo::MongoClient.new(@host, @port, @connection_options).db(@database)

However, the example for creating a client in the driver's documentation is:

client = Mongo::Client.new([ '127.0.0.1:27017' ], :database => 'music')

As a result, fluent-plugin-mongo raises the following error with mongo-ruby-driver 2.0.1:

2015-03-30 15:22:40 +0000 [error]: unexpected error error_class=NameError error=#<NameError: uninitialized constant Mongo::MongoClient>
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluent-plugin-mongo-0.7.8/lib/fluent/plugin/out_mongo.rb:215:in `get_connection'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluent-plugin-mongo-0.7.8/lib/fluent/plugin/out_mongo.rb:192:in `get_or_create_collection'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluent-plugin-mongo-0.7.8/lib/fluent/plugin/out_mongo.rb:87:in `start'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/agent.rb:67:in `block in start'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/agent.rb:66:in `each'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/agent.rb:66:in `start'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/root_agent.rb:104:in `start'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/engine.rb:201:in `start'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/engine.rb:151:in `run'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/supervisor.rb:481:in `run_engine'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/supervisor.rb:140:in `block in start'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/supervisor.rb:266:in `call'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/supervisor.rb:266:in `main_process'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/supervisor.rb:241:in `block in supervise'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/supervisor.rb:240:in `fork'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/supervisor.rb:240:in `supervise'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/supervisor.rb:134:in `start'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/lib/fluent/command/fluentd.rb:167:in `<top (required)>'
  2015-03-30 15:22:40 +0000 [error]: /usr/lib/ruby/2.2.0/rubygems/core_ext/kernel_require.rb:54:in `require'
  2015-03-30 15:22:40 +0000 [error]: /usr/lib/ruby/2.2.0/rubygems/core_ext/kernel_require.rb:54:in `require'
  2015-03-30 15:22:40 +0000 [error]: /var/lib/gems/2.2.0/gems/fluentd-0.12.7/bin/fluentd:6:in `<top (required)>'
  2015-03-30 15:22:40 +0000 [error]: /usr/local/bin/fluentd:23:in `load'
  2015-03-30 15:22:40 +0000 [error]: /usr/local/bin/fluentd:23:in `<main>'

I'm not an expert in these, so I'm sorry if I missed something.

Thanks.

fluent-plugin-mongo fails when trying to use mongo authentication

When trying to use fluent-plugin-mongo against a server that has authentication enabled, the plugin throws an error. This appears to be related to a call to serverStatus(), which is an admin command.

root@fluent-vm:~# fluentd -vv -c fluent.conf
...
2012-11-30 01:04:17 -0800: plugin/out_mongo.rb:206:rescue in available_buffer_chunk_limit: Operation failed. Probably, 'mongod' needs an authentication: Database command 'serverStatus' failed: (host: 'fluent-vm'; version: '2.2.2'; process: 'mongod'; pid: '13058'; uptime: '42.0'; uptimeMillis: '41408'; uptimeEstimate: '38.0'; localTime: '2012-11-30 09:04:17 UTC'; locks: '{"."=>{"timeLockedMicros"=>{"R"=>2804, "W"=>16228}, "timeAcquiringMicros"=>{"R"=>2621, "W"=>298}}, "admin"=>{"timeLockedMicros"=>{"r"=>146, "w"=>0}, "timeAcquiringMicros"=>{"r"=>10, "w"=>0}}, "local"=>{"timeLockedMicros"=>{"r"=>4, "w"=>0}, "timeAcquiringMicros"=>{"r"=>2, "w"=>0}}, "fluent_test"=>{"timeLockedMicros"=>{"r"=>169, "w"=>0}, "timeAcquiringMicros"=>{"r"=>10, "w"=>0}}}'; globalLock: '{"totalTime"=>41408000, "lockTime"=>16228, "currentQueue"=>{"total"=>0, "readers"=>0, "writers"=>0}, "activeClients"=>{"total"=>0, "readers"=>0, "writers"=>0}}'; mem: '{"bits"=>64, "resident"=>63, "virtual"=>450, "supported"=>true, "mapped"=>160, "mappedWithJournal"=>320}'; connections: '{"current"=>2, "available"=>19998}'; extra_info: '{"note"=>"fields vary by platform", "heap_usage_bytes"=>25612864, "page_faults"=>53}'; indexCounters: '{"btree"=>{"accesses"=>0, "hits"=>0, "misses"=>0, "resets"=>0, "missRatio"=>0.0}}'; backgroundFlushing: '{"flushes"=>0, "total_ms"=>0, "average_ms"=>0.0, "last_ms"=>0, "last_finished"=>1970-01-01 00:00:00 UTC}'; cursors: '{"totalOpen"=>0, "clientCursors_size"=>0, "timedOut"=>0}'; network: '{"bytesIn"=>1213, "bytesOut"=>2992, "numRequests"=>13}'; opcounters: '{"insert"=>0, "query"=>2, "update"=>0, "delete"=>0, "getmore"=>0, "command"=>14}'; asserts: '{"regular"=>0, "warning"=>0, "msg"=>0, "user"=>1, "rollovers"=>0}'; writeBacksQueued: 'false'; dur: '{"commits"=>29, "journaledMB"=>0.0, "writeToDataFilesMB"=>0.0, "compression"=>0.0, "commitsInWriteLock"=>0, "earlyCommits"=>0, "timeMs"=>{"dt"=>3001, "prepLogBuffer"=>0, "writeToJournal"=>0, "writeToDataFiles"=>0, 
"remapPrivateView"=>0}}'; recordStats: '{"accessesNotInMemory"=>0, "pageFaultExceptionsThrown"=>0}'; errmsg: 'exception: unauthorized db:admin ns:admin lock type:0 client:127.0.0.1'; code: '10057'; ok: '0.0').
2012-11-30 01:04:17 -0800: fluent/supervisor.rb:170:supervise: process finished code=256
2012-11-30 01:04:17 -0800: fluent/supervisor.rb:173:supervise: process died within 1 second. exit.

To reproduce:

Set up a simple config file:

root@fluent-vm:~# cat fluent.conf 
<match fluent_test.**>
  type mongo

  database fluent_test
  host 127.0.0.1
  user fluent_test
  password fluent_test

  tag_mapped
</match>

Create a user account in mongo:

root@fluent-vm:~# mongo
MongoDB shell version: 2.2.2
connecting to: test
> use admin
switched to db admin
> db.auth('admin', <password>)
1
> use fluent_test
switched to db fluent_test
> db.addUser('fluent_test', 'fluent_test')
{
    "user" : "fluent_test",
    "readOnly" : false,
    "pwd" : "e6ff6cbb78c8b716951541ae986ebb86",
    "_id" : ObjectId("50b871ed1600948cabadf0c1")
}
> show users
{
    "_id" : ObjectId("50b871ed1600948cabadf0c1"),
    "user" : "fluent_test",
    "readOnly" : false,
    "pwd" : "e6ff6cbb78c8b716951541ae986ebb86"
}

Run fluent:

root@fluent-vm:~# fluentd -vv -c fluent.conf
2012-11-30 01:04:17 -0800: fluent/supervisor.rb:153:supervise: starting fluentd-0.10.29
2012-11-30 01:04:17 -0800: fluent/supervisor.rb:235:read_config: reading config file path="fluent.conf"
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered buffer plugin 'file'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered buffer plugin 'memory'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'debug_agent'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'exec'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'forward'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'gc_stat'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'http'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'object_space'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'status'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'tcp'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'unix'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'syslog'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered input plugin 'tail'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'copy'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'exec'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'exec_filter'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'file'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'forward'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'null'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'roundrobin'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'stdout'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'tcp'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'unix'
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'test'
2012-11-30 01:04:17 -0800: fluent/engine.rb:81:block in configure: adding match pattern="fluent_test.**" type="mongo"
2012-11-30 01:04:17 -0800: fluent/plugin.rb:89:register_impl: registered output plugin 'mongo'

**Notice: C extension not loaded. This is required for optimum MongoDB Ruby driver performance.
  You can install the extension as follows:
  gem install bson_ext

  If you continue to receive this message after installing, make sure that the
  bson_ext gem is in your load path and that the bson_ext and mongo gems are of the same version.

2012-11-30 01:04:17 -0800: plugin/out_mongo.rb:70:configure: Setup mongo configuration: mode = tag mapped
2012-11-30 01:04:17 -0800: plugin/out_mongo.rb:206:rescue in available_buffer_chunk_limit: Operation failed. Probably, 'mongod' needs an authentication: Database command 'serverStatus' failed: (host: 'fluent-vm'; version: '2.2.2'; process: 'mongod'; pid: '13058'; uptime: '42.0'; uptimeMillis: '41408'; uptimeEstimate: '38.0'; localTime: '2012-11-30 09:04:17 UTC'; locks: '{"."=>{"timeLockedMicros"=>{"R"=>2804, "W"=>16228}, "timeAcquiringMicros"=>{"R"=>2621, "W"=>298}}, "admin"=>{"timeLockedMicros"=>{"r"=>146, "w"=>0}, "timeAcquiringMicros"=>{"r"=>10, "w"=>0}}, "local"=>{"timeLockedMicros"=>{"r"=>4, "w"=>0}, "timeAcquiringMicros"=>{"r"=>2, "w"=>0}}, "fluent_test"=>{"timeLockedMicros"=>{"r"=>169, "w"=>0}, "timeAcquiringMicros"=>{"r"=>10, "w"=>0}}}'; globalLock: '{"totalTime"=>41408000, "lockTime"=>16228, "currentQueue"=>{"total"=>0, "readers"=>0, "writers"=>0}, "activeClients"=>{"total"=>0, "readers"=>0, "writers"=>0}}'; mem: '{"bits"=>64, "resident"=>63, "virtual"=>450, "supported"=>true, "mapped"=>160, "mappedWithJournal"=>320}'; connections: '{"current"=>2, "available"=>19998}'; extra_info: '{"note"=>"fields vary by platform", "heap_usage_bytes"=>25612864, "page_faults"=>53}'; indexCounters: '{"btree"=>{"accesses"=>0, "hits"=>0, "misses"=>0, "resets"=>0, "missRatio"=>0.0}}'; backgroundFlushing: '{"flushes"=>0, "total_ms"=>0, "average_ms"=>0.0, "last_ms"=>0, "last_finished"=>1970-01-01 00:00:00 UTC}'; cursors: '{"totalOpen"=>0, "clientCursors_size"=>0, "timedOut"=>0}'; network: '{"bytesIn"=>1213, "bytesOut"=>2992, "numRequests"=>13}'; opcounters: '{"insert"=>0, "query"=>2, "update"=>0, "delete"=>0, "getmore"=>0, "command"=>14}'; asserts: '{"regular"=>0, "warning"=>0, "msg"=>0, "user"=>1, "rollovers"=>0}'; writeBacksQueued: 'false'; dur: '{"commits"=>29, "journaledMB"=>0.0, "writeToDataFilesMB"=>0.0, "compression"=>0.0, "commitsInWriteLock"=>0, "earlyCommits"=>0, "timeMs"=>{"dt"=>3001, "prepLogBuffer"=>0, "writeToJournal"=>0, "writeToDataFiles"=>0, 
"remapPrivateView"=>0}}'; recordStats: '{"accessesNotInMemory"=>0, "pageFaultExceptionsThrown"=>0}'; errmsg: 'exception: unauthorized db:admin ns:admin lock type:0 client:127.0.0.1'; code: '10057'; ok: '0.0').
2012-11-30 01:04:17 -0800: fluent/supervisor.rb:170:supervise: process finished code=256
2012-11-30 01:04:17 -0800: fluent/supervisor.rb:173:supervise: process died within 1 second. exit.

Notice that an exception is thrown at plugin/out_mongo.rb:206

Collection created is not capped

This is the fluentd config I have:

<match mongo.**>
  type mongo

  database fluentd

  # tags are collections
  tag_mapped

  capped
  capped_size 1024m

  remove_tag_prefix mongo.

  host 127.0.0.1
  port 27017

  flush_interval 10s

  include_time_key true
</match>

In mongo shell:

> db.dev.apache.access.isCapped()
false
> db.version()
3.2.4

It would seem the collection is not capped as it should be. I tried dropping the database and restarting fluentd; same result.

error="undefined method `to_msgpack'

Problem(s)

2016-08-11 21:49:35 +0000 [warn]: emit transaction failed: error_class=NoMethodError error="undefined method `to_msgpack' for BSON::ObjectId('572a626be5f72f000600000c'):BSON::ObjectId" tag="ast13-ast-1"
2016-08-11 21:49:35 +0000 [warn]: /usr/lib/ruby/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.5.0/lib/fluent/plugin/out_elasticsearch.rb:174:in `to_msgpack'
2016-08-11 21:49:35 +0000 [warn]: /usr/lib/ruby/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.5.0/lib/fluent/plugin/out_elasticsearch.rb:174:in `format'

Steps to replicate

  <source>
    type mongo_tail
    url mongodb://mongodb/grid_server
    collection container_logs
    tag_key name
    time_key created_at
    id_store_collection container_logs_tail
  </source>
  <match **>
    type elasticsearch
    logstash_format true
    host elasticsearchhost.example
    port 9200
    index_name fluentd
    type_name fluentd
    include_tag_key true
  </match>

Upgrade Mongo Ruby driver to 2.x series

Problems when using Ruby Mongo Driver 2.1 series in fluent-plugin-mongo

How To Upgrade

  • Insert records one by one (become slower)
  • Use bulk insert and ignore writing errors
  • Use bulk insert but detect writing errors by fluent-plugin-mongo side

Which upgrading strategy is better?

Benchmarks

Cited from okkez#1
Benchmarking code: https://gist.github.com/cosmo0920/b33c96e071c5c6923304

With Mongo Driver 1.12.3

insert one by one (legacy)
       user     system      total        real
   2.880000   0.530000   3.410000 (  5.365804)
bulk insert (legacy)
       user     system      total        real
   0.200000   0.010000   0.210000 (  0.314387)

With Mongo Driver 2.1.2

insert one by one
       user     system      total        real
   4.670000   0.860000   5.530000 (  8.229355)
bulk insert
       user     system      total        real
   0.150000   0.000000   0.150000 (  0.238776)

Insert in mongos : operate_invalid_records

All went ok after this patch #22 (comment)
Until I get another error :

2013-01-15 17:49:50 +0100: temporarily failed to flush the buffer, next retry will be at 2013-01-15 17:49:43 +0100. error="String not valid UTF-8" instance=70329051781420
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/bson-1.6.4/lib/bson/bson_c.rb:25:in `serialize'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/bson-1.6.4/lib/bson/bson_c.rb:25:in `serialize'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/mongo-1.6.4/lib/mongo/collection.rb:972:in `block in insert_documents'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/mongo-1.6.4/lib/mongo/collection.rb:971:in `each'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/mongo-1.6.4/lib/mongo/collection.rb:971:in `insert_documents'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/mongo-1.6.4/lib/mongo/collection.rb:353:in `insert'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.6.11/lib/fluent/plugin/out_mongo.rb:152:in `operate_invalid_records'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.6.11/lib/fluent/plugin/out_mongo.rb:124:in `operate'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluent-plugin-mongo-0.6.11/lib/fluent/plugin/out_mongo.rb:112:in `write'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.30/lib/fluent/buffer.rb:279:in `write_chunk'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.30/lib/fluent/buffer.rb:263:in `pop'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.30/lib/fluent/output.rb:303:in `try_flush'
  2013-01-15 17:49:50 +0100: /usr/lib/fluent/ruby/lib/ruby/gems/1.9.1/gems/fluentd-0.10.30/lib/fluent/output.rb:120:in `run'

It appears sometimes while inserting and makes fluentd crash when the exclude_broken_fields option is set.
I didn't find a way to dump the insert requests that contain invalid UTF-8.
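A sketch of a possible sanitization step before records reach BSON, assuming String#scrub (Ruby >= 2.1) is acceptable; this is not what the plugin does, just an illustration:

```ruby
# Replace invalid UTF-8 byte sequences with '?' so BSON serialization
# does not raise "String not valid UTF-8". Hypothetical helper.
def sanitize_utf8(str)
  str.dup.force_encoding(Encoding::UTF_8).scrub('?')
end

sanitize_utf8("ok\xFFbroken")  # => "ok?broken"
```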

no such file to load -- fluent/plugin/mongo_util (LoadError)

root@linux:/var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/plugin# fluentd -v
2012-11-04 00:55:41 +0800: fluent/supervisor.rb:153:supervise: starting fluentd-0.10.28
2012-11-04 00:55:41 +0800: fluent/supervisor.rb:235:read_config: reading config file path="/etc/fluent/fluent.conf"
2012-11-04 00:55:41 +0800: fluent/engine.rb:65:block in configure: adding source type="forward"
2012-11-04 00:55:41 +0800: fluent/engine.rb:81:block in configure: adding match pattern="ff.file" type="mongo"
/var/lib/gems/1.9.2/gems/fluent-plugin-mongo-0.6.10/lib/fluent/plugin/out_mongo.rb:7:in `require': no such file to load -- fluent/plugin/mongo_util (LoadError)
from /var/lib/gems/1.9.2/gems/fluent-plugin-mongo-0.6.10/lib/fluent/plugin/out_mongo.rb:7:in `<class:MongoOutput>'
from /var/lib/gems/1.9.2/gems/fluent-plugin-mongo-0.6.10/lib/fluent/plugin/out_mongo.rb:4:in `<module:Fluent>'
from /var/lib/gems/1.9.2/gems/fluent-plugin-mongo-0.6.10/lib/fluent/plugin/out_mongo.rb:1:in `<top (required)>'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/plugin.rb:152:in `require'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/plugin.rb:152:in `block in try_load_plugin'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/plugin.rb:149:in `reverse_each'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/plugin.rb:149:in `try_load_plugin'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/plugin.rb:97:in `new_impl'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/plugin.rb:45:in `new_output'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/engine.rb:83:in `block in configure'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/engine.rb:73:in `each'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/engine.rb:73:in `configure'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/engine.rb:51:in `parse_config'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/supervisor.rb:247:in `run_configure'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/supervisor.rb:88:in `block in start'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/supervisor.rb:180:in `call'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/supervisor.rb:180:in `main_process'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/supervisor.rb:155:in `block in supervise'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/supervisor.rb:154:in `fork'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/supervisor.rb:154:in `supervise'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/supervisor.rb:83:in `start'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/lib/fluent/command/fluentd.rb:129:in `<top (required)>'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/bin/fluentd:6:in `require'
from /var/lib/gems/1.9.2/gems/fluentd-0.10.28/bin/fluentd:6:in `<top (required)>'
from /etc/alternatives/gem-bin/fluentd:19:in `load'

gem install failure with 0.6.6

$ sudo gem install -v 0.6.5 fluent-plugin-mongo
Successfully installed fluent-plugin-mongo-0.6.5
1 gem installed
Installing ri documentation for fluent-plugin-mongo-0.6.5...
Installing RDoc documentation for fluent-plugin-mongo-0.6.5...

$ sudo gem install -v 0.6.6 fluent-plugin-mongo
ERROR: While executing gem ... (NoMethodError)
undefined method `call' for nil:NilClass

Any idea?

number was saved as String

If I send a log from a Java client like this:

dataLogger.log("access", new HashMap<String, Object>() {{
            put("uri", request.getRequestURI());
            put("timestamp", System.currentTimeMillis());
        }});

The "timestamp" field's value in MongoDB is actually saved as a String instead of a Number. Is there any option to store it as a Number?
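As a workaround on the Fluentd side, string-encoded numbers can be coerced back before the record reaches MongoDB. A minimal Ruby sketch, not part of the plugin (the field name and value are just examples):

```ruby
# Coerce a numeric-looking string to Integer or Float; leave anything else untouched.
def coerce_number(value)
  return value unless value.is_a?(String)
  Integer(value)
rescue ArgumentError
  begin
    Float(value)
  rescue ArgumentError
    value
  end
end

record = { "uri" => "/index.html", "timestamp" => "1417702717332" }
record["timestamp"] = coerce_number(record["timestamp"])
# record["timestamp"] is now the Integer 1417702717332
```

The same kind of conversion could be applied in a filter plugin before the mongo output runs.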

"journaled" write concern support

Is the "journaled" write concern option supported? (http://docs.mongodb.org/manual/reference/write-concern/#j-option). After looking at the code, it doesn't seem to be; I just want to confirm.

When looking at this line (https://github.com/fluent/fluent-plugin-mongo/blob/master/lib/fluent/plugin/out_mongo.rb#L72), shouldn't @connection_options[:w] be @connection_options[:write]? (https://github.com/mongodb/mongo-ruby-driver/blob/master/lib/mongo/client.rb#L111)

Thanks for your time

log tail CLI

To tail the logs stored in MongoDB, it would be nice to have a command line interface.

getting Failed to connect to any given member

I'm trying to have fluentd forward data to a MongoDB replica set. The configuration was working until I attempted to add the mongo_replset store.

Here's the relevant config part:

<match log.**>
  type copy
  <store>
    type mongo
    database dbname
    collection fluentd
    host localhost
    port 27017
    flush_interval 10s
  </store>
  <store>
    type http
    endpoint_url  http://url.com:port/
    http_method     post
    serializer      json
    rate_limit_msec 100
  </store>
  <store>
    type mongo_replset
    database dbname
    collection fluentd
    nodes xx:yy:zz:xx:27999
  </store>
  <store>
    type stdout
  </store>
</match>

Enabling the 3rd <store> block causes the following error to appear in td-agent.log:

2014-12-04 13:26:58 +0100 [error]: unexpected error error_class=Mongo::ConnectionFailure error=#<Mongo::ConnectionFailure: Failed to connect to any given member.>

I believe that the replica set is properly connected, as I can connect to it from the fluentd server using mongo --host xx:yy:zz:xx --port 27999.

configure of output plugins raises an exception when mongod is down

Since version 0.5.0, output plugins check the MongoDB server version to set :buffer_chunk_limit.
The mongod_version method tries to connect to the target mongod server,
but raises an exception when that server is down.

For safe output operation, the configure method should rescue the exception and
set :buffer_chunk_limit to 2MB when mongod_version raises.
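The suggested fallback could look like this sketch (mongod_version here is a stand-in that simulates an unreachable server; the real plugin's probe and version comparison may differ):

```ruby
TWO_MB = 2 * 1024 * 1024

# Stand-in for the plugin's server probe; simulates mongod being down.
def mongod_version
  raise "connection refused"
end

# Fall back to a conservative 2MB chunk limit when the probe fails.
def safe_buffer_chunk_limit
  version = mongod_version
  version >= "1.8" ? 8 * 1024 * 1024 : TWO_MB
rescue StandardError
  TWO_MB
end
```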

Replica connection

When I'm starting fluent with mongo_replset type, it gives me an error:

config error file="fluent.conf" error_class=Fluent::ConfigError error="'replica_set' parameter is required"

It's clear from the source that the nodes parameter was replaced by the replica_set parameter, which relies on the host and port parameters inherited from out_mongo, but this is not clear from the documentation.

Another question: what should I give as the host/port values if I have more than one replica member and can't be sure that every node will be available all the time?
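For reference, a mongo_replset store with every member listed might look like the sketch below (parameter names depend on the plugin version in use; hosts are placeholders):

<store>
  type mongo_replset
  database dbname
  collection fluentd
  nodes host1:27999,host2:27999,host3:27999
  # newer versions use replica_set <name> together with host/port instead of nodes
</store>

Listing all members lets the driver fail over when one of them is unavailable.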

Plugin inserts null values into array

This is my fluentd.conf file:

<source>
  @type tcp
  tag tcp.events
  format json
  port 24224
</source>

<match **>
  @type mongo
  host mongodb
  port 27017
  database logs
  collection logs
  replace_dot_in_key_with _
</match>

When I'm trying to insert a record such as this:

{"host":"ABC5","timestamp":"147051740.871","level":"1","arrayName":["8", "9"],"thread":"MessageBroker-3","containerId":"?"}\n (this is the payload as a Java string)

I always get an array named "arrayName" containing two empty (null) values. Maybe the plugin does not support MongoDB arrays?
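One plausible culprit is the key-replacement pass triggered by replace_dot_in_key_with. A sketch of how dot replacement can recurse through a record while preserving array elements (illustrative only, not the plugin's actual implementation):

```ruby
# Replace dots in hash keys recursively, leaving array values intact.
def replace_dot_in_keys(value, replacement = "_")
  case value
  when Hash
    value.each_with_object({}) do |(k, v), out|
      out[k.to_s.tr(".", replacement)] = replace_dot_in_keys(v, replacement)
    end
  when Array
    value.map { |v| replace_dot_in_keys(v, replacement) }
  else
    value
  end
end

record = { "host" => "ABC5", "array.name" => ["8", "9"] }
replace_dot_in_keys(record)
# => { "host" => "ABC5", "array_name" => ["8", "9"] }
```

If the plugin's equivalent pass mishandles the Array branch, array elements could come out as nulls.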

Fluent keeps way too few records

Here's my conf:

<source>
  type tail
  path /var/lib/docker/containers/*/*-json.log
  pos_file /var/log/fluentd-docker.pos
  time_format %Y-%m-%dT%H:%M:%S
  tag docker.*
  format json
</source>

<match docker.var.lib.docker.containers.*.*.log>
  type record_reformer
  container_id ${tag_parts[5]}
  tag docker.all
</match>

<match docker.all>

  # plugin type
  type mongo

  # mongodb db + collection
  database main
  collection fluents

  # for capped collection
  capped
  capped_size 5120m

  # json
  format json

  # interval
  flush_interval 2s

  # make sure to include the time key
  include_time_key true

  host xxx
  port xxx

</match>

Running two consecutive queries:

> db.fluents.find()
{ "_id" : ObjectId("54a2df06f058ac000f001488"), "log" : "154.122.124.96 - - [30/Dec/2014:17:21:10 +0000] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36\" \"-\"\n", "stream" : "stdout", "container_id" : "e0b87ff04c2999d0a70e1d623b0f1f0293d3430486544d7319e7fa667555a268", "time" : ISODate("2014-12-30T17:21:10Z") }
{ "_id" : ObjectId("54a2df08f058ac00260000b5"), "log" : "154.122.124.96 - - [30/Dec/2014:17:21:10 +0000] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36\" \"-\"\n", "stream" : "stdout", "container_id" : "e0b87ff04c2999d0a70e1d623b0f1f0293d3430486544d7319e7fa667555a268", "time" : ISODate("2014-12-30T17:21:10Z") }
{ "_id" : ObjectId("54a2df08f058ac00260000b6"), "log" : "2014/12/30 17:21:11 [error] 7#0: *30 open() \"/usr/share/nginx/html/favicon.ico\" failed (2: No such file or directory), client: 154.122.124.96, server: localhost, request: \"GET /favicon.ico HTTP/1.1\", host: \"31.3.0.19:43306\"\n", "stream" : "stderr", "container_id" : "e0b87ff04c2999d0a70e1d623b0f1f0293d3430486544d7319e7fa667555a268", "time" : ISODate("2014-12-30T17:21:11Z") }
{ "_id" : ObjectId("54a2df08f058ac00260000b7"), "log" : "154.122.124.96 - - [30/Dec/2014:17:21:11 +0000] \"GET /favicon.ico HTTP/1.1\" 404 570 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36\" \"-\"\n", "stream" : "stdout", "container_id" : "e0b87ff04c2999d0a70e1d623b0f1f0293d3430486544d7319e7fa667555a268", "time" : ISODate("2014-12-30T17:21:11Z") }
{ "_id" : ObjectId("54a2df09f058ac000f001489"), "log" : "2014/12/30 17:21:11 [error] 7#0: *30 open() \"/usr/share/nginx/html/favicon.ico\" failed (2: No such file or directory), client: 154.122.124.96, server: localhost, request: \"GET /favicon.ico HTTP/1.1\", host: \"31.3.0.19:43306\"\n", "stream" : "stderr", "container_id" : "e0b87ff04c2999d0a70e1d623b0f1f0293d3430486544d7319e7fa667555a268", "time" : ISODate("2014-12-30T17:21:11Z") }
{ "_id" : ObjectId("54a2df09f058ac000f00148a"), "log" : "154.122.124.96 - - [30/Dec/2014:17:21:11 +0000] \"GET /favicon.ico HTTP/1.1\" 404 570 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36\" \"-\"\n", "stream" : "stdout", "container_id" : "e0b87ff04c2999d0a70e1d623b0f1f0293d3430486544d7319e7fa667555a268", "time" : ISODate("2014-12-30T17:21:11Z") }
{ "_id" : ObjectId("54a2df4bf058ac000f00148b"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon] mem (MB) res:65 virt:543\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4bf058ac000f00148c"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon]  mapped (incl journal view):320\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4bf058ac000f00148d"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon]  connections:25\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4df058ac00260000b8"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon] mem (MB) res:65 virt:543\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4df058ac00260000b9"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon]  mapped (incl journal view):320\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4df058ac00260000ba"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon]  connections:25\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
> db.fluents.find()
{ "_id" : ObjectId("54a2df09f058ac000f001489"), "log" : "2014/12/30 17:21:11 [error] 7#0: *30 open() \"/usr/share/nginx/html/favicon.ico\" failed (2: No such file or directory), client: 154.122.124.96, server: localhost, request: \"GET /favicon.ico HTTP/1.1\", host: \"31.3.0.19:43306\"\n", "stream" : "stderr", "container_id" : "e0b87ff04c2999d0a70e1d623b0f1f0293d3430486544d7319e7fa667555a268", "time" : ISODate("2014-12-30T17:21:11Z") }
{ "_id" : ObjectId("54a2df09f058ac000f00148a"), "log" : "154.122.124.96 - - [30/Dec/2014:17:21:11 +0000] \"GET /favicon.ico HTTP/1.1\" 404 570 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36\" \"-\"\n", "stream" : "stdout", "container_id" : "e0b87ff04c2999d0a70e1d623b0f1f0293d3430486544d7319e7fa667555a268", "time" : ISODate("2014-12-30T17:21:11Z") }
{ "_id" : ObjectId("54a2df4bf058ac000f00148b"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon] mem (MB) res:65 virt:543\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4bf058ac000f00148c"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon]  mapped (incl journal view):320\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4bf058ac000f00148d"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon]  connections:25\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4df058ac00260000b8"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon] mem (MB) res:65 virt:543\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4df058ac00260000b9"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon]  mapped (incl journal view):320\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2df4df058ac00260000ba"), "log" : "2014-12-30T17:22:19.113+0000 [clientcursormon]  connections:25\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:22:19Z") }
{ "_id" : ObjectId("54a2e077f058ac000f00148e"), "log" : "2014-12-30T17:27:19.169+0000 [clientcursormon] mem (MB) res:65 virt:543\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:27:19Z") }
{ "_id" : ObjectId("54a2e077f058ac000f00148f"), "log" : "2014-12-30T17:27:19.169+0000 [clientcursormon]  mapped (incl journal view):320\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:27:19Z") }
{ "_id" : ObjectId("54a2e077f058ac000f001490"), "log" : "2014-12-30T17:27:19.169+0000 [clientcursormon]  connections:25\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:27:19Z") }
{ "_id" : ObjectId("54a2e079f058ac00260000bb"), "log" : "2014-12-30T17:27:19.169+0000 [clientcursormon] mem (MB) res:65 virt:543\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:27:19Z") }
{ "_id" : ObjectId("54a2e079f058ac00260000bc"), "log" : "2014-12-30T17:27:19.169+0000 [clientcursormon]  mapped (incl journal view):320\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:27:19Z") }
{ "_id" : ObjectId("54a2e079f058ac00260000bd"), "log" : "2014-12-30T17:27:19.169+0000 [clientcursormon]  connections:25\n", "stream" : "stdout", "container_id" : "3fba10c1a8e54fdb66e69d6c03147fcf40f9871bede827bc42e0eef0918c9634", "time" : ISODate("2014-12-30T17:27:19Z") }

Any idea why? Disk size is not an issue. Thanks in advance.

Misc for database

It would be great if there were a possibility to dynamically select the database, the way it is already possible for the collection.
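With recent plugin versions, built-in placeholders cover the database parameter as well as the collection, e.g. (the tag layout here is hypothetical):

<match mongo.**>
  @type mongo
  database fluent_${tag[1]}
  collection ${tag[2]}
  <buffer tag>
    @type memory
  </buffer>
</match>

Placeholders are resolved per buffer chunk, so tag must be listed as a buffer chunk key.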

Federico

Failed to insert large amount of data in a single time

Hello.
I tried to insert a large amount of data at once using your plugin (the data is just a large array), but it didn't work: what gets stored is not the original data but some binary data instead. However, I can insert the same data successfully with the official Mongo PHP extension on my local machine.
I have gone through your code and I suspect it fails around line 152 of "out_mongo.rb".
How can I use the plugin successfully? Looking forward to your reply. Thank you.

connection_string does not allow srv type uri

As you know, MongoDB offers two kinds of URI:
the first starts with 'mongodb://', and the other with 'mongodb+srv://'.
The connection_string parameter accepts only the first; when given an srv-type connection string, it raises an Invalid URI error.
But the mongo Ruby driver this library uses already supports srv-type URIs, so this library could accept both simply by changing the mongo driver version or the way the Mongo client is built (maybe around the line below).

Mongo::Client.new(@connection_string)

The error message is below:

=Mongo::Error::InvalidURI error="Bad URI: mongodb+srv://~~~~~~~\nInvalid scheme. Scheme must be 'mongodb://'\nMongoDB URI must be in the following format: mongodb://[username:password@]host1[:port1][,host2[:port2],...[,hostN[:portN]]][/[database][?options]]\nPlease see the following URL for more information: http://docs.mongodb.org/manual/reference/connection-string/\n"
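The rejection is just a scheme check before the URI reaches the driver; a relaxed check could accept both schemes, as in this sketch (illustrative only, not the plugin's actual validation code):

```ruby
# Accept both the standard and the DNS-seed-list MongoDB URI schemes.
def valid_mongo_uri?(uri)
  uri.start_with?("mongodb://", "mongodb+srv://")
end
```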

Fluentd Not Sending Data to MongoDB

I am trying to send the parsed data from Fluentd to MongoDB. My system configuration is as below:

(screenshot: fluentd-system-configuration)

And MongoDB plugin: fluent-plugin-mongo | 1.1.0

My data is being parsed but I am not able to send it to Mongo. Here is my config file:

<match *>
  @type mongo
  host 127.0.0.1
  port 27017
  user tti-prod
  password mypassword
  database fluent-testrun
  collection requests
  capped
  capped_size 100m
</match>

<source>
  @type tail
  path /home/my-app/logs/%Y/%b/app.log
  tag request.main
  format /^\[(?<time>[^\]]*)\] (?<ip>[^ ]*) (?<method>\w*) (?<url>[^ ]*) (?<format>[^ ]*) (?<size>\d*) (?<agent>[^ ]*) (?<status_code>\d*) (?<duration>\d*)$/
  time_format %Y-%m-%d %H:%M:%S 
  pos_file /tmp/fluentd--1516870436.pos
</source>

Can anyone explain this? There are no error logs at all.
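One way to narrow this down is to test the tail regexp outside Fluentd; if it never matches, nothing is emitted and no error is logged. A quick sketch (the sample line below is made up to fit the pattern, not real data):

```ruby
pattern = /^\[(?<time>[^\]]*)\] (?<ip>[^ ]*) (?<method>\w*) (?<url>[^ ]*) (?<format>[^ ]*) (?<size>\d*) (?<agent>[^ ]*) (?<status_code>\d*) (?<duration>\d*)$/
line = "[2018-01-25 10:15:30] 10.0.0.1 GET /requests json 123 curl/7.58.0 200 15"
m = pattern.match(line)
# m is nil when the pattern does not match; inspect the named captures otherwise.
```

Running this against a real line from app.log would show whether the format or the time_format is the problem.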
