
logstash-output-mongodb's Introduction

Logstash Plugin

Travis Build Status

This is a plugin for Logstash.

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will first be converted into asciidoc and then into HTML. All plugin documentation is placed under one central location.

Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

Developing

1. Plugin Development and Testing

Code

  • To get started, you'll need JRuby with the Bundler gem installed.

  • Create a new plugin or clone an existing one from the GitHub logstash-plugins organization. We also provide example plugins.

  • Install dependencies

bundle install

Test

  • Update your dependencies
bundle install
  • Run tests
bundle exec rspec

2. Running your unpublished Plugin in Logstash

2.1 Run in a local Logstash clone

  • Edit Logstash Gemfile and add the local plugin path, for example:
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
  • Install plugin
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify
  • Run Logstash with your plugin
bin/logstash -e 'filter {awesome {}}'

At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

2.2 Run in an installed Logstash

You can use the same method as in 2.1 to run your plugin in an installed Logstash by editing its Gemfile and pointing the :path to your local plugin development directory, or you can build the gem and install it using:

  • Build your plugin gem
gem build logstash-filter-awesome.gemspec
  • Install the plugin from the Logstash home
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify
  • Start Logstash and proceed to test the plugin

Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the CONTRIBUTING file.

logstash-output-mongodb's People

Contributors

colinsurprenant, dedemorton, electrical, guyboertje, jakelandis, jordansissel, jsvd, kurtado, ph, rmmorrison, robbavey, rocat, sdmeisner, thikonom, yaauie, ycombinator


logstash-output-mongodb's Issues

uninitialized constant Mongo::URIParser

I also have this problem; can anybody give me some suggestions?
Thank you very much!

The version:

Logstash: v1.5.4
MongoDB: v3.0.6

The conf file:

output {
        mongodb {
                collection => "log_store"
                generateId => true
                database => "bigdata"

                uri => "mongodb://xxxx.com:27017"
       }
}

The error log:

{:timestamp=>"2015-09-01T10:22:53.713000+0800", :message=>"The error reported is: \n  uninitialized constant Mongo::URIParser"}
{:timestamp=>"2015-09-01T10:32:22.014000+0800", :message=>"The error reported is: \n  uninitialized constant Mongo::URIParser"}
{:timestamp=>"2015-09-06T14:03:08.574000+0800", :message=>"The error reported is: \n  uninitialized constant Mongo::URIParser"}

bulk size limit

Please post all product and debugging questions on our forum. Your questions will reach our wider community members there, and if we confirm that there is a bug, then we can open a new issue here.

For all general issues, please provide the following details for fast resolution:

  • Version:
  • Operating System:
  • Config File (if you have sensitive info, please remove it):
  • Sample Data:
  • Steps to Reproduce:

For MongoDB 3.6, the max write batch size is 10,000. Can you change this value to improve the mongodb output throughput?
Please refer to https://docs.mongodb.com/manual/reference/method/db.collection.insertMany/
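
For reference, the plugin already exposes bulk settings that control how many events go into a single batch; a minimal sketch of tuning them (values are illustrative only, and the option names are the ones that appear in configs elsewhere in this tracker):

output {
  mongodb {
    uri => "mongodb://localhost:27017"
    database => "mydb"
    collection => "logs"
    bulk => true          # batch events instead of inserting one at a time
    bulk_size => 900      # events per batch; raising this cap is what this issue asks about
    bulk_interval => 2    # flush at least every 2 seconds
  }
}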

logstash 1.4.2: mongodb output plugin doesn't support SCRAM-SHA-1 authentication mechanism for mongodb 3.0

(This issue was originally filed by @svezzoli at elastic/logstash#3103)


We need to use logstash 1.4.2 to store documents into MongoDB version 3.0 with new SCRAM-SHA-1 authentication mechanism enabled.

No documents are written in mongodb with following configuration file:
output {
mongodb {
collection => '%{type}'
database => 'nfmdata'
uri => 'mongodb://nfmoss:[email protected]'
}
}

If we add to uri parameter the connection authentication option:
uri => 'mongodb://nfmoss:[email protected]/?authMechanism=SCRAM-SHA-1'

we have the following error:
Mongo::MongoArgumentError: Invalid value "scram-sha-1" for authmechanism: must be one of GSSAPI, MONGODB-CR, MONGODB-X509, PLAIN

It seems that the mongodb output plugin has to be lined up with a MongoDB driver version compatible with MongoDB 3.0, that is, at least version 2.13.0.
Otherwise, authentication will fail because the server expects the driver to use the new SCRAM-SHA-1 authentication protocol rather than the MONGODB-CR authentication protocol that it replaces.

Double quoted @timestamp field

I'm using your logstash-output-mongodb plugin to send data to mongo database.
Below an example record stored in mongodb
"@timestamp" : "\"2016-01-26T14:26:49.333Z\""
As you can see, two double quotes are present "\".....\"": this field isn't modified by my central.conf file.
I've tested my output with this code

output {
    mongodb {
      collection => "%{typelog}"
      database => "atest"
      uri => "mongodb://127.0.0.1/"
      codec => "json"
    }
   stdout { codec => rubydebug }
}

And this is what I obtain from stdout:

{
           "@timestamp" => "2016-01-30T17:12:07.602Z",
     .....
}

This is my configuration

logstash 2.1.1
mongodb version v3.0.9

In stdout the double quotes problem is not present: what's wrong?
Thanks

Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x115d2456>, :exception=>#<Mongo::Error::NoServerAvailable: No server is available matching preference: #<Mongo::ServerSelector::Primary:0x101bcedd @tag_sets=[], @server_selection_timeout=30, @options={:database=>"Logs", :user=>"username", :password=>"passwd"}>>}

I am trying to output log data from a local text file to mongodb:
input
{
file {
path => "/home/username/Data"
type => "cisco-asa"
start_position => "beginning"
sincedb_path => "/dev/null"
}
}

filter
{
grok {
match => { "message" => "^%{SYSLOGTIMESTAMP:syslog_timestamp} %{HOSTNAME:device_src} %%{CISCO_REASON:facility}-%{INT:severity_level}-%{CISCO_REASON:f>
}

    date {
            match => ["syslog_timestamp", "MMM dd HH:mm:ss" ]
            target => "@timestamp"
    }

}

output
{
stdout {
codec => dots
}

    mongodb {
      id => "mongo-cisco"
      collection => "Cisco ASA"
      database => "Logs"
      uri => "mongodb://username:[email protected]:27017/Logs"
      codec => "json"
    }

}
logstash version: 7.11.1

When I add "+srv" to the URI, Logstash shuts down immediately after startup, yet the command I use to connect to the DB from the mongo shell is:
mongo "mongodb+srv://username:[email protected]:27017/Logs"
The same filter works fine when I ingest data into Elasticsearch.
Please help, I need this for my end-of-studies project.

Logstash sync from Oracle to MongoDB (updating records)

We are using a pipeline file to sync data from Oracle to Mongo to perform an update operation.
When Logstash is pushing data to Mongo, we want Logstash to do a db.collection.save() instead of db.collection.insert(); since this is an update operation, the records should get updated rather than inserted as new documents.

Logstash by default uses db.collection.insert() and pushes all the records. This causes an issue: when the "_id" (which is unique for a single doc) is the same for two records, Mongo doesn't allow Logstash to insert the record.

So, is there any option available in the mongodb output plugin of Logstash to prevent this issue?
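
For context, the underlying Ruby driver does support upsert semantics; a minimal sketch of the save-like behavior being asked for, written directly against the mongo gem (a hypothetical illustration, not an option the plugin currently exposes):

# Hypothetical sketch using the mongo Ruby driver's upsert support.
require 'mongo'

client = Mongo::Client.new('mongodb://localhost:27017/mydb')
doc = { '_id' => 'row-42', 'name' => 'updated value' }

# replace_one with :upsert behaves like db.collection.save():
# replace the document if its _id already exists, insert it otherwise.
client[:mycollection].replace_one({ '_id' => doc['_id'] }, doc, upsert: true)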

Failed to send event to MongoDB ... ArgumentError: wrong number of arguments calling `initialize`

Hi everyone,

I have a basic logstash configuration which gets documents from elasticsearch, with no filter at all, and is supposed to insert them into mongodb in the output.

I keep getting this error repeatedly:

19:10:52.534 [[main]>worker2] WARN  logstash.outputs.mongodb - Failed to send event to MongoDB {:event=>2017-04-12T19:10:49.471Z %{host} %{message}, :exception=>#<ArgumentError: wrong number of arguments calling `initialize` (2 for 0)>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-mongodb-3.0.1/lib/logstash/outputs/mongodb.rb:54:in `receive'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:19:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:47:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:390:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:389:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:346:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:306:in `start_workers'"]}
^C19:10:54.473 [SIGINT handler] WARN  logstash.runner - SIGINT received. Shutting down the agent.
19:10:54.477 [LogStash::Runner] WARN  logstash.agent - stopping pipeline {:id=>"main"}

Version:
not sure, but the latest downloaded through logstash-plugin, running logstash 5.3

Operating System:
Ubuntu 16.10

Config File (if you have sensitive info, please remove it):

input {
  elasticsearch {
  hosts => ["*******","****", "***"]
  index => "******"
  # get the same error with or without a query
  user => "********"
  password => "**********"
  }
}
filter {}
output {
	mongodb {
		collection => "******"
		database => "*******"
		generateId => true
		uri => "*************"
	}
}

Sample Data:
I have a very basic elasticsearch document structure which has worked with other plugins in logstash ...

Steps to Reproduce:
Simply calling the configuration file returns this error

skip on duplicate key doesn't work

This piece of code doesn't seem to work (https://github.com/logstash-plugins/logstash-output-mongodb/blob/master/lib/logstash/outputs/mongodb.rb):

if e.error_code == 11000
  # On a duplicate key error, skip the insert.
  # We could check if the duplicate key err is the _id key
  # and generate a new primary key.
  # If the duplicate key error is on another field, we have no way
  # to fix the issue.
else

I have inserts failing due to a duplicate key, and it's sleeping and retrying instead of skipping the insert. I actually added some debug code and confirmed it's going into the else statement:

if e.code == 11000
  @logger.warn("*************************** ignored duplicate insert! ***************************")
  # On a duplicate key error, skip the insert.
  # We could check if the duplicate key err is the _id key
  # and generate a new primary key.
  # If the duplicate key error is on another field, we have no way
  # to fix the issue.
else
  @logger.warn("*************************** will sleep and retry ***************************")
  sleep @retry_delay
  retry
end

Results:
{:timestamp=>"2015-07-09T18:25:40.110000+0000", :message=>"Failed to send event to MongoDB", :event=>#<LogStash::Event:0x6a73dcc9 @accessors=#<LogStash::Util::Accessors:0x5ca446b8 @store={"ts"=>"2015-07-07T00:44:59.000Z", "label"=>"Flights-Search", "pos"=>"CHELWBTPRF-10.karmalab.net", "port"=>"55518", "server"=>"CHELWBTPRF-10", "status"=>"200", "size"=>"55836", "duration"=>453.113, "version"=>"trunk-trunk.ci.1406499", "_id"=>"89bd51974f7abb59622c3a537727331b_1277136"}, @lut={"@version"=>[{"ts"=>"2015-07-07T00:44:59.000Z", "label"=>"Flights-Search", "pos"=>"CHELWBTPRF-10.karmalab.net", "port"=>"55518", "server"=>"CHELWBTPRF-10", "status"=>"200", "size"=>"55836", "duration"=>453.113, "version"=>"trunk-trunk.ci.1406499", "_id"=>"89bd51974f7abb59622c3a537727331b_1277136"}, "@version"], "@timestamp"=>[{"ts"=>"2015-07-07T00:44:59.000Z", "label"=>"Flights-Search", "pos"=>"CHELWBTPRF-10.karmalab.net", "port"=>"55518", "server"=>"CHELWBTPRF-10", "status"=>"200", "size"=>"55836", "duration"=>453.113, "version"=>"trunk-trunk.ci.1406499", "_id"=>"89bd51974f7abb59622c3a537727331b_1277136"}, "@timestamp"], "[ts][$date]"=>[{"$date"=>"2015-07-07T00:44:59.000Z"}, "$date"], "ts"=>[{"ts"=>"2015-07-07T00:44:59.000Z", "label"=>"Flights-Search", "pos"=>"CHELWBTPRF-10.karmalab.net", "port"=>"55518", "server"=>"CHELWBTPRF-10", "status"=>"200", "size"=>"55836", "duration"=>453.113, "version"=>"trunk-trunk.ci.1406499", "_id"=>"89bd51974f7abb59622c3a537727331b_1277136"}, "ts"], "[label]"=>[{"ts"=>"2015-07-07T00:44:59.000Z", "label"=>"Flights-Search", "pos"=>"CHELWBTPRF-10.karmalab.net", "port"=>"55518", "server"=>"CHELWBTPRF-10", "status"=>"200", "size"=>"55836", "duration"=>453.113, "version"=>"trunk-trunk.ci.1406499", "_id"=>"89bd51974f7abb59622c3a537727331b_1277136"}, "label"]}>, @data={"ts"=>"2015-07-07T00:44:59.000Z", "label"=>"Flights-Search", "pos"=>"CHELWBTPRF-10.karmalab.net", "port"=>"55518", "server"=>"CHELWBTPRF-10", "status"=>"200", "size"=>"55836", "duration"=>453.113, "version"=>"trunk-trunk.ci.1406499", "_id"=>"89bd51974f7abb59622c3a537727331b_1277136"}, @cancelled=false>, :exception=>#<Mongo::OperationFailure: Database command 'insert' failed: (ok: '1'; n: '0'; writeErrors: '[{"index"=>0, "code"=>11000, "errmsg"=>"insertDocument :: caused by :: 11000 E11000 duplicate key error index: logstash.accessLogs.$id dup key: { : "89bd51974f7abb59622c3a537727331b_1277136" }"}]').>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-1.10.2-java/lib/mongo/db.rb:576:in `command'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-1.10.2-java/lib/mongo/collection_writer.rb:314:in `send_write_command'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-1.10.2-java/lib/mongo/functional/logging.rb:55:in `instrument'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-1.10.2-java/lib/mongo/functional/logging.rb:20:in `instrument'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-1.10.2-java/lib/mongo/functional/logging.rb:54:in `instrument'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-1.10.2-java/lib/mongo/collection_writer.rb:313:in `send_write_command'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-1.10.2-java/lib/mongo/collection.rb:1076:in `send_write'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/mongo-1.10.2-java/lib/mongo/collection.rb:419:in `insert'", "/opt/logstash/lib/logstash/outputs/mongodb.rb:66:in `receive'", "/opt/logstash/lib/logstash/outputs/base.rb:86:in `handle'", "(eval):59:in `initialize'", "org/jruby/RubyProc.java:271:in `call'", "/opt/logstash/lib/logstash/pipeline.rb:266:in `output'", "/opt/logstash/lib/logstash/pipeline.rb:225:in `outputworker'", "/opt/logstash/lib/logstash/pipeline.rb:152:in `start_outputs'"], :level=>:warn}
{:timestamp=>"2015-07-09T18:25:40.112000+0000", :message=>"****************************** will sleep and retry *********************************", :level=>:warn}
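
A hedged guess at a more defensive check: the top-level error_code can be nil for write commands, in which case the duplicate-key marker may only appear in the message text (a sketch, not verified against this driver version):

# Hypothetical sketch: fall back to the message text when error_code doesn't match.
duplicate = (e.error_code == 11000) || e.message.to_s.include?('E11000')
if duplicate
  # skip the insert
else
  sleep @retry_delay
  retry
end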

[Enhancement] Allow continue-on-error when insert duplicate records

In MongoDB I created a unique index on a collection. This is to make sure we don't have duplicate records in the database when Logstash reprocesses a file. But currently we can't do this silently, and Logstash keeps throwing errors like: An unexpected error occurred! {:error=>#<Mongo::Error::OperationFailure: E11000 duplicate key error collection:
This can be fixed by this solution: https://stackoverflow.com/questions/34153058/mongodb-multi-document-insert-ignore-custom-duplicate-field-error

Please, can this be improved? It would be really helpful.
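
For reference, the driver-level behavior being requested looks roughly like this; a sketch against the mongo Ruby gem, assuming Mongo::Error::BulkWriteError exposes a result payload with writeErrors (not current plugin behavior):

# Hypothetical sketch: with ordered: false the server attempts every document
# and reports duplicates, instead of aborting the whole batch on the first one.
begin
  client[:mycollection].insert_many(docs, ordered: false)
rescue Mongo::Error::BulkWriteError => e
  write_errors = (e.result['writeErrors'] rescue []) || []
  dupes, others = write_errors.partition { |we| we['code'] == 11000 }
  raise e unless others.empty?   # only swallow duplicate-key errors
end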

Logstash MongoDB Output plugin 3.1.7 error

  1. Logstash version (e.g. bin/logstash --version) : 8.2.0
  2. Plugin version : logstash-output-mongodb 3.1.7
  3. Logstash installation : source expanded from tar or zip archive
  4. How is Logstash being run : Via command line
  5. How was the Logstash Plugin installed : bin/logstash-plugin

OS version (uname -a if on a Unix-like system): macOS Big Sur

Description of the problem including expected versus actual behavior:
Getting below error:

[WARN ][logstash.outputs.mongodb ][main] MONGODB | Failed to handshake with : 27017 ArgumentError: wrong number of arguments (given 2, expected 1)

[WARN ][logstash.outputs.mongodb ][main] MONGODB | Error checking :27017: ArgumentError: wrong number of arguments (given 2, expected 1)

Steps to reproduce:

  1. Check Mongo is running correctly and that you can connect with mongosh and the Compass UI
  2. Configure logstash configuration with mongo as output
  3. run with bin/logstash from CLI

enhancement - Add support for client certificates when using SSL

In the current version SSL connections are supported but there is no way to use client certificates. It can be added with something like this in lib/logstash/outputs/mongodb.rb:

config :ssl_ca_cert, :validate => :string, :required => false
config :ssl_cert, :validate => :string, :required => false
config :ssl_key, :validate => :string, :required => false

conn = Mongo::Client.new(@uri,
  ssl_ca_cert: @ssl_ca_cert,
  ssl_cert: @ssl_cert,
  ssl_key: @ssl_key
)

I can submit either a patch or a pull request if you prefer that.

logstash-input-mongodb

Hi!

Just a small question: why is there no input plugin for MongoDB? It seems like it would be a very good fit.

There used to be a river thing, but it's deprecated, and now there is this mongo-connector, but it's giving me errors about the oplog all the time and generally not working (yet).

Intermittently unable to insert data

The data flow is as follows:

wrk(Modern HTTP benchmarking tool) ---> logstash ----> mongodb

  • Version:
  1. logstash-5.6.7
  2. logstash-output-mongo: 3.1.3
  3. MongoDB: 3.4.13
  • Operating System:
  1. CentOS 7.2 (4 Core and 8G Mem)
  • Config File

The JVM, startup, and Logstash settings are the default config.

input {
        http {
            host => "0.0.0.0"
            port => 50001
            type => "metric"
            threads => 256
        }
}

filter {
    
        if [type] == "metric" {

            if [message] == "epoch,value"{
                drop { }
            }

            ruby {

                code => "
                        event.set('insertdate', event.get('@timestamp'))
                        event.set('[@metadata][type]', event.get('type'))
                        event.set('[@metadata][collection]', event.get('collection'))
                        event.set('expireAt', Time.now + (7*24*60*60))
                        "
            }

             mutate {
                split => {"message" => ","}
                add_field => {
                        "time" => "%{message[0]}"
                }
            }

            ruby {
                code => "
                        event.set('value', event.get('message').drop(1))
                        "
            }

            mutate {
                join => {
                    "value" => ","
                }
                
                remove_field => ["host","@timestamp","@version","headers","tags","message","type","collection"]
            }
  
        }
}


output {

        if  [@metadata][type] == "metric"  {
            mongodb {
                bulk => true
                bulk_interval => 1
                bulk_size => 900
                codec => json
                collection => "%{[@metadata][collection]}"
                isodate => true
                database => "metric"
                uri => "mongodb://aaa:[email protected]/admin"
            }
        }
}
  • Sample Data:
{
    "collection":"03406bdd-560f-4525-89a1-8929dbe72e16.app.rsptime'",
    "message":"1488181693.035,xxxxxxxxxxxxxxxxxxxxxxxxxxx..."  # avg length is  512Byte
}
  • Steps to Reproduce:
  1. start logstash by config
  2. wrk process post request
  3. logstash insert data
  4. use mongostat to monitor; a continuous 0 value (insert column) appears

Shell> mongostat -u xxx -p xxxx --authenticationDatabase admin
insert query update delete getmore command ... ...
1000
1440
....
0
0
... // about 1-2 minutes
0
1200
1100
... // about 1-3 minutes
0
0
...
(repeat)

  1. At the moment, mongodb can't query (db.xxx.find() is blocked)

At the same time, I wrote a test program in Go (using bulk), and it works well: the value of the insert column is around 5000, and Mongo queries work fine.

Plugin failed with logstash 1.5.0.rc2

Hello!

I'm trying to use this plugin with logstash 1.5.0.rc2.
The plugin was installed via the following command:
cd /opt/logstash; ./bin/plugin install logstash-output-mongodb
It throws the following error on start:
The error reported is:
uninitialized constant Mongo::URIParser

My config:

output {
stdout {
codec => rubydebug
}
mongodb {
database => "db"
collection => "logs"
uri => "mongodb://127.0.0.1:27017"
}
}

Is something wrong with the config, or is this a bug?

Logstash 2.4 to MongoDB 3.4

I'm currently using Logstash 2.4 on a Windows server. I'm attempting to write log messages to MongoDB 3.4. Logstash sends several messages, then stops with a 'Failed to send event to MongoDB' error message. Logstash is using mongo-2.0.6. According to https://docs.mongodb.com/ecosystem/drivers/driver-compatibility-reference/#ruby-driver-compatibility I need to use mongo-2.4.0. How do I upgrade the driver? I'm hoping the latest version of the mongo driver will fix my issue.
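
One hedged approach, reusing the gem-build steps from the Developing section above: pin a newer driver in the plugin's gemspec and rebuild (the dependency line below is illustrative, not the gemspec's actual content):

# In logstash-output-mongodb.gemspec (hypothetical edit):
s.add_runtime_dependency 'mongo', '~> 2.4.0'

Then rebuild and reinstall from the Logstash home:

gem build logstash-output-mongodb.gemspec
bin/logstash-plugin install --no-verify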

Logstash output getting failed with mongodb as output

Hi All,

I am trying to read multiple input JSON files and ingest them into mongodb.
Below is my LS conf file:

input{
file {
path => "D:/_sources/*.*"
start_position => "beginning"
type => "json"
sincedb_path=>"D:/sample.text"
}
}

output{
	mongodb {
		    id => "my_mongodb_plugin_id"
			collection => "configurations"
			database => "trueview"
			uri => "mongodb://localhost:27017/trueview"        
			codec => "json"
	}
}

and my input file is ss.json which is present under "D:/_sources/" folder.

{
    "GUID": "b92b650c-4f4a-438b-aa79-ca947329a9eb",
    "friendlyName":"supernova",
    "_type": "database",
    "_properties": {
        "auth":{
            "type": "Basic",
            "username": "admin",
            "password": "admin"
        },
        "_path": "http://localhost:8080",
        "serverMapsServiceSuffix":"/image.png"
    }
}

But it is not able to ingest anything into mongodb. I don't know what is wrong with my inputs.

cannot install into logstash 5.x

  • Version: logstash 5.1.1
  • Operating System: ubuntu 16.04
  • Steps to Reproduce: logstash-plugin install logstash-output-mongodb

Using logstash-plugin to install this plugin fails with lots of error messages

...
...
Running `bundle update` will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.
Bundler could not find compatible versions for gem "logstash-core":
  In snapshot (Gemfile.lock):
    logstash-core (= 5.1.1)

  In Gemfile:
    logstash-core-plugin-api (>= 0) java depends on
      logstash-core (= 5.1.1) java

    logstash-output-mongodb (>= 0) java depends on
      logstash-core (< 2.0.0, >= 1.4.0) java

    logstash-core (>= 0) java

Running `bundle update` will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.
Bundler could not find compatible versions for gem "logstash":
  In Gemfile:
    logstash-output-mongodb (>= 0) java depends on
      logstash (< 2.0.0, >= 1.4.0) java
Could not find gem 'logstash (< 2.0.0, >= 1.4.0) java', which is required by gem 'logstash-output-mongodb (>= 0) java', in any of the sources.

Can you please build and upload a gem file that is compatible to Logstash 5.x?

enhancement - Change name of "@timestamp" field

First of all, I don't know if I have to file this here or somewhere else. If I have opened this in the wrong place, can you redirect me to where I should file it?

I'm using "logstash-output-mongodb" to avoid some processing I had to do on CSV files, which I used to transform with a MongoDB JavaScript function before loading the data into the collection. In that collection the timestamp field is named "ts", but when the output plugin inserts into the collection it names the field "@timestamp".

Logstash doesn't support renaming the "@timestamp" field, but would it be possible to add an enhancement that allows changing the name of the field before inserting it into MongoDB?

Thanks!
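
As a stopgap until such an option exists, one workaround might be to copy the value into a "ts" field in a filter; a minimal sketch (note the plugin will still write @timestamp, and removing @timestamp outright makes the output fail, as reported in a later issue below):

filter {
  ruby {
    # copy @timestamp into a "ts" field before the mongodb output runs
    code => "event.set('ts', event.get('@timestamp'))"
  }
}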

Container crashes after plugin installation

Hi guys,

I have a problem with the installation of this plugin in the container version of logstash oss 6.7.1.
I'm running the container on a CentOS 7 machine.
I'm just using the base conf provided by the container, but as soon as the container finishes installing the plugin, it crashes with exit code 0 and nothing in docker logs.

This is the command I launch:
docker run -d --name logstash --network host docker.elastic.co/logstash/logstash-oss:6.7.1 sh -c 'bin/logstash-plugin install logstash-output-mongodb'

The container just outputs:
Validating logstash-output-mongodb
Installing logstash-output-mongodb
Installation successful

But after the successful installation command it exits with exit code 0:
1f5efe22c4fb docker.elastic.co/logstash/logstash-oss:6.7.1 "/usr/local/bin/dock…" 2 minutes ago Exited (0) 2 minutes ago logstash

The container crashes only if I install this plugin; otherwise it works fine.

Thank you for your support!

Multiple workers breaking auth

Hello
It seems that when I set workers to 2 or higher, I get login issues when connecting to MongoDB. This doesn't happen when workers is set to 1.

No tests

Hi,
this plugin has no unit tests; this should be fixed in order to check for regressions, bugs, etc.
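
A minimal spec skeleton to start from, assuming the usual logstash-devutils layout (the file path and expectations are assumptions, not existing code):

# spec/outputs/mongodb_spec.rb -- hypothetical starting point
require "logstash/devutils/rspec/spec_helper"
require "logstash/outputs/mongodb"

describe LogStash::Outputs::Mongodb do
  let(:config) do
    { "uri" => "mongodb://localhost:27017",
      "database" => "test",
      "collection" => "logs" }
  end

  it "registers without raising" do
    plugin = LogStash::Plugin.lookup("output", "mongodb").new(config)
    expect { plugin.register }.not_to raise_error
  end
end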

MongoDB output (isodate and generateId parameters)

(This issue was originally filed by @Danny-Blackjack at https://github.com/elastic/logstash-contrib/issues/61)


Hi,

I am using logstash to write the events into elasticsearch and mongodb. I ran into some issues, but also found a possible fix.

This is the logstash configuration:

output {
    mongodb {
      uri => "mongodb://elasticsearch-server:27017"
      database => "msops"
      collection => "logstash_%{type}"
      #isodate => true
    }
    elasticsearch {
      host => "elasticsearch-server"
    }
}

All works well by default.
I ran into problems when setting the MongoDB @timestamp as an isodate using 'isodate => true'. This causes an exception "Failed to flush outgoing items" in logstash (see full stack further below), causing the elasticsearch flush to fail. Interesting how the mongodb output plugin can cause the ES output to fail.

I also ran into problems setting documentId => true. Here the problem was in the ES server, with the mongodb '_id' identifier not matching the ES one (see also the 2nd stack trace below).

I think I found the problem in lib/logstash/outputs/mongodb.rb. Line 59, 'document = event.to_hash', makes the mongodb document instance point to the same ES event instance, so changing the @timestamp or _id changed the document/event for both the mongodb output and the ES output. The fix, which works, is to create a new Hash instance at line 59:
document = Hash.new { event.to_hash }
(I'm not a Ruby developer though, so there may be better ways to do this.)

      if @isodate
        # the mongodb driver wants time values as a ruby Time object.
        # set the @timestamp value of the document to a ruby Time object, then.
        #document = event.to_hash
       # Possible fix: create a new Hash instance for mongodb
        document = Hash.new { event.to_hash }
      else
        # Note: .merge creates a new Hash
        document = event.to_hash.merge("@timestamp" => event["@timestamp"].to_json)
      end
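
One caveat with that snippet, hedged from the Ruby side: Hash.new { event.to_hash } builds an empty hash whose block only runs on missing-key lookups, so it is not actually a copy of the event. A plain shallow copy may be closer to the intent:

# A shallow copy, so mutating the mongodb document does not mutate
# the event instance shared with the elasticsearch output.
document = event.to_hash.dup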

Using logstash 1.4.1 and logstash-contrib on a Java 6 runtime.

isodate => true exception
{:timestamp=>"2014-06-03T10:39:58.269000+1000", :message=>"Failed to flush outgoing items", :outgoing_count=>2, :exception=>java.lang.ClassCastException: org.jruby.RubySymbol cannot be cast to java.lang.String, :backtrace=>["org.elasticsearch.common.xcontent.XContentBuilder.writeMap(org/elasticsearch/common/xcontent/XContentBuilder.java:1095)", "org.elasticsearch.common.xcontent.XContentBuilder.map(org/elasticsearch/common/xcontent/XContentBuilder.java:1015)", "org.elasticsearch.action.index.IndexRequest.source(org/elasticsearch/action/index/IndexRequest.java:338)", "org.elasticsearch.action.index.IndexRequest.source(org/elasticsearch/action/index/IndexRequest.java:327)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:597)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.build_request(/home/didata/projects/workspace_monitoring/installs/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:217)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.build_request(/home/didata/projects/workspace_monitoring/installs/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:217)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.bulk(/home/didata/projects/workspace_monitoring/installs/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:205)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.bulk(/home/didata/projects/workspace_monitoring/installs/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:205)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.bulk(/home/didata/projects/workspace_monitoring/installs/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:204)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.bulk(/home/didata/projects/workspace_monitoring/installs/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:204)", "LogStash::Outputs::ElasticSearch.flush(/home/didata/projects/workspace_monitoring/installs/logstash/lib/logstash/outputs/elasticsearch.rb:321)", "LogStash::Outputs::ElasticSearch.flush(/home/didata/projects/workspace_monitoring/installs/logstash/lib/logstash/outputs/elasticsearch.rb:321)", "RUBY.buffer_flush(/home/didata/projects/workspace_monitoring/installs/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:219)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1339)", "RUBY.buffer_flush(/home/didata/projects/workspace_monitoring/installs/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:216)", "RUBY.buffer_initialize(/home/didata/projects/workspace_monitoring/installs/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:112)", "org.jruby.RubyKernel.loop(org/jruby/RubyKernel.java:1521)", "RUBY.buffer_initialize(/home/didata/projects/workspace_monitoring/installs/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:110)", "java.lang.Thread.run(java/lang/Thread.java:662)"], :level=>:warn}
documentId => true
2014-06-02 08:45:04,949][DEBUG][action.bulk              ] [Hawkshaw] [logstash-2014.06.01][1] failed to execute bulk item (index) index {[logstash-2014.06.01][collectd][GjZqRd6UTpeVQNv9BkksUg], source[{"@version":"1","@timestamp":"2014-06-01T22:44:28.437Z","host":"mint15","plugin":"load","collectd_type":"load","shortterm":3.34,"midterm":2.56,"longterm":1.72,"type":"collectd","_id":"538baccce4b032f7fb3523e9"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [_id]
    at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:418)
    at org.elasticsearch.index.mapper.internal.IdFieldMapper.parse(IdFieldMapper.java:291)
    at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:616)
    at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:469)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:515)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:462)
    at org.elasticsearch.index.shard.service.InternalIndexShard.prepareCreate(InternalIndexShard.java:363)
    at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:427)
    at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:160)
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:556)
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:426)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:662)
Caused by: org.elasticsearch.index.mapper.MapperParsingException: Provided id [GjZqRd6UTpeVQNv9BkksUg] does not match the content one [538baccce4b032f7fb3523e9]
    at org.elasticsearch.index.mapper.internal.IdFieldMapper.parseCreateField(IdFieldMapper.java:310)
    at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:408)
    ... 13 more

uninitialized constant Mongo::URIParser

Hi,

I'm using logstash 1.5.2
When i try to run with a simple config i have this error
uninitialized constant Mongo::URIParser

output {
  mongodb {
    collection => "type"
    database => "logs"
    uri => "mongodb://localhost"
  }
}

Is there any workaround for this issue ?

Thank you !

error when using generateId (wrong number of argument)

When I use the generateId option, it raises an exception:

wrong number of arguments calling `initialize` (2 for 0)
/opt/logstash-2.2.0/vendor/bundle/jruby/1.9/gems/logstash-output-mongodb-2.0.3/lib/logstash/outputs/mongodb.rb:54

According to the docs at http://www.rubydoc.info/github/mongodb/bson-ruby/BSON/ObjectId#from_string-class_method,
you should replace mongodb.rb line 54

document["_id"] = BSON::ObjectId.new(nil, event["@timestamp"])

by something like

document["_id"] =  BSON::ObjectId.from_time(event["@timestamp"])

with event["@timestamp"] a Time object or timestamp

logstash mongodb output - wrong number of arguments

Hi,
I am trying to create something similar to data streaming between MySQL and MongoDB, that is, MySQL -> MongoDB.

Here's the conf file:

input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/mysql-connector-java-5.1.47/mysql-connector-java-5.1.47.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/foo"
    jdbc_user => "*****"
    jdbc_password => "*****"
    tracking_column => "update_ts"
    use_column_value => true
    statement => "SELECT * FROM foo.foobar where update_ts >:sql_last_value;"
    schedule => "* * * * *"
  }
}

filter {}

output {
  mongodb{
    uri => "mongodb://localhost:27017"
    database => "mongo_foo"
    collection => "logstash_op"
    id => "my_id"
    codec => "json"
    }
}

I use this command to start Logstash: bin/logstash -f /etc/logstash/conf.d/sample_config.conf
This is the error I get.

Tue Jul 16 15:04:00 IST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.`
`[INFO ] 2019-07-16 15:04:00.870 [Ruby-0-Thread-20: :1] jdbc - (0.005352s) SELECT * FROM kafka.foobar where update_ts >0;`
**`[WARN ] 2019-07-16 15:04:01.397 [[main]>worker2] mongodb - Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x2e76d513>, :exception=>#<ArgumentError: wrong number of arguments (given 2, expected 0..1)>}`**
`[WARN ] 2019-07-16 15:04:04.416 [[main]>worker2] mongodb - Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x2e76d513>, :exception=>#<ArgumentError: wrong number of arguments (given 2, expected 0..1)>}

I'm totally new to logstash, so I'm guessing I must be doing something wrong.

PLEASE HELP..!

The process exits when a duplicate _id is inserted

logstash-output-mongodb plugin version: 3.1.6
logstash version: 7.6.2
mongodb server version: 4.0.9

When I am inserting data, if the specified _id conflicts, the process exits. The exception information is as follows:

logstash[16300]: warning: thread "Ruby-0-Thread-84: :1" terminated with exception (report_on_exception is true):
logstash[16300]: Mongo::Error::BulkWriteError: Mongo::Error::BulkWriteError: : E11000 duplicate key error collection: adx_requests_at_hour_11.request_logs_1 index: _id_ dup key: { : "1d2ad817737849978c21a7937859001a" } (11000)
logstash[16300]: validate! at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write/result.rb:184
logstash[16300]: result at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write/result_combiner.rb:83
logstash[16300]: execute at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write.rb:87
logstash[16300]: bulk_write at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/collection.rb:591
logstash[16300]: insert_many at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/collection.rb:568
logstash[16300]: register at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:68
logstash[16300]: each at org/jruby/RubyHash.java:1428
logstash[16300]: register at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:66
logstash[16300]: synchronize at org/jruby/ext/thread/Mutex.java:164
logstash[16300]: register at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:65
logstash[16300]: [2021-01-14T11:56:30,580][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<Mongo::Error::BulkWriteError: Mongo::Error::BulkWriteError: : E11000 duplicate key error collection: adx_requests_at_hour_11.request_logs_1 index: _id_ dup key: { : "1d2ad817737849978c21a7937859001a" } (11000)>, :backtrace=>["/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write/result.rb:184:in `validate!'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write/result_combiner.rb:83:in `result'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write.rb:87:in `execute'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/collection.rb:591:in `bulk_write'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/collection.rb:568:in `insert_many'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:68:in `block in register'", "org/jruby/RubyHash.java:1428:in `each'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:66:in `block in register'", "org/jruby/ext/thread/Mutex.java:164:in `synchronize'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:65:in `block in register'"]}
logstash[16300]: connecting to redis: logstash:filter:seller:60006, value: Nox
logstash[16300]: [2021-01-14T11:56:32,204][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
systemd[1]: logstash@adx_request_tsv.service: main process exited, code=exited, status=1/FAILURE

my output conf

mongodb {
    bulk => true
    bulk_interval => 3
    bulk_size => 900
    
    uri => "mongodb://127.0.0.1:10050/adx_requests_at_hour_23"
    database => "adx_requests_at_hour_23"
    collection => "request_logs_%{mongo_i}"
    generateId => false
}

In issue #10, I found that this problem was said to have been solved, which confuses me. Is there some config error?

The sync can't be stopped after the data transfer is over; it keeps cycling

When I use logstash-output-jdbc to sync data from Elasticsearch v2.3.4 to MongoDB v3.2.7, the plugin can't be stopped immediately when the data ends and the number of rows is >= 10000.
The sync can't be stopped after the data transfer is over; it keeps cycling from the beginning.
So I think it is a bug.
My logstash version is 2.3.4.

Thanks for your reply!

SSL/TLS Support?

I'm trying to get it working with SSL but failing.

Without SSL (this works if the mongodb server is set to "allowSSL", which means the output does not use SSL!):

output {
if [type] == "mytype" {
mongodb {
uri => "mongodb://myserver/"
database => "mydatabase"
collection => "mycollection"
id => "myid"
}
}
}

What I tried....
With SSL (the mongodb server is set to "requireSSL", which means no SSL, no data....):

output {
if [type] == "mytype" {
mongodb {
uri => "mongodb://myserver/mydatabase?ssl=true"
database => "mydatabase"
collection => "mycollection"
id => "myid"
}
}
}

That doesn't work, and in the mongodb log I see the server saying "requires ssl".
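
For what it's worth, the Ruby driver underneath does accept TLS options when the client is constructed; a sketch of what the plugin would need to pass (these are driver options, not existing plugin settings, and the paths are placeholders):

# Hypothetical: TLS options accepted by the mongo Ruby driver's client.
conn = Mongo::Client.new(@uri,
  ssl: true,
  ssl_ca_cert: '/path/to/ca.pem',     # CA used to verify the server
  ssl_cert: '/path/to/client.pem',    # client certificate, if the server requires one
  ssl_key: '/path/to/client.key'
)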

Reconnect to MongoDB

Please post all product and debugging questions on our forum. Your questions will reach our wider community members there, and if we confirm that there is a bug, then we can open a new issue here.

For all general issues, please provide the following details for fast resolution:

  • Version:
  • Operating System:
  • Config File (if you have sensitive info, please remove it):
  • Sample Data:
  • Steps to Reproduce:

failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x2367ca72>, :exception=>#<ArgumentError: wrong number of arguments (2 for 1)>}

I want to merge data from MySQL to Mongo. This is my config:

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://fgh:3306/fgh?useSSL=false"
    jdbc_user => "fghfgh"
    jdbc_password => "fghfghhfgfgh"
    jdbc_driver_library => "/dfghmysql-connector-java-8.0.18.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM fdh"
    clean_run => true
  }
}

output {
  mongodb {
    id => "dddddddd"
    collection => "dfd"
    database => "springboottest"
    uri => "mongodb://localhost:27017"
    codec => "json"
    # generateId => true
    isodate => true
  }
}

The versions I am using:

  1. logstash-6.2.2
  2. logstash-output-mongodb (3.1.6)
  3. logstash-input-jdbc (4.3.3)

I'm getting this error:

[2019-11-20T13:34:48,843][WARN ][logstash.outputs.mongodb ] Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x2367ca72>, :exception=>#<ArgumentError: wrong number of arguments (2 for 1)>}
[2019-11-20T13:34:49,297][WARN ][logstash.shutdownwatcher ] {"inflight_count"=>531, "stalling_thread_info"=>{"other"=>[{"thread_id"=>34, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>35, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>36, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>37, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>38, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>39, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>40, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>41, "name"=>nil, "current_call"=>"[...]/vendor/bundle/jruby/2.3.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:123:in `sleep'"}]}}


Logstash not able to load mongodb output plugin

I am getting following error with logstash 6.2.3 (also on logstash 7.7.1) while using mongodb output plugin. I am running logstash inside a container. The plugin is installed without any problem and I can also list it, but it fails to load. Is there any known reason/fix available for this issue?

[2020-06-04T14:45:12,791][ERROR][logstash.plugins.registry] Tried to load a plugin's code, but failed. {:exception=>#<LoadError: load error: mongo/server/connection -- java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;>, :path=>"logstash/outputs/mongodb", :type=>"output", :name=>"mongodb"}
[2020-06-04T14:45:12,815][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:bundlestats, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (PluginLoadingError)

Couldn't find any output plugin named 'mongodb'. Are you sure this is correct? Trying to load the mongodb output plugin resulted in this error: load error: mongo/server/connection -- java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;", :backtrace=>["org.logstash.config.ir.CompiledPipeline.(CompiledPipeline.java:119)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:80)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1169)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1156)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:43)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:82)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)", "org.jruby.RubyClass.newInstance(RubyClass.java:939)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:552)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:86)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:73)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)", "usr.share.logstash.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:342)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:138)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:52)", "org.jruby.runtime.Block.call(Block.java:139)", "org.jruby.RubyProc.call(RubyProc.java:318)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.lang.Thread.run(Thread.java:748)"]}

MongoDB Atlas Integration with Logstash through logstash-output-mongodb plugin

Hi
I am trying to connect Logstash with MongoDB Atlas using the logstash-output-mongodb plugin. However, I am running into issues.
Currently I am using Logstash 6.7.2 and logstash-output-mongodb version 3.1.6.
The integration I want to implement is Kafka --> Logstash --> MongoDB Atlas (Logstash reads fine from Kafka).
The following is the configuration in the output:

mongodb {
uri => "mongodb+srv://user:password@connection_string_uri.net/dbname?replicaSet=myRepl&retryWrites=true&w=majority&ssl=true"
database => "dbname"
collection => "collection_name"
generateId => true
id => "afng_audit_info"
}

Currently the issue is:

[2020-02-01T11:12:50,275][WARN ][logstash.outputs.mongodb ] Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x98eb8c07>, :exception=>#<Mongo::Error::NoServerAvailable: No primary server is available in cluster: #<Cluster topology=ReplicaSetNoPrimary[cluster-name-shard-00-00-cptml.azure.mongodb.net:27017,cluster-name-shard-00-01-cptml.azure.mongodb.net:27017,cluster-name-shard-00-02-cptml.azure.mongodb.net:27017,name=myRepl] servers=[#,#,#]> with timeout=30, LT=0.015>}

I am sure the cluster is enabled: I can access it through Compass or the Mongo Shell.

Please share any tips, advice, or recommendations for using logstash-output-mongodb to integrate with MongoDB Atlas.

Thanks in advance

logstash output error with json filter

Issue: I use Logstash to transfer data into mongodb. If I use a json filter section to format fields, I get the error below; if I do not use the filter section, mongodb receives the data.

[2018-01-19T10:01:30,998][WARN ][logstash.outputs.mongodb ] Failed to send event to MongoDB {:event=>#<LogStash::Event:0x52e4bfb6>, :exception=>#<NoMethodError: undefined method `to_json' for nil:NilClass
Did you mean?  to_bson>, 

:backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-mongodb-3.1.3/lib/logstash/outputs/mongodb.rb:81:in `receive'", 

"/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `block in multi_receive'", "org/jruby/RubyArray.java:1734:in `each'", 

"/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:22:in `multi_receive'", 

"/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:50:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:487:in `block in output_batch'", 

"org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:486:in `output_batch'", 

"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:438:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:393:in `block in start_workers'"]}

  • Version: logstash-output-mongodb-3.1.3
  • logstash version: 6.1
  • Operating System: ubuntu 16.04
  • Config File (if you have sensitive info, please remove it):
input {
  beats {
    port => 5044
    host => "0.0.0.0"
  }
}
filter {
  json {
     source => "message"
     remove_field => ["message"]
     add_field => {
      "timestamp" => "%{@timestamp}"
     }
  }
}
output {
   stdout { codec => json_lines }
   mongodb {
     uri => "mongodb://localhost:27017"
     database => "test"
     collection => "col1"
   }
}
  • Sample Data:
    {'channelId': 'ios', 'BusinessRunTime': '2018-01-19 10:04:14.654'}
    {'channelId': 'ios', 'BusinessRunTime': '2018-01-19 10:04:24.663'}

  • Steps to Reproduce:
    I use filebeat to read a file with the sample data, then transfer it to logstash, which transfers it to mongo.

The version of mongodb output plugin present in logstash 1.4.2 doesn't support SCRAM-SHA-1 authentication mechanism for mongodb 3.0

We use logstash 1.4.2 to store documents into MongoDB version 3.0 with new SCRAM-SHA-1 authentication mechanism enabled.

No documents are written in mongodb with following configuration file:
output {
mongodb {
collection => '%{type}'
database => 'nfmdata'
uri => 'mongodb://nfmoss:[email protected]'
}
}

If we add to uri parameter the connection authentication option:
uri => 'mongodb://nfmoss:[email protected]/?authMechanism=SCRAM-SHA-1'

we have the following error:
Mongo::MongoArgumentError: Invalid value "scram-sha-1" for authmechanism: must be one of GSSAPI, MONGODB-CR, MONGODB-X509, PLAIN

I think that the mongodb output plugin has to be aligned with MongoDB Ruby driver version 2.0.0, the only version compatible with MongoDB 3.0.
With the current version of the MongoDB Ruby driver (1.10.2), authentication toward MongoDB will fail because the server expects the driver to use the new SCRAM-SHA-1 authentication protocol rather than the MONGODB-CR authentication protocol that it replaces.

Is it possible to have this lined up?
Thanks.

logstash-output-mongodb works only with mongodb 2.0.6

Hi,
I tried to use logstash-output-mongodb to push to MongoDB version 3.0.4, but it's not working.
I then observed that it does not work with MongoDB 3.0.4 and throws this error:

@metadata_accessors=#<LogStash::Util::Accessors:0x63ff2639 @store={}, @lut={}>, @cancelled=false>, :exception=>#<Mongo::Error::OperationFailure: assertion src/mongo/db/storage/mmap_v1/extent.h:77 (8)>, :backtrace=>["/Users/chakrapani.k/Downloads/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/mongo-2.0.6/lib/mongo/operation/result.rb:214:in `validate!'", "/Users/chakrapani.k/Downloads/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/mongo-2.0.6/lib/mongo/operation/write/insert.rb:72:in `execute_write_command'", "/Users/chakrapani.k/Downloads/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/mongo-2.0.6/lib/mongo/operation/write/insert.rb:62:in `execute'", "/Users/chakrapani.k/Downloads/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/mongo-2.0.6/lib/mongo/collection.rb:190:in `insert_many'", "/Users/chakrapani.k/Downloads/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/mongo-2.0.6/lib/mongo/collection.rb:175:in `insert_one'", "/Users/chakrapani.k/Downloads/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-mongodb-2.0.3/lib/logstash/outputs/mongodb.rb:56:in `receive'", "/Users/chakrapani.k/Downloads/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/outputs/base.rb:81:in `handle'", "(eval):27:in `output_func'", "/Users/chakrapani.k/Downloads/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/pipeline.rb:277:in `outputworker'", "/Users/chakrapani.k/Downloads/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/pipeline.rb:194:in `start_outputs'"], :level=>:warn}

I then tried using this plugin with mongodb 2.0.6 and it works perfectly and is able to push to mongodb.

Is there any workaround or fix for resolving this hard dependency on 2.0.6, so that it can also push to Mongo 3?

regards,
chakri

cannot remove @timestamp field

Please post all product and debugging questions on our forum. Your questions will reach our wider community members there, and if we confirm that there is a bug, then we can open a new issue here.

For all general issues, please provide the following details for fast resolution:

  • Version: logstash 6.1.1
  • Operating System: Amazon Linux AMI
  • Config File (if you have sensitive info, please remove it): test.conf
input {
  stdin {
  }
}

filter {
  mutate {
    remove_field => ["@timestamp"]
  }
}

output {
    mongodb {
      uri => ...
      database => ...
      collection => ...
    }
}
  • Sample Data: just type whatever message to stdin after running logstash
  • Steps to Reproduce:
$ bin/logstash -f test.conf
> hello
>>>
[2018-07-26T05:47:05,981][WARN ][logstash.outputs.mongodb ] Failed to send event to MongoDB 
{:event=>#<LogStash::Event:0x6c6dbc2b>, :exception=>#<NoMethodError: undefined method `to_json' for nil:NilClass
Did you mean?  to_bson>, :backtrace=>[
  "/home/ec2-user/elastic/logstash-6.1.1/vendor/bundle/jruby/2.3.0/gems/logstash-output-mongodb-3.1.4/lib/logstash/outputs/mongodb.rb:81:in `receive'", 
  "/home/ec2-user/elastic/logstash-6.1.1/logstash-core/lib/logstash/outputs/base.rb:92:in `block in multi_receive'", "org/jruby/RubyArray.java:1734:in `each'", 
  "/home/ec2-user/elastic/logstash-6.1.1/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'", 
  "/home/ec2-user/elastic/logstash-6.1.1/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:22:in `multi_receive'", 
  "/home/ec2-user/elastic/logstash-6.1.1/logstash-core/lib/logstash/output_delegator.rb:50:in `multi_receive'", 
  "/home/ec2-user/elastic/logstash-6.1.1/logstash-core/lib/logstash/pipeline.rb:487:in `block in output_batch'", "org/jruby/RubyHash.java:1343:in `each'", 
  "/home/ec2-user/elastic/logstash-6.1.1/logstash-core/lib/logstash/pipeline.rb:486:in `output_batch'", 
  "/home/ec2-user/elastic/logstash-6.1.1/logstash-core/lib/logstash/pipeline.rb:438:in `worker_loop'", 
  "/home/ec2-user/elastic/logstash-6.1.1/logstash-core/lib/logstash/pipeline.rb:393:in `block in start_workers'"
]}
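The backtrace points at receive in mongodb.rb:81, and the "undefined method `to_json' for nil" message suggests the plugin calls to_json on event.timestamp, which is nil once the mutate filter has removed @timestamp. A minimal sketch of a nil guard, paraphrasing the 3.1.4 receive method rather than quoting it verbatim:

def receive(event)
  document = {}.merge(event.to_hash)
  unless @isodate
    timestamp = event.timestamp
    if timestamp
      document["@timestamp"] = timestamp.to_json
    else
      # @timestamp was removed upstream (e.g. by a mutate filter); skip it
      document.delete("@timestamp")
    end
  end
  document["_id"] = BSON::ObjectId.new if @generateId
  @db[event.sprintf(@collection)].insert_one(document)
end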

Failed to send event with "wrong number of arguments"

I am trying to push some logs to my local MongoDB instance using Logstash with the following configuration:

output {
  mongodb {
    id => "mongo-aws-cloudtrail"
    collection => "aws-cloudtrail"
    database => "unified-log"
    uri => "mongodb://localhost:27017/unified-log"
    codec => "json"
  }
}

Unfortunately, it gave an error message like this:

[WARN ] 2019-05-06 17:49:24.658 [[main]>worker3] mongodb - Failed to send event to MongoDB, retrying in 3 seconds
{:event=>#<LogStash::Event:0x790d54c4>, :exception=>#<ArgumentError: wrong number of arguments (given 2, expected 0..1)>}

Any ideas why it generates this error?

Reconnecting to Mongo after connection drop

The output plugin does not reconnect to Mongo if the connection is dropped.
The retry logic gets stuck in a loop and never re-establishes the connection to Mongo.

How to reproduce:

  1. Start logstash with a Mongo output plugin
  2. Write something to logstash --> Data is stored to Mongo
  3. Shut down Mongo (I run Mongo as a service on Windows, so I stop the Mongo service)
  4. Write something to logstash
  5. I get the error: Failed to send event to MongoDB {:event=>2017-03-08T08:48:00.057Z _________ t4, :exception=>#<Mongo::Error::NoServerAvailable: No server is available matching preference: #<Mongo::ServerSelector::Primary:0x7354 @tag_sets=[], @options={:database=>"admin"}, @server_selection_timeout=30>>}
  6. Start Mongo service
  7. Same error occurs again and again

I have no experience with Ruby, but if this is called only once:

def register
  Mongo::Logger.logger = @logger
  conn = Mongo::Client.new(@uri)
  @db = conn.use(@database)
end # def register

then maybe conn = Mongo::Client.new(@uri) and @db = conn.use(@database) should also be called from the receive(event) function once the connection has dropped, as in the sketch below.
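A minimal sketch of that idea, reusing the plugin's existing @uri, @database, @collection, and @retry_delay settings; this is a sketch of the proposal, not the shipped implementation:

# Sketch only: rebuild the client when the server becomes unreachable.
def reconnect
  conn = Mongo::Client.new(@uri)
  @db = conn.use(@database)
end

def receive(event)
  @db[event.sprintf(@collection)].insert_one(event.to_hash)
rescue Mongo::Error::NoServerAvailable => e
  @logger.warn("MongoDB unreachable, rebuilding the client before retrying", :exception => e)
  sleep(@retry_delay)
  reconnect
  retry  # re-runs the method body with the fresh client
end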

Logstash-output-mongodb - bad file descriptor

  • Version: 3.1.3
  • Operating System: Ubuntu x64 16.04
  • Config File :
output {
    if [type] == "netflow" {
        elasticsearch {
            hosts => [ "${ELASTIFLOW_ES_HOST:127.0.0.1:9200}" ]
            user => "${ELASTIFLOW_ES_USER:elastic}"
            password => "${ELASTIFLOW_ES_PASSWD:changeme}"
            index => "netflow-%{+YYYY.MM.dd}"
            template => "${ELASTIFLOW_TEMPLATE_PATH:/etc/logstash/templates}/netflow.template.json"
            template_name => "netflow"
            template_overwrite => "true"
        }
        mongodb {
            uri => "mongodb://localhost:3001"
            database => "meteor"
            collection => "logstash_%{type}"
            isodate => true
        }
    }
}
  • Steps to Reproduce:

Since I'm using Meteor's built-in MongoDB, the database goes offline whenever the Meteor app is killed. Once the Meteor app is back online, Logstash keeps complaining with the message below until I restart Logstash:

[2017-11-21T15:34:12,447][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-11-21T15:34:14,195][DEBUG][logstash.outputs.mongodb ] MONGODB | COMMAND | namespace=admin.$cmd selector={:ismaster=>1} flags=[] limit=-1 skip=0 project=nil | runtime: 0.9999ms
[2017-11-21T15:34:14,195][DEBUG][logstash.outputs.mongodb ] MONGODB | Bad file descriptor - Bad file descriptor | runtime: 0.0000ms
[2017-11-21T15:34:14,695][DEBUG][logstash.outputs.mongodb ] MONGODB | COMMAND | namespace=admin.$cmd selector={:ismaster=>1} flags=[] limit=-1 skip=0 project=nil | runtime: 1.9999ms
[2017-11-21T15:34:14,695][DEBUG][logstash.outputs.mongodb ] MONGODB | Bad file descriptor - Bad file descriptor | runtime: 0.0000ms
[2017-11-21T15:34:15,195][DEBUG][logstash.outputs.mongodb ] MONGODB | COMMAND | namespace=admin.$cmd selector={:ismaster=>1} flags=[] limit=-1 skip=0 project=nil | runtime: 1.9999ms
[2017-11-21T15:34:15,195][DEBUG][logstash.outputs.mongodb ] MONGODB | Bad file descriptor - Bad file descriptor | runtime: 0.0000ms

How do I remedy this? Do I really need to restart Logstash each time my MongoDB restarts?

Thanks.

Plugin version 3.1.6 issue

Hello,
I have upgraded the plugin from 3.1.5 to 3.1.6 to get SSL support when connecting to MongoDB,
but it looks like version 3.1.6 is accidentally missing the "isodate" option: when I add it to the configuration file, the plugin complains about an invalid number of arguments, and removing it fixes the issue.
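For reference, a minimal output block that reproduces this (the database and collection names are placeholders):

output {
  mongodb {
    uri => "mongodb://localhost:27017"
    database => "mydb"     # placeholder
    collection => "logs"   # placeholder
    isodate => true        # accepted by 3.1.5, rejected by 3.1.6
  }
}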
Are there plans to release a new version soon that solves this issue?
Thanks,
Lior
