logstash-input-beats's Introduction

Logstash Plugin

This is a plugin for Logstash.

It is fully free and fully open source. The license is Apache 2.0, which means you are free to use it however you want.

Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will first be converted into asciidoc and then into HTML. All plugin documentation is placed in one central location.

Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

Developing

1. Plugin Development and Testing

Code

  • To get started, you'll need JRuby with the Bundler gem installed.

  • Create a new plugin or clone an existing one from the GitHub logstash-plugins organization. We also provide example plugins.

  • Install dependencies

bundle install

Test

  • Update your dependencies
bundle install
  • Run tests
bundle exec rspec
  • Run integration tests
bundle exec rake test:integration:setup
bundle exec rspec spec --tag integration  -fd

2. Running your unpublished Plugin in Logstash

2.1 Run in a local Logstash clone

  • Edit Logstash Gemfile and add the local plugin path, for example:
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
  • Install plugin
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify
  • Run Logstash with your plugin
bin/logstash -e 'filter {awesome {}}'

At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

2.2 Run in an installed Logstash

You can use the same method as in 2.1 to run your plugin in an installed Logstash by editing its Gemfile and pointing the :path to your local plugin development directory, or you can build the gem and install it:

  • Build your plugin gem
gem build logstash-filter-awesome.gemspec
  • Install the plugin from the Logstash home
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify
  • Start Logstash and proceed to test the plugin

Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the CONTRIBUTING file.

logstash-input-beats's People

Contributors

111andre111, andsel, ccannell67, cwurm, dedemorton, edmocosta, electrical, failshell, fbacchella, jakelandis, jeanfabrice, jordansissel, jpcarey, jsvd, kaisecheng, karenzone, kares, masaruh, mashhurs, nickethier, original-brownbear, ph, praseodym, roaksoax, robbavey, sigmavirus24, suyograo, tanji, tsaarni, yaauie


logstash-input-beats's Issues

Unhandled exception - unsupported protocol 60

Hey there,

I got an unusual exception and don't know what to do:

Beats input: unhandled exception {:exception=>#<RuntimeError: unsupported protocol 60>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.0.3/lib/lumberjack/beats/server.rb:225:in `handle_version'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.0.3/lib/lumberjack/beats/server.rb:210:in `header'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.0.3/lib/lumberjack/beats/server.rb:163:in `feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.0.3/lib/lumberjack/beats/server.rb:342:in `read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.0.3/lib/lumberjack/beats/server.rb:319:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.0.3/lib/logstash/inputs/beats.rb:184:in `invoke'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/concurrent-ruby-0.9.2-java/lib/concurrent/executor/executor_service.rb:515:in `run'", "Concurrent$$JavaExecutorService$$Job_463958253.gen:13:in `run'"], :level=>:error}

My filebeat config looks like this:

filebeat:
  prospectors:
    -
      paths:
        - /var/log/*.log
      input_type: log
output:
  logstash:
    enabled: true
    hosts:
      - logstash01.xxx.com:5522

and my logstash config is this:

input {
  beats {
    type  => "my_events"
    port  => "5522"
  }
}
filter {
  ...
}
output {
  ...
}

It would be great if somebody could help me... I don't see what is wrong...

My Setup is:

elasticsearch (3 nodes) => version 2.1.0-1
logstash (2 nodes) => version 2.1.1-1
kibana ( 1 node) => version 4.3.0
filebeat => version 1.0.0

Thanks

Correctly handle the flush of the `IdentityMapCodec`

The IdentityMapCodec has an eviction mechanism that flushes unused codecs after 1 hour. The encode method of the codec needs to receive a proc that can push to the queue; the current code base doesn't allow that, since we use the encode method to return an event.

For more information see: #22 (comment)
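A minimal sketch of the proposed shape. The names here (`EvictingCodec`, `accept`, `flush`) are hypothetical stand-ins, not the plugin's actual API: the point is that the flush path takes a block that can push decoded events straight to the queue, instead of relying on `encode` to return an event.

```ruby
# Hypothetical sketch: a codec whose eviction-time flush drains buffered
# data through a caller-supplied proc, so evicted events still reach the
# pipeline queue. Illustrative names only.
class EvictingCodec
  def initialize
    @buffer = []
  end

  # Buffer incoming data (stands in for the real codec's accept/decode path).
  def accept(data)
    @buffer << data
  end

  # On eviction, push every buffered event through the given block
  # (e.g. `codec.flush { |event| queue << event }`) and clear the buffer.
  def flush(&block)
    @buffer.each { |event| block.call(event) }
    @buffer.clear
  end
end
```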

Input Beats is much slower than the LSF

We have reports on Discuss and from other users that the beats input is much slower than its predecessor, LSF. See https://discuss.elastic.co/t/insufficient-throughput-from-filebeat/39564/11

@urso did some testing on his side to isolate the issue, and a lot of factors point in the direction of the input.

The following tests were made with no filter and with the null output on a Hetzner test machine (4-core i7 3.4 GHz, 32 GB of RAM).

Setup                EPS
LSF -> Lumberjack    30k lines/sec
Filebeat -> Beats    18k lines/sec

What changed drastically from LSF to Beats:

  • The protocol now uses a JSON frame containing multiple events, which is probably slower than our binary protocol.
  • We use partial acks: we ack after every 20% of the events received, and this is a blocking call.
  • 2.1.X introduces more object allocation, but we have seen reports of the performance issue with previous versions too, which don't use the refactored code.

This is what we can try in the short term:

  • Make sense of the jruby profiling report from the tests
  • Remove the partial ack.
  • Measure the JSON performance on large payload.
  • An async model might be necessary to get more performance.
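For the "measure the JSON performance on large payload" item, a rough, self-contained sketch of the kind of micro-benchmark that could be used. The payload shape and sizes are made up for illustration; they are not the actual Beats frame format.

```ruby
require "json"
require "benchmark"

# Build a large-ish payload resembling a frame of many small events.
events = (1..1000).map { |i| { "line" => "event #{i}", "offset" => i } }
payload = JSON.generate(events)

# Time repeated decodes of the same payload to estimate per-frame cost.
elapsed = Benchmark.realtime do
  100.times { JSON.parse(payload) }
end
puts "decoded 100 x #{payload.bytesize} bytes in #{elapsed.round(3)}s"
```

Varying the event count and comparing against the old binary framing on the same machine would give the comparison the bullet above asks for.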

JRuby profiling report

flat: https://gist.github.com/ph/716c8876efd5ab760c15
graph: https://gist.github.com/
graph in json: https://gist.github.com/ph/0e05e51cff97d251b60c

cc @tsg @urso

(These tests were made with the metric filter instead of PV to be able to correlate data with a user, but PV shows a similar slowdown.)

Tests failing because the port was already in use

Port already in use on Jenkins; I am pretty sure we use a random port in each example.

http://build-eu-00.elastic.co/job/logstash-plugin-input-beats-unit/jdk=JDK8,nodes=metal-pool/12/console

  1) A client using plain tcp connection When transmitting a payload v1 frame when sequence start at 0 supports single element
     Failure/Error: ssl_server = Lumberjack::Beats::Server.new(config_ssl)
     Errno::EADDRINUSE:
       Address already in use - bind - Address already in use
     # ./lib/lumberjack/beats/server.rb:51:in `initialize'
     # ./spec/integration_spec.rb:49:in `(root)'

  2) A client Mutual validation with certificate authorities directly signing a certificate when sequence start at 0 support sending multiple elements in one payload
     Failure/Error: tcp_server = Lumberjack::Beats::Server.new(config_tcp)
     Errno::EADDRINUSE:
       Address already in use - bind - Address already in use
     # ./lib/lumberjack/beats/server.rb:51:in `initialize'
     # ./spec/integration_spec.rb:48:in `(root)'

  3) A client Mutual validation with certificate authorities Mutiple CA different clients when the CA are defined individually in the configuration client from secondary CA when sequence doesn't start at zero supports single element
     Failure/Error: tcp_server = Lumberjack::Beats::Server.new(config_tcp)
     Errno::EADDRINUSE:
       Address already in use - bind - Address already in use
     # ./lib/lumberjack/beats/server.rb:51:in `initialize'
     # ./spec/integration_spec.rb:48:in `(root)'
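One common way to avoid hard-coded ports in specs is to let the OS pick a free ephemeral port by binding to port 0, then releasing it for the test server to use. A minimal sketch (there is still a small race window between close and re-bind, but it avoids collisions across parallel builds):

```ruby
require "socket"

# Ask the kernel for a currently-free ephemeral port.
def random_free_port
  server = TCPServer.new("127.0.0.1", 0) # port 0 = "pick one for me"
  port = server.addr[1]                  # addr => [family, port, host, ip]
  server.close
  port
end
```

The specs could then build their server configs with `random_free_port` instead of a fixed port number.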

IdentityMapCodec has reached 100% capacity

logstash 2.2.2 with built-in logstash-input-beats (2.1.3)
filebeat 1.1.2
arch: many logs => filebeat x18 => logstash x3 => elasticsearch x3 nodes

filebeat continually outputs the following message, and no logs can be sent to logstash:

2016-03-28T14:03:29+08:00 INFO Error publishing events (retrying): EOF

logstash continually outputs the following error message, and logs cannot be written to elasticsearch:

{:timestamp=>"2016-03-28T13:48:10.861000+0800", :message=>"IdentityMapCodec has reached 100% capacity", :current_size=>20000, :upper_limit=>20000, :level=>:error}
{:timestamp=>"2016-03-28T13:48:10.862000+0800", :message=>"Beats input: unhandled exception", :exception=>#<LogStash::Codecs::IdentityMapCodec::IdentityMapUpperLimitException: LogStash::Codecs::IdentityMapCodec::IdentityMapUpperLimitException>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-multiline-2.0.9/lib/logstash/codecs/identity_map_codec.rb:37:in `visit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-multiline-2.0.9/lib/logstash/codecs/identity_map_codec.rb:325:in `check_map_limits'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-multiline-2.0.9/lib/logstash/codecs/identity_map_codec.rb:304:in `record_codec_usage'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-multiline-2.0.9/lib/logstash/codecs/identity_map_codec.rb:295:in `stream_codec'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-multiline-2.0.9/lib/logstash/codecs/identity_map_codec.rb:177:in `accept'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/logstash/inputs/beats_support/connection_handler.rb:58:in `process'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/logstash/inputs/beats_support/connection_handler.rb:33:in `accept'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:429:in `data'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:408:in `read_socket'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:420:in `ack_if_needed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:404:in `read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:261:in `json_data_payload'", 
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:178:in `feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:311:in `compressed_payload'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:178:in `feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:389:in `read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/lumberjack/beats/server.rb:369:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/logstash/inputs/beats_support/connection_handler.rb:33:in `accept'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/logstash/inputs/beats.rb:177:in `handle_new_connection'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/logstash/inputs/beats_support/circuit_breaker.rb:42:in `execute'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/logstash/inputs/beats.rb:177:in `handle_new_connection'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.3/lib/logstash/inputs/beats.rb:133:in `run'"], :level=>:error}

How to fix this?
Thanks

[@metadata][index] bug(s)?

The attached configuration was run on OS X with the following versions. I'm seeing the same behavior on x86_64 as well.

  • elasticsearch 1.7.3
  • logstash 1.5.4 with latest beats input plugin
  • filebeat 1.0.0-beta4

I see two issues or potential sources of confusion:

  1. The beats plugin seems to reserve [@metadata][index]: I can't remove it or redefine it.
  2. The interpolated index does not use the event's timestamp but the shipped timestamp instead.

Regarding item 1, I can work around this by using something other than [@metadata][index] in my config. I recommend addressing this in the docs so others don't hit this unexpected name clash. Regarding item 2, it's not as important in my use case since I define the index explicitly, but I can imagine it impacting quite a few users.
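For item 1, a sketch of the rename workaround mentioned above. The field name `[@metadata][target_index]` is just an example, not something the plugin defines:

```
filter {
  mutate {
    add_field => { "[@metadata][target_index]" => "filebeat-%{+YYYY.MM.dd}" }
  }
}
output {
  elasticsearch {
    index => "%{[@metadata][target_index]}"
  }
}
```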

Hopefully this is enough information to reproduce:

$ ~/Downloads/logstash-1.5.4/bin/logstash -f /tmp/logstash.conf &
Oct 25, 2015 9:25:51 AM org.elasticsearch.plugins.PluginsService <init>
INFO: [logstash-ErikStephens.local-90663-11760] loaded [], sites []
Logstash startup completed

$ rm -f .filebeat; filebeat -e -d publish -c filebeat.yaml
publish.go:185: DBG  create output worker: 0x0, 0x0
publish.go:222: WARN No output is defined to store the topology. The server fields might not be filled.
async.go:95: DBG  create bulk processing worker (interval=1s, bulk size=10000)
crawler.go:77: INFO All prospectors initialised with 0 states to persist
output.go:37: DBG  output worker: no events to publish
output.go:37: DBG  output worker: no events to publish
client.go:58: DBG  send event
preprocess.go:37: DBG  preprocessor
publish.go:87: DBG  Publish: {
  "count": 1,
  "fields": null,
  "fileinfo": {},
  "line": 1,
  "message": "HELLO",
  "offset": 0,
  "shipper": "test",
  "source": "test.log",
  "timestamp": "2015-10-25T16:26:10.413Z",
  "type": "log"
}
preprocess.go:91: DBG  preprocessor forward
output.go:42: DBG  output worker: publish 1 events
output.go:37: DBG  output worker: no events to publish

{
       "message" => "HELLO",
      "@version" => "1",
    "@timestamp" => "2015-01-01T12:00:00.000Z",
          "type" => "log",
         "count" => 1,
        "fields" => nil,
      "fileinfo" => {},
          "line" => 1,
        "offset" => 0,
       "shipper" => "test",
        "source" => "test.log",
     "timestamp" => "2015-10-25T16:26:10.413Z",
            "ts" => "2015-01-01T12:00:00Z",
     "@metadata" => {
        "index" => "filebeat-2015.10.25",
         "type" => "log"
    }
}

$ curl localhost:9200/_cat/indices?v
health status index               pri rep docs.count docs.deleted store.size pri.store.size
yellow open   filebeat-2015.10.25   5   1          1            0        5kb            5kb

cat /tmp/logstash.conf

input {
  stdin {
    type => stdin
  }
  beats {
    type => beats
    port => 12345
  }
}
filter {
  mutate {
    replace => [
      'ts', '2015-01-01T12:00:00Z'
    ]
    remove_field => [
      '[@metadata][index]'
    ]
  }
  date {
    match => [ 'ts', 'ISO8601' ]
  }
}
output {
  stdout {
    codec => rubydebug { metadata => true }
  }
  elasticsearch {
    host => [ 'localhost' ]
    protocol => 'transport'
    cluster => 'elasticsearch'
    action => update
    doc_as_upsert => true
    index => '%{[@metadata][index]}'
    document_id => 'foo'
  }
}

Copy beat.hostname into host

The Beats are sending the hostname on which they run in the beat.hostname field. In Logstash, most input plugins use the host field for this purpose. This is annoying for people who use Logstash with a lot of different sources.

After a bit of discussion on the topic, we decided that the best way forward would be to keep using beat.hostname in the Beats but have the logstash-input-beats plugin copy it to the host field.
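Until the plugin does the copy itself, a hedged sketch of an interim pipeline-level workaround (field names follow the discussion above; the conditional guards against clobbering an existing host):

```
filter {
  # Copy beat.hostname into host only when host isn't already set.
  if ![host] and [beat][hostname] {
    mutate {
      add_field => { "host" => "%{[beat][hostname]}" }
    }
  }
}
```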

plugin update logstash-input-beats major version upgrade confirmation

Plugin update appears to have a bug. I am not sure whether it's related to the plugin itself, the logstash plugin command, or both.

Looks like the plugin manager fetches information directly from RubyGems, which is why it warned me about upgrading to version 2.0.0, but it still proceeded to install version 0.9.6 after my confirmation. See the terminal output below.

Before:

root@elasticsearch:[/opt/logstash]# find . -name "*beats*"                                                                                                                                         
./vendor/bundle/jruby/1.9/specifications/logstash-input-beats-0.9.2.gemspec
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats.rb
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/logstash/inputs/beats.rb
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/logstash-input-beats.gemspec

During:

root@elasticsearch:[/opt/logstash]# bin/plugin update logstash-input-beats                                                                                                                         
You are updating logstash-input-beats to a new version 2.0.0, which may not be compatible with 0.9.2. are you sure you want to proceed (Y/N)?
Y
Updating logstash-input-beats
Updated logstash-input-beats 0.9.2 to 0.9.6

After:

root@elasticsearch:[/opt/logstash]# find . -name "*beats*"                                                                                                                                         
./vendor/bundle/jruby/1.9/cache/logstash-input-beats-0.9.6.gem
./vendor/bundle/jruby/1.9/specifications/logstash-input-beats-0.9.6.gemspec
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.6
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.6/spec/inputs/beats_spec.rb
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.6/spec/lumberjack/beats
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.6/lib/lumberjack/beats.rb
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.6/lib/lumberjack/beats
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.6/lib/logstash/inputs/beats.rb
./vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.6/logstash-input-beats.gemspec

Beats Input: Remote connection closed unsupported protocol 42 warning

I'm seeing this filling up my logstash log:

{:timestamp=>"2016-01-15T11:46:37.690000+0000", :message=>"Beats Input: Remote connection closed", :peer=>"xx.xx.xx.xx:41919", :exception=>#<Lumberjack::Beats::Connection::ConnectionClosed: Lumberjack::Beats::Connection::ConnectionClosed wrapping: Lumberjack::Beats::Parser::UnsupportedProtocol, unsupported protocol 42>, :level=>:warn}

Prior to upgrading to the latest version of the logstash-input-beats plugin, this was being reported as an error. Since upgrading, the logs from filebeat are being processed, but it's completely spamming the logstash.log file with these warnings.

Versions
Filebeats 1.0.1
Logstash-input-beats 2.1.2
Logstash: 2.1.1

Multiline codec sends rogue events down the pipeline

From: https://discuss.elastic.co/t/multiline-codec-is-not-working-with-filebeats-plugin-in-logstash-2-0-0/33768/3

Aggregating events with the multiline codec can cause the plugin to send rogue events down the pipeline.

Default settings used: Filter workers: 2
Beats: SSL Certificate will not be used {:level=>:warn}
Beats: SSL Key will not be used {:level=>:warn}
Logstash startup completed
NoMethodError: undefined method `to_hash_with_metadata' for ["0"]:Array
  encode_with_metadata at /Users/ph/es/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-rubydebug-2.0.3/lib/logstash/codecs/rubydebug.rb:38
                  call at org/jruby/RubyMethod.java:120
                encode at /Users/ph/es/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-rubydebug-2.0.3/lib/logstash/codecs/rubydebug.rb:30
               receive at /Users/ph/es/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-stdout-2.0.2/lib/logstash/outputs/stdout.rb:54
                handle at /Users/ph/es/logstash/lib/logstash/outputs/base.rb:78
           output_func at (eval):22
          outputworker at /Users/ph/es/logstash/lib/logstash/pipeline.rb:252
         start_outputs at /Users/ph/es/logstash/lib/logstash/pipeline.rb:169

logstash config

input {
  beats {
    port => 5555
    codec => multiline {
      max_lines => 2000
      pattern => "\d+"
      negate => true
      what => "previous"
    }
    ssl_certificate => "/Users/ph/es/logstash-config/certificates/logstash-forwarder.crt"
    ssl_key => "/Users/ph/es/logstash-config/certificates/logstash-forwarder.key"
  }
}
output {
  stdout { codec  => rubydebug { metadata => true } }
}

Sample log

70572
something else
70573
something else
70574
something else
7057

Add integration test for packetbeat

We have integration testing for filebeat; I think we should also add the other beats to make sure the input can ingest all of their specific data.

"add_field" and "tags" configuration options have no effect

I am using this configuration with logstash 2.0.0 ...

input {
  beats {
    port => 5044
    add_field => { "application" => "beats" }
    tags => [ "beats" ]
  }
}

output {
  stdout {
    codec  => rubydebug {
      metadata => true
    }
  }
}

... and send some data using 'topbeat'

In the output I see neither the field 'application' nor the tag 'beats'. These would be helpful for creating filters for this input when other inputs are used in parallel.
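As a workaround while the input options have no effect, the same enrichment can be sketched in the filter stage instead. Note this unconditional version applies to every event in the pipeline, not just this input:

```
filter {
  mutate {
    add_field => { "application" => "beats" }
    add_tag   => [ "beats" ]
  }
}
```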

Date conversion

Hello,

I'm trying to run a basic sample with file tailing in filebeat and simply forwarding to logstash. However, I get the following error when logstash receives the message:

The field '@timestamp' must be a (LogStash::Timestamp, not a String (2015-11-23T10:49:43.423Z)>,

What is going wrong here? Is it the plugin? Is it the filebeat configuration?
Thanks !

Running:

logstash-input-beats (0.9.2)

Verify client certificates against CA

Currently, the beats input does not verify the client certificate against the CA. This means that any client can connect to any logstash server and submit arbitrary data. Besides cluttering the logstash server with meaningless data, attackers could try to take servers down by submitting huge amounts of data.

Analogous to filebeat/logstash-forwarder, the client certificate should be checked against the CA on the server side.

See also logstash-plugins/logstash-input-lumberjack#31.
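As a hedged sketch of what server-side client verification could look like once supported, using the `ssl_certificate_authorities` and `ssl_verify_mode` options that later releases of the plugin provide (paths are illustrative):

```
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash.crt"
    ssl_key => "/etc/pki/tls/private/logstash.key"
    # Require a client certificate and verify it against this CA:
    ssl_certificate_authorities => ["/etc/pki/tls/certs/ca.crt"]
    ssl_verify_mode => "force_peer"
  }
}
```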

Failing tests

http://build-eu-1.elasticsearch.org/job/logstash-plugin-input-beats-unit/jdk=JDK8,nodes=metal-pool/lastBuild/console

  27) LogStash::Inputs::BeatsSupport::EventTransformCommon when the `beat.hostname` exist in the event when `host` key exists on the event doesn't override it
     Failure/Error: i.register
     NoMethodError:
       undefined method `value' for nil:NilClass
     # ./lib/logstash/inputs/beats.rb:113:in `register'
     # ./spec/support/shared_examples.rb:14:in `input'
     # ./spec/support/shared_examples.rb:13:in `input'
     # ./spec/inputs/beats_support/event_transform_common_spec.rb:8:in `subject'
     # ./spec/support/shared_examples.rb:52:in `(root)'

  28) LogStash::Inputs::BeatsSupport::EventTransformCommon when the `beat.hostname` exist in the event when `host` key doesn't exist on the event copy the `beat.hostname` to `host` or backward compatibility
     Failure/Error: i.register
     NoMethodError:
       undefined method `value' for nil:NilClass
     # ./lib/logstash/inputs/beats.rb:113:in `register'
     # ./spec/support/shared_examples.rb:14:in `input'
     # ./spec/support/shared_examples.rb:13:in `input'
     # ./spec/inputs/beats_support/event_transform_common_spec.rb:8:in `subject'
     # ./spec/support/shared_examples.rb:43:in `(root)'

  29) LogStash::Inputs::BeatsSupport::EventTransformCommon when the `beast.hotname` doesnt exist on the event doesnt change the value
     Failure/Error: i.register
     NoMethodError:
       undefined method `value' for nil:NilClass
     # ./lib/logstash/inputs/beats.rb:113:in `register'
     # ./spec/support/shared_examples.rb:14:in `input'
     # ./spec/support/shared_examples.rb:13:in `input'
     # ./spec/inputs/beats_support/event_transform_common_spec.rb:8:in `subject'
     # ./spec/support/shared_examples.rb:33:in `(root)'

Finished in 46.71 seconds (files took 2.73 seconds to load)
122 examples, 29 failures, 5 pending

Failed examples:

Please delete!

Apologies for putting this in the wrong git repo. :(
Please delete this if possible/needed. I'm creating the issue in the correct repo now.

logstash for windows error on filebeat shutdown

I'm using filebeat and logstash for logging, and I've noticed a strange error in logstash's output:

Beats input: unhandled exception {:exception=>#<SystemCallError: Unknown error - ????????? ???? ????????????? ???????? ???????????? ???????????>, :backtrace=>["org/jruby/RubyIO.java:3020:in `sysread'", "D:/Repositories/logstashtests/logstash-1.5.4/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:330:in `read_socket'", "D:/Repositories/logstashtests/logstash-1.5.4/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:315:in `run'", "D:/Repositories/logstashtests/logstash-1.5.4/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/logstash/inputs/beats.rb:150:in `invoke'", "org/jruby/RubyProc.java:271:in `call'", "D:/Repositories/logstashtests/logstash-1.5.4/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/concurrent-ruby-0.9.1-java/lib/concurrent/executor/executor_service.rb:515:in `run'", "Concurrent$$JavaExecutorService$$Job_102551874.gen:13:in `run'"], :level=>:error}

My logstash.conf file:

input {
    beats {
        port => 5044
    }
}

output {
    stdout {}
}

My filebeat.yml file:

filebeat:
  prospectors:
    -
      paths:
        - "D:\\Repositories\\logstashtests\\logs.txt"
      type: log
output:
  logstash:
    enabled: true

    hosts: ["127.0.0.1:5044"]

It seems that everything works fine even with this exception. You can find some details in the filebeat repo: elastic/filebeat#145

Beats input crashing

So we've been trying to use this for a couple of days, but I'm seeing these errors a lot more often and they just crash the logstash process.

Logstash: 2.2.0
Beats Input Plugin: 2.1.2

My input.conf for beats is basic:

input {
  beats {
    port => 5044
  }
}
{:timestamp=>"2016-02-12T16:08:45.908000+0000", :message=>"Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.", "exception"=>#<NoMethodError: undefined method `start_with?' for nil:NilClass>, "backtrace"=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-event-2.2.0-java/lib/logstash/event.rb:117:in `[]'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-de_dot-0.1.1/lib/logstash/filters/de_dot.rb:91:in `filter'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-de_dot-0.1.1/lib/logstash/filters/de_dot.rb:90:in `filter'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/filters/base.rb:151:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/filters/base.rb:148:in `multi_filter'", "(eval):1613:in `filter_func'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/pipeline.rb:241:in `filter_batch'", "org/jruby/RubyArray.java:1613:in `each'", "org/jruby/RubyEnumerable.java:852:in `inject'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/pipeline.rb:239:in `filter_batch'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/pipeline.rb:197:in `worker_loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/pipeline.rb:175:in `start_workers'"], :level=>:error}
{:timestamp=>"2016-02-12T16:08:45.986000+0000", :message=>"Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.", "exception"=>#<NoMethodError: undefined method `start_with?' for nil:NilClass>, "backtrace"=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-event-2.2.0-java/lib/logstash/event.rb:117:in `[]'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-de_dot-0.1.1/lib/logstash/filters/de_dot.rb:91:in `filter'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-de_dot-0.1.1/lib/logstash/filters/de_dot.rb:90:in `filter'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/filters/base.rb:151:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/filters/base.rb:148:in `multi_filter'", "(eval):1613:in `filter_func'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/pipeline.rb:241:in `filter_batch'", "org/jruby/RubyArray.java:1613:in `each'", "org/jruby/RubyEnumerable.java:852:in `inject'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/pipeline.rb:239:in `filter_batch'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/pipeline.rb:197:in `worker_loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.0-java/lib/logstash/pipeline.rb:175:in `start_workers'"], :level=>:error}
{:timestamp=>"2016-02-12T16:08:46.097000+0000", :message=>"Beats input: unhandled exception", :exception=>#<Errno::EBADF: Bad file descriptor - Bad file descriptor>, :backtrace=>["org/jruby/RubyIO.java:1331:in `syswrite'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/lumberjack/beats/server.rb:438:in `send_ack'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/lumberjack/beats/server.rb:421:in `ack_if_needed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/lumberjack/beats/server.rb:404:in `read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/lumberjack/beats/server.rb:261:in `json_data_payload'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/lumberjack/beats/server.rb:178:in `feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/lumberjack/beats/server.rb:311:in `compressed_payload'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/lumberjack/beats/server.rb:178:in `feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/lumberjack/beats/server.rb:389:in `read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/lumberjack/beats/server.rb:369:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/logstash/inputs/beats_support/connection_handler.rb:33:in `accept'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/logstash/inputs/beats.rb:177:in `handle_new_connection'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/logstash/inputs/beats_support/circuit_breaker.rb:42:in `execute'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/logstash/inputs/beats.rb:177:in `handle_new_connection'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/logstash/inputs/beats.rb:133:in `run'"], :level=>:error}

ConcurrencyError: No message available

Hello,
I am a user of Packetbeat trying to improve Logstash throughput. I'm using this guide: https://www.elastic.co/blog/logstash-configuration-tuning

The first objective would be to save the packets in a JSON file. Here is the Logstash config file:

input {
  beats {
    port => 5044
  }
}

output {
  file {
    codec => "json_lines"
    path => "/opt/packetbeat_shipper/packets_test.json"
  }
}

The following error message is what I get with this config file.

{:timestamp=>"2015-11-04T09:24:48.686000+0100", :message=>"Beats input: unhandled exception", :exception=>#<ConcurrencyError: No message available>, :backtrace=>["org/jruby/ext/thread/ConditionVariable.java:98:in `wait'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-lumberjack-2.0.4/lib/logstash/sized_queue_timeout.rb:30:in `push'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-lumberjack-2.0.4/lib/logstash/sized_queue_timeout.rb:27:in `push'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/logstash/inputs/beats.rb:110:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-lumberjack-2.0.4/lib/logstash/circuit_breaker.rb:42:in `execute'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/logstash/inputs/beats.rb:109:in `run'", "org/jruby/RubyProc.java:271:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/logstash/inputs/beats.rb:167:in `invoke'", "org/jruby/RubyProc.java:271:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:378:in `data'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:357:in `read_socket'", "org/jruby/RubyProc.java:271:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:369:in `ack_if_needed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:353:in `read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:246:in `json_data_payload'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:163:in `feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:296:in `compressed_payload'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:163:in `feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:338:in `read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/lumberjack/beats/server.rb:315:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.3/lib/logstash/inputs/beats.rb:167:in `invoke'", "org/jruby/RubyProc.java:271:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/concurrent-ruby-0.9.1-java/lib/concurrent/executor/executor_service.rb:515:in `run'", "Concurrent$$JavaExecutorService$$Job_1833200224.gen:13:in `run'"], :level=>:error}

Display better feedback messages in the log when an SSL error occurs (cert expiration or plain-text connection).

I had a problem with filebeat: "msg: ERR SSL client failed to connect with: EOF"

After a long search I found that my issue was that my client cert had expired.
Neither filebeat nor logstash produces any log at INFO, WARNING, or ERROR level that would point you to this, even in debug mode.

All that filebeat logs at debug level is:

2016-04-27T11:25:11+02:00 DBG  connect
2016-04-27T11:25:11+02:00 ERR SSL client failed to connect with: EOF
2016-04-27T11:25:11+02:00 INFO Connecting error publishing events (retrying): EOF
2016-04-27T11:25:11+02:00 INFO send fail
2016-04-27T11:25:11+02:00 INFO backoff retry: 1s

I think at least one, or ideally both, of those endpoints should log this: WARNING for logstash, ERROR for filebeat.
It took me a lot of time, especially since I had checked with openssl s_client and it too reported that everything was just OK before disconnecting.

#openssl s_client -connect loremipsum:5000 -CAfile CA-LOGS.crt -cert C-logger.crt -key C-logger.key
...
...

---
SSL handshake has read 3923 bytes and written 3831 bytes

---
New, TLSv1/SSLv3, Cipher is ECDHE-RSA-AES128-SHA256
Server public key is 2048 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
SSL-Session:
    Protocol  : TLSv1.2
    Cipher    : ECDHE-RSA-AES128-SHA256
    Session-ID: 5720<data>0ABC
    Session-ID-ctx:
    Master-Key: F265<data>5BF5
    Key-Arg   : None
    PSK identity: None
    PSK identity hint: None
    SRP username: None
    Start Time: 1461749495
    Timeout   : 300 (sec)
    Verify return code: 0 (ok)

---

cert expired

# openssl x509 -in C-logger.crt -text
Certificate:
    Data:
        Version: 3 (0x2)
        Serial Number: 4 (0x4)
    Signature Algorithm: sha256WithRSAEncryption
        Issuer: C=PL, <..... etc... >
        Validity
            Not Before: Apr 11 00:00:00 2016 GMT
            Not After : **Apr 20 23:59:59 2016** GMT
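A quick way to surface this condition, sketched here in Ruby with the stdlib OpenSSL bindings: the `cert_expired?` helper and the throwaway self-signed certificate below are illustrative only, not part of filebeat or logstash.

```ruby
require 'openssl'

# Hypothetical helper: has this PEM certificate already expired at `now`?
def cert_expired?(pem, now = Time.now)
  cert = OpenSSL::X509::Certificate.new(pem)
  now > cert.not_after
end

# Build a throwaway self-signed cert that expired yesterday, for illustration.
key = OpenSSL::PKey::RSA.new(2048)
cert = OpenSSL::X509::Certificate.new
cert.version = 2
cert.serial = 4
cert.subject = cert.issuer = OpenSSL::X509::Name.parse("/C=PL/CN=C-logger")
cert.public_key = key.public_key
cert.not_before = Time.now - 10 * 86_400
cert.not_after  = Time.now - 1 * 86_400
cert.sign(key, OpenSSL::Digest::SHA256.new)

puts cert_expired?(cert.to_pem)
```

Logging the result of a check like this on handshake failure would have pointed straight at the expired client cert.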

INFO:
filebeat version 1.2.1 and 1.2.2 (amd64)
Operating System: debian wheezy+jessie, ubuntu 15.10+16.04LTS - platform independent?
Config File:

filebeat:
  prospectors:
    -
      paths:
        - /var/log/auth.log
        - /var/log/syslog
      document_type: syslog
      fields:
        host: nicehost
  registry_file: /var/log/filebeat/.filebeat
output:
  logstash:
    hosts: ["loremipsum:5000"]

    # Set gzip compression level.
    compression_level: 3

    #index: syslog

    tls:
      certificate_authorities: ["/etc/filebeat/CA-LOGS.crt"]
      certificate: "/etc/filebeat/C-logger.crt"
      certificate_key: "/etc/filebeat/C-logger.key"
      insecure: false

  #debug only
  #console:
    # Pretty print json event
    #pretty: true

logging:
  # Available log levels are: critical, error, warning, info, debug
  level: debug

  #enable file rotation with default configuration
  to_files: true

  # do not log to syslog
  to_syslog: false

  files:
    path: /var/log/filebeat
    name: filebeat.log
    keepfiles: 7

Great software anyway - best regards.

Refactor goals

  • Remove the multiple levels of blocks inside the input itself and the library; there are way too many levels of indirection, and they make the code hard to read and to test.
  • Replace the SizedQueue with timeout with one of the Java standard library queues.
  • Move the connection thread handling up into the library; the input shouldn't have to deal with that.
  • Can we move to a completely non-blocking way of handling connections?
  • Add more tests; we mostly rely on integration testing here.

unhandled exception Broken Pipe

Hello,
I'm testing the master branch of Packetbeat with Logstash 1.5.4. When there is only one instance of Packetbeat running, everything is fine. However, when multiple instances are running and feeding Logstash, I often get the following error. Also, documents are missing in Elasticsearch.

Beats input: unhandled exception {:exception=>#<Errno::EPIPE: Broken pipe - Broken pipe>, :backtrace=>["org/jruby/RubyIO.java:1334:in syswrite'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:379:in send_ack'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:362:in ack_if_needed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:345:in read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:246:in json_data_payload'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:163:in feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:296:in compressed_payload'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:163:in feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:330:in read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/lumberjack/beats/server.rb:315:in run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.2/lib/logstash/inputs/beats.rb:150:in invoke'", "org/jruby/RubyProc.java:271:in call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/concurrent-ruby-0.9.1-java/lib/concurrent/executor/executor_service.rb:515:in run'", "Concurrent$$JavaExecutorService$$Job_2076907911.gen:13:in run'"], :level=>:error}

Threading issue under high volume, ThreadError: Mutex is not locked

:message=>"Beats input: unhandled exception", :exception=>#<ThreadError: Mutex is not locked>, :backtrace=>["org/jruby/ext/thread/Mutex.java:106:in `unlock'", "org/jruby/ext/thread/Mutex.java:151:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-lumberjack-2.0.5/lib/logstash/sized_queue_timeout.rb:27:in `push'"...

This also raises #<ConcurrencyError: No message available>, which in turn causes a Concurrent::RejectedExecutionError.

A few notes: the SizedQueue from the logstash-input-lumberjack plugin seems to conflict with the one in the beats input. This shouldn't be a big problem since both copies of the code are identical, but it may cause issues when debugging or updating, so I will rename it to make the separation clearer.

Concerning the bug, I believe there is an issue with the ConditionVariable inside the SizedQueue with timeout when a lot of servers are connected to logstash and the platform is under high load.

I think the best solution is to remove the custom code here and rely on a Java queue, which already supports timeout values, or to remove the timeout completely. This will have the benefit of reducing the custom code and making sure the class is free of concurrency bugs.

I haven't been able to reproduce it locally, but I still think removing that code and using the one shipped with JRuby is the best bet here.

A small story: I was thinking about using it before, but back then we were avoiding pure Java classes.
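What the custom SizedQueue-with-timeout does can be sketched in plain Ruby with a Mutex and two ConditionVariables; the class and method names below are illustrative, not the plugin's actual API.

```ruby
require 'thread'

# Minimal bounded queue whose push gives up after a timeout, sketching the
# behavior of the plugin's custom sized queue with timeout.
class BoundedQueueWithTimeout
  class TimeoutError < StandardError; end

  def initialize(max_size)
    @max_size  = max_size
    @items     = []
    @mutex     = Mutex.new
    @not_full  = ConditionVariable.new
    @not_empty = ConditionVariable.new
  end

  # Block until there is room, raising TimeoutError after `timeout` seconds.
  def push(item, timeout = 2)
    @mutex.synchronize do
      deadline = Time.now + timeout
      while @items.size >= @max_size
        remaining = deadline - Time.now
        raise TimeoutError, "queue full" if remaining <= 0
        @not_full.wait(@mutex, remaining)
      end
      @items << item
      @not_empty.signal
    end
  end

  def pop
    @mutex.synchronize do
      @not_empty.wait(@mutex) while @items.empty?
      item = @items.shift
      @not_full.signal
      item
    end
  end
end
```

The key point is that the predicate is rechecked in a loop after every wakeup, so a spurious or mistimed signal cannot corrupt the queue state the way the reported Mutex/ConditionVariable bug does.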

Add integration test for LSF to beats input

Discussed with @jordansissel earlier. He likes the idea of officially supporting LSF -> LS beats input to ease migration. He has tested this briefly before and it works because LSF and beats use the same underlying protocol, but would like to see us add integration tests so that we can officially confirm and document it as a supported path.

Error

When I run Filebeat -> LS, after some time error messages start to pop up. It seems like all log messages are read until the following error appears, which is from then on repeated multiple times (every x seconds):

Beats input: unhandled exception {:exception=>#<ZeroDivisionError: divided by 0>, :backtrace=>["org/jruby/RubyFixnum.java:632:in `%'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/lumberjack/beats/server.rb:421:in `ack?'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/lumberjack/beats/server.rb:362:in `ack_if_needed'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/lumberjack/beats/server.rb:345:in `read_socket'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/lumberjack/beats/server.rb:246:in `json_data_payload'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/lumberjack/beats/server.rb:163:in `feed'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/lumberjack/beats/server.rb:296:in `compressed_payload'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/lumberjack/beats/server.rb:163:in `feed'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/lumberjack/beats/server.rb:330:in `read_socket'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/lumberjack/beats/server.rb:315:in `run'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-beats-0.9.1/lib/logstash/inputs/beats.rb:150:in `invoke'", "org/jruby/RubyProc.java:271:in `call'", "/Users/ruflin/Dev/test-setup/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/concurrent-ruby-0.9.1-java/lib/concurrent/executor/executor_service.rb:515:in `run'", "Concurrent$$JavaExecutorService$$Job_379996920.gen:13:in `run'"], :level=>:error}

Versions:

  • Filebeat 1.0.0-beta4
  • Logstash 1.5.4
  • Elasticsearch 1.7.2

Provide an option to retain the message field when a codec is used

target_field = target_field_for_codec ? map.delete(target_field_for_codec) : nil

Currently, when a codec is used, we discard the message field (by default). It would be nice to provide a configurable option for the end user to specify whether they want to retain the original message field or not:

  • For validation purposes (so they can see the parsed original)
  • For consistency since other log types that do not use a codec will have the message field (and they will be able to do a message:<terms> search across them).
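The requested behavior can be sketched independently of the plugin's classes; `retain_message` below is a hypothetical option name, not an existing setting.

```ruby
# Sketch: when a codec will decode the target field, either delete the raw
# value (today's behavior) or keep it in place when `retain_message` is set.
def extract_target_field(map, target_field_for_codec, retain_message: false)
  return nil unless target_field_for_codec
  if retain_message
    map[target_field_for_codec]          # leave the raw message on the event
  else
    map.delete(target_field_for_codec)   # current behavior: discard it
  end
end
```

With `retain_message: true` the event keeps its `message` field after decoding, so `message:<terms>` searches work uniformly across codec and non-codec log types.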

Clarify use of TCP or UDP

Here we mention;

Logstash must also be configured to use TCP for Logstash input.

While here we don't mention anything.

It'd be worth further clarifying that filebeat uses TCP only to ensure delivery, rather than having it as a footnote.

"Beats Input: Remote connection closed" "IOError, Connection reset by peer"

I am getting logs full of these errors.

{:timestamp=>"2016-03-28T23:37:06.844000-0500", :message=>"Beats Input: Remote connection closed", :peer=>"xxx.xxx.xxx.xxx:xxxxx", :exception=>#<Lumberjack::Beats::Connection::ConnectionClosed: Lumberjack::Beats::Connection::ConnectionClosed wrapping: IOError, Connection reset by peer>, :level=>:warn}
Logstash Version: 2.2.2
WinLogBeat Version: 1.1.2

Logstash Config:

input {
    beats {
        port => 5043
        ssl => true
        ssl_certificate => "/path/to/cert.crt"
        ssl_key => "/path/to/key.key"
    }
}

filter {                                                                                                                                                                               
    if [type] == "syslog" {                                                                                                                                                              
        grok {                                                                                                                                                                             
            match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }  
            add_field => [ "received_at", "%{@timestamp}" ]                                                                                                                                  
            add_field => [ "received_from", "%{host}" ]                                                                                                                                      
        }                                                                                                                                                                                  
        syslog_pri { }                                                                                                                                                                     
        date {                                                                                                                                                                             
            match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]                                                                                                            
        }                                                                                                                                                                                  
    }                                                                                                                                                                                    
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        sniffing => true
        manage_template => false
        index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
        document_type => "%{[@metadata][type]}"
    }
}                                                                                                                                                                                      

WinLogBeat Config:

winlogbeat:
  registry_file: C:/ProgramData/winlogbeat/.winlogbeat.yml
  event_logs:
    - name: TEMP
output:
  logstash:
    hosts: ["xxx.xxx.xxx.xxx:5043"]
    index: winlogbeat
    tls:
      certificate_authorities: ["path/to/cert.crt"]
logging:
  to_files: true
    path: C:/ProgramData/winlogbeat/Logs
    rotateeverybytes: 10485760 # = 10MB
    level: info

I am working through some other issues as well, but I am unsure if they are related. I can give information on those issues as well if you think they might be relevant.

Logstash gradually starts sending fewer and fewer logs to ES, and eventually kills itself when it runs out of memory. My logs are full of these errors, and then once Logstash gets closer to killing itself I start seeing the following:

{:timestamp=>"2016-03-29T00:36:58.085000-0500", :message=>"execution expired", :class=>"MultiJson::ParseError", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/jrjackson-0.3.8/lib/jrjackson/jrjackson.rb:87:in `is_time_string?'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/jrjackson-0.3.8/lib/jrjackson/jrjackson.rb:85:in `is_time_string?'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/jrjackson-0.3.8/lib/jrjackson/jrjackson.rb:34:in `load'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapters/jr_jackson.rb:11:in `load'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapter.rb:21:in `load'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json.rb:119:in `load'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/serializer/multi_json.rb:24:in `load'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:259:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/sniffer.rb:32:in `hosts'", "org/jruby/ext/timeout/Timeout.java:147:in `timeout'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/sniffer.rb:31:in `hosts'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:76:in `reload_connections!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:72:in `sniff!'", 
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/RubyKernel.java:1479:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:59:in `start_sniffing!'"], :level=>:error}

After seeing a few of the above errors, I then see this as the final few errors:

{:timestamp=>"2016-03-29T00:40:09.785000-0500", :message=>"Connection pool shut down", :class=>"Manticore::ClientStoppedException", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.5.2-java/lib/manticore/response.rb:37:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.5.2-java/lib/manticore/response.rb:79:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.5.2-java/lib/manticore/response.rb:256:in `call_once'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.5.2-java/lib/manticore/response.rb:153:in `code'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:71:in `perform_request'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:201:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/sniffer.rb:32:in `hosts'", "org/jruby/ext/timeout/Timeout.java:147:in `timeout'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/sniffer.rb:31:in `hosts'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:76:in `reload_connections!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:72:in `sniff!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", 
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/RubyKernel.java:1479:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:59:in `start_sniffing!'"], :level=>:error}

If there is any other information I can get you just let me know.

"Beats Input: Remote connection closed" Connection::ConnectionClosed wrapping: EOFError

{:timestamp=>"2016-03-03T17:22:34.579000+0530", :message=>"Beats Input: Remote connection closed", :peer=>"IP:42715", :exception=>#<Lumberjack::Beats::Connection::ConnectionClosed: Lumberjack::Beats::Connection::ConnectionClosed wrapping: EOFError, End of file reached>, :level=>:warn}

This has only appeared after updating to the latest version.

I have done a clean install on another server. What can be done about this? If I am not able to ship in production, it will be a difficult task.

Multiline codec with beats-input concatenates multilines and adds it to every line

I did some local testing to get this to work but was not able to; instead I discovered this weird behavior.

test log

2015-11-10 10:14:38,907 line 1
line 1.1
2015-11-10 10:16:38,907 line 2
line 2.1
line 2.2
line 2.3
2015-11-10 10:18:38,907 line 3
2015-11-10 10:20:38,907 line 4
line 4.1
2015-11-10 10:22:38,907 line 5
2015-11-10 10:24:38,907 line 6
2015-11-10 10:26:38,907 line 7
2015-11-10 10:28:38,907 line 8
line 8.1
2015-11-10 10:30:38,902 line 9

filebeat config

filebeat:
  prospectors:
    -
      paths:
        - "/tmp/test1.log"
      input_type: log

  registry_file: "/tmp/test_registry"

output:
  logstash:
    hosts: ["localhost:5044"]

shipper:

logging:
  files:
    rotateeverybytes: 10485760 # = 10MB

logstash config

input {
    beats {
        port => 5044
        #every line not starting with 2015 belongs to the previous line
        codec => multiline {
            pattern => "^2015"
            negate => true
            what => previous
        }
    }
}
output { stdout {} }

logstash output

Default settings used: Filter workers: 2
Logstash startup completed
2015-11-18T10:44:12.849Z %{host} 2015-11-10 10:14:38,907 line 1
line 1.1
2015-11-18T10:44:12.849Z %{host} 2015-11-10 10:14:38,907 line 1
line 1.1
line 2.1
line 2.2
line 2.3
2015-11-18T10:44:12.849Z %{host} 2015-11-10 10:14:38,907 line 1
line 1.1
line 2.1
line 2.2
line 2.3
2015-11-18T10:44:12.849Z %{host} 2015-11-10 10:14:38,907 line 1
line 1.1
line 2.1
line 2.2
line 2.3
line 4.1
2015-11-18T10:44:12.849Z %{host} 2015-11-10 10:14:38,907 line 1
line 1.1
line 2.1
line 2.2
line 2.3
line 4.1
2015-11-18T10:44:12.849Z %{host} 2015-11-10 10:14:38,907 line 1
line 1.1
line 2.1
line 2.2
line 2.3
line 4.1
2015-11-18T10:44:12.849Z %{host} 2015-11-10 10:14:38,907 line 1
line 1.1
line 2.1
line 2.2
line 2.3
line 4.1
2015-11-18T10:44:12.849Z %{host} 2015-11-10 10:14:38,907 line 1
line 1.1
line 2.1
line 2.2
line 2.3
line 4.1
line 8.1

versions
logstash-2.0
logstash-input-beats (2.0.0)
logstash-codec-multiline (2.0.3)
filebeat-rc2

works as expected with logstash-input-stdin

Default settings used: Filter workers: 2
Logstash startup completed
2015-11-18T10:58:20.199Z z-lxelktest01 2015-11-10 10:14:38,907 line 1
line 1.1
2015-11-18T10:58:20.203Z z-lxelktest01 2015-11-10 10:16:38,907 line 2
line 2.1
line 2.2
line 2.3
2015-11-18T10:58:20.219Z z-lxelktest01 2015-11-10 10:18:38,907 line 3
2015-11-18T10:58:20.223Z z-lxelktest01 2015-11-10 10:20:38,907 line 4
line 4.1
2015-11-18T10:58:20.224Z z-lxelktest01 2015-11-10 10:22:38,907 line 5
2015-11-18T10:58:20.224Z z-lxelktest01 2015-11-10 10:24:38,907 line 6
2015-11-18T10:58:20.225Z z-lxelktest01 2015-11-10 10:26:38,907 line 7
2015-11-18T10:58:20.236Z z-lxelktest01 2015-11-10 10:28:38,907 line 8
line 8.1
Logstash shutdown completed
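For reference, the grouping that `negate => true` / `what => previous` should produce can be sketched as a standalone function (this is not the codec's actual code, just the expected contract): every line not matching the pattern is appended to the previous event.

```ruby
# Group continuation lines (those NOT matching `pattern`) onto the previous
# event, mirroring multiline's negate => true / what => previous behavior.
def group_multiline(lines, pattern = /^2015/)
  events = []
  lines.each do |line|
    if line =~ pattern || events.empty?
      events << line.dup        # a matching line starts a new event
    else
      events.last << "\n" << line  # a continuation joins the previous event
    end
  end
  events
end

sample = [
  "2015-11-10 10:14:38,907 line 1",
  "line 1.1",
  "2015-11-10 10:16:38,907 line 2",
  "line 2.1",
  "line 2.2",
  "line 2.3",
  "2015-11-10 10:18:38,907 line 3",
]
events = group_multiline(sample)  # 3 events for this sample
```

Each continuation attaches only to its own preceding event; the bug report above shows the beats input instead re-emitting earlier events with the new continuations appended.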

gemspec doesn't require concurrent-ruby version 0.9.1

https://github.com/logstash-plugins/logstash-input-beats/blob/master/logstash-input-beats.gemspec#L25

s.add_runtime_dependency "concurrent-ruby"

This will install the latest version (0.9.2 right now), but logstash requires 0.9.1, causing:

RuntimeError: Logstash expects concurrent-ruby version 0.9.1 and version 0.9.2 is installed, please verify this patch: /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/patches/silence_concurrent_ruby_warning.rb
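A sketch of the fix: pin the dependency in the gemspec so bundler resolves the release logstash-core expects. The surrounding specification fields here are illustrative, only the pinned `add_runtime_dependency` line is the point.

```ruby
# Illustrative gemspec fragment: pin concurrent-ruby instead of letting the
# unconstrained dependency pull in 0.9.2.
spec = Gem::Specification.new do |s|
  s.name    = "logstash-input-beats"
  s.version = "2.1.2"
  s.summary = "Receives events from the Elastic Beats framework"

  # Was: s.add_runtime_dependency "concurrent-ruby"
  s.add_runtime_dependency "concurrent-ruby", "0.9.1"
end
```

Alternatively a pessimistic constraint could track logstash-core's own pin, but an exact pin is the most direct way to silence the RuntimeError above.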

Make test faster by removing sleep

The test suite contains sleep calls to make sure the threads are correctly initialized; this makes the tests slow, harder to understand, and more fragile. We should remove all of them.

zlib buffer errors

{:timestamp=>"2015-12-08T17:20:11.966000-0500", :message=>"Beats input: unhandled exception", :exception=>#<Zlib::BufError: buffer error>, :backtrace=>["org/jruby/ext/zlib/ZStream.java:134:in `finish'", "org/jruby/ext/zlib/JZlibInflate.java:72:in `inflate'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/jls-lumberjack-0.0.26/lib/lumberjack/server.rb:292:in `compressed_payload'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/jls-lumberjack-0.0.26/lib/lumberjack/server.rb:163:in `feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/jls-lumberjack-0.0.26/lib/lumberjack/server.rb:296:in `compressed_payload'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/jls-lumberjack-0.0.26/lib/lumberjack/server.rb:163:in `feed'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.0.3/lib/lumberjack/beats/server.rb:342:in `read_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.0.3/lib/lumberjack/beats/server.rb:319:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.0.3/lib/logstash/inputs/beats.rb:184:in `invoke'", "org/jruby/RubyProc.java:271:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/concurrent-ruby-0.9.2-java/lib/concurrent/executor/executor_service.rb:515:in `run'", "Concurrent$$JavaExecutorService$$Job_1486431032.gen:13:in `run'"], :level=>:error}

Seem to be getting this continuously on some nginx logs

Input is just

    beats {
      port               => 12348
      codec              => json
    }

And the nginx logs are JSON formatted. I'll see if I can create a reproducible case.

Versions:
logstash 2.1.0-1
logstash-input-beats (2.0.3)
filebeat 1.0.0

Errors in logstash since upgrade to 2.2

After upgrading logstash to the newest version that came out today, I'm seeing these messages in the logs constantly using the beats input plugin.

{:timestamp=>"2016-02-06T06:21:15.454000+0000", :message=>"Beats Input: Remote connection closed", :peer=>"10.0.5.242:11572", :exception=>#<Lumberjack::Beats::Connection::ConnectionClosed: Lumberjack::Beats::Connection::ConnectionClosed wrapping: EOFError, End of file reached>, :level=>:warn}
{:timestamp=>"2016-02-06T06:21:19.290000+0000", :message=>"Beats Input: Remote connection closed", :peer=>"10.0.1.169:28246", :exception=>#<Lumberjack::Beats::Connection::ConnectionClosed: Lumberjack::Beats::Connection::ConnectionClosed wrapping: EOFError, End of file reached>, :level=>:warn}

TCP connections not being closed causes FD leak

As there is not (currently) a keepalive signal from beats or a timeout in the beats input, connections remain open indefinitely, leading to a file descriptor leak: over time a logstash agent running logstash-input-beats may accumulate thousands of open TCP connections, eventually running into the system limit for file descriptors.
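One mitigation, sketched here in plain Ruby outside the plugin (the helper name and timeout value are illustrative), is to close any connection that stays idle past a deadline so dead peers cannot pin file descriptors forever.

```ruby
require 'socket'

# Wait up to `idle_timeout` seconds for data on `sock`; if nothing arrives,
# close the socket to reclaim its file descriptor and return nil.
def drain_or_close(sock, idle_timeout = 900)
  if IO.select([sock], nil, nil, idle_timeout)
    sock.readpartial(16_384)
  else
    sock.close
    nil
  end
end

# Demonstration with a local socket pair standing in for a beats connection.
a, b = UNIXSocket.pair
b.write("hi")
```

A real server would run this per connection (or track a last-activity timestamp and sweep stale sockets), but the effect is the same: an idle or dead peer is reaped instead of holding an FD indefinitely.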

Put TLS connection info in message metadata

Hey Y'all!

With the new @metadata field it seems like Logstash could expose some potentially-useful info regarding the status of the TLS connection there, for cases where certificate-based verification of message sources might be important. For example, when ssl_verify_mode => 'peer', a beat could connect with a client cert or without one, and it might be nice to know:

{
  "message": "Test message",
  "@metadata": {
    "tls_client_status": "verified",
    "tls_subject_cn": "some-server-zzzz.company.com"
  }
}

Or,

"@metadata": {
  "tls_client_status": "unverified"
}

Bug in `EventTransformCommon` when using the multiline codec

Exception:

Multiline: flush downstream error {:exception=>#<ArgumentError: wrong number of arguments calling `config` (0 for 1)>

From Logstash issue 5084:

  • beats-input handle_new_connection's ensure block calls flush with a block
  • the multiline codec's flush yields events to the block given
  • the block callback calls transformer.transform
  • DecodedEventTransform#transform calls codec_name
  • EventTransformCommon#codec_name calls @input.codec.base_codec.class.config

It should be @input.codec.base_codec.class.config_name.

https://github.com/logstash-plugins/logstash-input-beats/blob/master/lib/logstash/inputs/beats_support/event_transform_common.rb#L36
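
A minimal sketch (hypothetical class, loosely mirroring the plugin DSL) of why the wrong call blows up with "0 for 1": `config` is the class-level DSL method that *declares* an option and requires a name argument, while `config_name` *returns* the plugin's registered name.

```ruby
class FakeCodec
  def self.config_name
    'multiline'
  end

  def self.config(name, opts = {})
    # would declare the option `name` in the real DSL
  end
end

err = nil
begin
  FakeCodec.config # the buggy call in EventTransformCommon#codec_name
rescue ArgumentError => e
  err = e
end
puts err.class             # ArgumentError: wrong number of arguments
puts FakeCodec.config_name # the intended call
```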

Beats input blocked

{:timestamp=>"2016-03-10T18:09:00.402000-0500", :message=>"CircuitBreaker::rescuing exceptions", :name=>"Beats input", :exception=>LogStash::Inputs::Beats::InsertingToQueueTakeTooLong, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:00.408000-0500", :message=>"CircuitBreaker::rescuing exceptions", :name=>"Beats input", :exception=>LogStash::Inputs::Beats::InsertingToQueueTakeTooLong, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:00.413000-0500", :message=>"Beats input: The circuit breaker has detected a slowdown or stall in the pipeline, the input is closing the current connection and rejecting new connection until the pipeline recover.", :exception=>LogStash::Inputs::BeatsSupport::CircuitBreaker::HalfOpenBreaker, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:00.408000-0500", :message=>"CircuitBreaker::rescuing exceptions", :name=>"Beats input", :exception=>LogStash::Inputs::Beats::InsertingToQueueTakeTooLong, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:00.415000-0500", :message=>"Beats input: The circuit breaker has detected a slowdown or stall in the pipeline, the input is closing the current connection and rejecting new connection until the pipeline recover.", :exception=>LogStash::Inputs::BeatsSupport::CircuitBreaker::HalfOpenBreaker, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:00.417000-0500", :message=>"Beats input: The circuit breaker has detected a slowdown or stall in the pipeline, the input is closing the current connection and rejecting new connection until the pipeline recover.", :exception=>LogStash::Inputs::BeatsSupport::CircuitBreaker::HalfOpenBreaker, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:00.425000-0500", :message=>"CircuitBreaker::rescuing exceptions", :name=>"Beats input", :exception=>LogStash::Inputs::Beats::InsertingToQueueTakeTooLong, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:00.427000-0500", :message=>"Beats input: The circuit breaker has detected a slowdown or stall in the pipeline, the input is closing the current connection and rejecting new connection until the pipeline recover.", :exception=>LogStash::Inputs::BeatsSupport::CircuitBreaker::HalfOpenBreaker, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:06.457000-0500", :message=>"CircuitBreaker::rescuing exceptions", :name=>"Beats input", :exception=>LogStash::Inputs::Beats::InsertingToQueueTakeTooLong, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:06.459000-0500", :message=>"Beats input: The circuit breaker has detected a slowdown or stall in the pipeline, the input is closing the current connection and rejecting new connection until the pipeline recover.", :exception=>LogStash::Inputs::BeatsSupport::CircuitBreaker::HalfOpenBreaker, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:07.448000-0500", :message=>"CircuitBreaker::rescuing exceptions", :name=>"Beats input", :exception=>LogStash::Inputs::Beats::InsertingToQueueTakeTooLong, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:07.450000-0500", :message=>"Beats input: The circuit breaker has detected a slowdown or stall in the pipeline, the input is closing the current connection and rejecting new connection until the pipeline recover.", :exception=>LogStash::Inputs::BeatsSupport::CircuitBreaker::HalfOpenBreaker, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:07.451000-0500", :message=>"CircuitBreaker::rescuing exceptions", :name=>"Beats input", :exception=>LogStash::Inputs::Beats::InsertingToQueueTakeTooLong, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:07.453000-0500", :message=>"Beats input: The circuit breaker has detected a slowdown or stall in the pipeline, the input is closing the current connection and rejecting new connection until the pipeline recover.", :exception=>LogStash::Inputs::BeatsSupport::CircuitBreaker::HalfOpenBreaker, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:08.453000-0500", :message=>"Beats input: the pipeline is blocked, temporary refusing new connection.", :reconnect_backoff_sleep=>0.5, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:08.484000-0500", :message=>"CircuitBreaker::Open", :name=>"Beats input", :level=>:warn}
{:timestamp=>"2016-03-10T18:09:08.486000-0500", :message=>"Beats input: The circuit breaker has detected a slowdown or stall in the pipeline, the input is closing the current connection and rejecting new connection until the pipeline recover.", :exception=>LogStash::Inputs::BeatsSupport::CircuitBreaker::OpenBreaker, :level=>:warn}
...
{:timestamp=>"2016-03-10T18:09:55.256000-0500", :message=>"Beats input: the pipeline is blocked, temporary refusing new connection.", :reconnect_backoff_sleep=>0.5, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:55.757000-0500", :message=>"Beats input: the pipeline is blocked, temporary refusing new connection.", :reconnect_backoff_sleep=>0.5, :level=>:warn}
{:timestamp=>"2016-03-10T18:09:55.780000-0500", :message=>"The error reported is: \n  execution expired"}

input {
  beats {
    # The port to listen on
    port => 9001
    tags => ['beats']
  }
}

logstash 2.2.2
logstash-input-beats 2.1.4
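
For context, the state machine behind those messages can be sketched as a tiny circuit breaker (hypothetical names and thresholds; the real implementation lives in beats_support/circuit_breaker.rb): closed until enough failures accumulate, then open, then half-open once the retry window elapses.

```ruby
class TinyBreaker
  def initialize(threshold: 3, retry_window: 0.1)
    @threshold    = threshold
    @retry_window = retry_window
    @failures     = 0
    @opened_at    = nil
  end

  def state
    return :closed if @failures < @threshold
    (Time.now - @opened_at) >= @retry_window ? :half_open : :open
  end

  def record_failure
    @failures += 1
    @opened_at ||= Time.now if @failures >= @threshold
  end
end

breaker = TinyBreaker.new
3.times { breaker.record_failure } # e.g. InsertingToQueueTakeTooLong
tripped = breaker.state            # open: new connections are rejected
sleep 0.2
retried = breaker.state            # half-open: the breaker will retry
puts tripped
puts retried
```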

When running the tests inside logstash

This test was failing when we were running all the plugin tests under Logstash.

1) LogStash::Filters::Date fill next year if january events arrive in december "Jan 01 01:00:00" when processed
     Failure/Error: Unable to find matching line from backtrace
     NoMethodError:
       undefined method `now' for Time:Class
     # ./vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/logstash/inputs/beats_support/circuit_breaker.rb:93:in `state'
     # ./vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/logstash/inputs/beats_support/circuit_breaker.rb:60:in `closed?'
     # ./vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/lib/logstash/inputs/beats.rb:126:in `run'
     # ./vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.1.2/spec/inputs/beats_spec.rb:93:in `(root)'
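
For illustration only (this is not what the date filter spec does), a leaked stub or botched teardown that removes `Time.now` reproduces the same NoMethodError in any later code calling it, like circuit_breaker.rb's `state`:

```ruby
# Simulate a stub that was never restored: Time.now disappears.
class << Time
  undef_method :now
end

message = nil
begin
  Time.now
rescue NoMethodError => e
  message = e.message # "undefined method `now' ..." (wording varies by Ruby)
end
puts message
```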

type field not being used

Tested using LS 2.2.2's beats input.

input {
  beats {
    port => 5044
    type => "beats"
  }
}

The type field specified is not being honored.

https://www.elastic.co/guide/en/logstash/2.2/plugins-inputs-beats.html#plugins-inputs-beats-type

When sending events from Filebeat, it retains the default "log" value as the type value even though type is set to "beats" in the LS beats input.

To get this to work, I ended up setting document_type at the beats level, and just let it pass all the way through to the LS pipeline.

      # Type to be published in the 'type' field. For Elasticsearch output,
      # the type defines the document type these entries should be stored
      # in. Default: log
      document_type: my_beats_type

Not sure if this is the intended behavior (the LS beats input ignoring its type parameter); if so, let's update our documentation accordingly.

@timestamp is not overwritten if it's present in the message and decoded with the json codec

Hello,
This is my first issue on GitHub; sorry if I'm not clear enough or if I'm not posting the issue in the right place.

Filebeat is parsing a file that produces a JSON document on each line.
If I send the result to a file I get something like this:

{
"@metadata":{"beat":"filebeat","type":"nts-event"},
"@timestamp":"2015-12-08T10:07:52.368Z",
"beat":{"hostname":"vi-mut-col-301","name":"vi-mut-col-301"},
"count":1,
"input_type":"log",
"message":"{\"EventType\":\"EventRegistered\",\"@timestamp\":\"2015-12-08T11:07:44.021+01:00\",\"TServer\":\"LM_D_SSA_NAT_301\",\"SW\":\"SW_LM_AGT_NAT_301\",\"ConnID\":\"null\",\"ThisDN\":\"SW_LM_AGT_NAT_301::\",\"ReferenceID\":\"2300\",\"EventSequenceNumber\":\"16897\",\"Extensions\":{},\"timeTs\":\"2015-12-08 11:07:44\"}",
"offset":12180,
"producer":"collecteurjson",
"source":"/var/product/genesys/data/Json-lm.log",
"subject":"lm-int",
"type":"nts-event"
}

As you can see, Filebeat puts its own @timestamp key (2015-12-08T10:07:52.368Z), and there is also a @timestamp key (2015-12-08T11:07:44.021+01:00) inside the JSON-encoded string in the message key.

Here is the Logstash configuration with the beats input plugin:

input {
    beats {
        port => 5044
        codec => "json"
    }
}
output {
    stdout { codec => "rubydebug" }
}

Here is what I get on the Logstash rubydebug stdout:

{
              "EventType" => "EventRegistered",
             "@timestamp" => "2015-12-08T10:07:52.368Z",
                "TServer" => "LM_D_SSA_NAT_301",
                     "SW" => "SW_LM_AGT_NAT_301",
                 "ConnID" => "null",
                 "ThisDN" => "SW_LM_AGT_NAT_301::",
            "ReferenceID" => "2300",
    "EventSequenceNumber" => "16897",
             "Extensions" => {},
                 "timeTs" => "2015-12-08 11:07:44",
               "@version" => "1",
                   "beat" => {
        "hostname" => "vi-mut-col-301",
            "name" => "vi-mut-col-301"
    },
                  "count" => 1,
             "input_type" => "log",
                 "offset" => 12180,
               "producer" => "collecteurjson",
                 "source" => "/var/product/genesys/data/Json-lm.log",
                "subject" => "lm-int",
                   "type" => "nts-event",
                   "host" => "vi-mut-col-301"
}

The timestamp in the message has been lost.

I expected it to override the Filebeat timestamp.
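
Until the codec handles this, one workaround sketch in plain Ruby (a json filter plus a date filter on the parsed field would be the Logstash-native equivalent): parse the JSON embedded in "message" and promote its @timestamp over the one Filebeat set.

```ruby
require 'json'
require 'time'

# Event as it arrives from filebeat: the interesting @timestamp is
# buried inside the JSON-encoded "message" string.
event = {
  '@timestamp' => '2015-12-08T10:07:52.368Z',
  'message'    => '{"EventType":"EventRegistered","@timestamp":"2015-12-08T11:07:44.021+01:00"}'
}

inner = JSON.parse(event.delete('message'))
event.merge!(inner)

# Normalize the inner timestamp to UTC so it wins over filebeat's.
event['@timestamp'] = Time.parse(inner['@timestamp']).utc.iso8601(3)
puts event['@timestamp'] # 2015-12-08T10:07:44.021Z
```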

SSL chain certificates

Chained certificates don't seem to be working for beats input. I have the following configuration:

  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/path/to/my/certificate.crt"
    ssl_key => "/path/to/my/certificate.key"
  }

The certificate.crt file has a chain of certificates, but the intermediate ones don't appear to be sent correctly, because an openssl s_client -host myhost -port 5044 -prexit -showcerts fails to verify the first certificate in the chain.

If I do the same with the tcp input, it just works:

  tcp {
    port => 5044
    ssl_enable => true
    ssl_cert => "/path/to/my/certificate.crt"
    ssl_key => "/path/to/my/certificate.key"
    ssl_extra_chain_certs => "/path/to/my/certificate.crt"
  }

Any idea how to proceed? Thanks
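
Splitting the bundle into individual certificates is what a server needs to do to present the intermediates rather than just the first entry. A hedged sketch (self-signed throwaway certs standing in for real chain members):

```ruby
require 'openssl'

# Generate a throwaway self-signed cert for demonstration.
def demo_cert(cn)
  key  = OpenSSL::PKey::RSA.new(2048)
  cert = OpenSSL::X509::Certificate.new
  cert.version = 2
  cert.serial  = 1
  cert.subject = cert.issuer = OpenSSL::X509::Name.parse("/CN=#{cn}")
  cert.public_key = key.public_key
  cert.not_before = Time.now
  cert.not_after  = Time.now + 3600
  cert.sign(key, OpenSSL::Digest.new('SHA256'))
  cert
end

# certificate.crt as described above: leaf first, then intermediates.
bundle = demo_cert('leaf').to_pem + demo_cert('intermediate').to_pem

# Reading only the first PEM entry drops the intermediates; splitting
# the bundle recovers the whole chain.
chain = bundle.scan(/-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----\n?/m)
              .map { |pem| OpenSSL::X509::Certificate.new(pem) }

puts chain.size
chain.each { |c| puts c.subject }
```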

Can't update the plugin

While I'm trying to update the plugin I receive a confirmation dialog:
You are updating logstash-input-beats to a new version 3.0.2, which may not be compatible with 2.2.8. are you sure you want to proceed (Y/N)?
But still nothing happens:
Updating logstash-input-beats
No plugin updated
The plugin is still 2.2.8, and when I try to use it I receive the error:
"undefined method `each' for nil:NilClass>"
Logstash version: 2.3.2

Rewrite the beats plugin in pure Java

There is an ongoing effort to address #45; we took the decision to rewrite the server/protocol parser in pure Java using the Netty library. The WIP feature branch is located at https://github.com/logstash-plugins/logstash-input-beats/tree/feature/java-implementation and requires the java-lumber library located at https://github.com/elastic/java-lumber.

The code in these repositories is not ready for production, but initial tests show a big improvement in speed over the initial Ruby implementation.

TODO

  • Rename java-lumber to java-lumberjack
  • Remove debug mode from the library
  • Remove the fatJar and declare dependency with jar-dependency
  • Move the library into the plugin repository
  • Allow client auth to support multiple CAs.
  • Revert patch for MapJavaProxy requires elastic/logstash#5465
  • Add test with Private keys in the PKCS8 format.
  • Fix the integration test hang; the server shuts down successfully but something else is blocking the main thread.
  • Add tests with Encrypted private keys.
  • Stress tests with multiple clients
  • MessageListener specs
  • make all tests pass
  • release beta1 plugin
  • clean logger.debug statements.

WIP PR at #93

beats_input_codec_json_applied being added to tags

I have JSON strings being written to a log file, one per line, and I'm using Filebeat to ship them off to Logstash (doing this instead of the Logstash file input plugin because that seemed to miss lines randomly and Filebeat doesn't).

Here is the basic config for the input in logstash

input {
  beats {
    codec => "json"
    host => "127.0.0.1"
    port => 5044
  }
}

My JSON object has a "tags" property that we use already, and it seems that the beats input plugin adds beats_input_codec_json_applied to the tags property. I've had to write a filter to remove that tag (which also slows down Logstash unnecessarily).

Is there any way it can avoid adding that tag to the object?
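
Until the plugin grows an option to disable it, the usual workaround is a mutate filter that strips the tag again (which, as noted above, still costs a filter pass per event):

```
filter {
  mutate {
    remove_tag => ["beats_input_codec_json_applied"]
  }
}
```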
