looplab / logspout-logstash
A minimalistic adapter for github.com/gliderlabs/logspout to write to Logstash
License: Apache License 2.0
We have been seeing a bug: when null shows up as a line in our docker logs, logspout will crash and restart.
I'm pretty sure the error is caused here: https://github.com/looplab/logspout-logstash/blob/master/logstash.go#L134-L143
I was able to recreate this with the following script:
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	var data map[string]interface{}
	mData := []byte(`null`)
	if err := json.Unmarshal(mData, &data); err != nil {
		data = make(map[string]interface{})
		data["message"] = string(mData)
		fmt.Println("This doesn't get executed because no error is thrown")
	}
	// data is still nil here: a bare `null` unmarshals without error,
	// but leaves the map nil
	data["docker"] = `foo` // panic: assignment to entry in nil map
}
Which matches the stack trace that we're seeing
panic: assignment to entry in nil map
goroutine 21 [running]:
panic(0x5638366bc2e0, 0xc42052e8c0)
/usr/lib/go/src/runtime/panic.go:500 +0x1a5
github.com/looplab/logspout-logstash.(*LogstashAdapter).Stream(0xc420106390, 0xc420100300)
/go/src/github.com/looplab/logspout-logstash/logstash.go:143 +0x47d
github.com/gliderlabs/logspout/router.(*RouteManager).route(0xc4200554c0, 0xc4200a6580)
/go/src/github.com/gliderlabs/logspout/router/routes.go:147 +0xb5
github.com/gliderlabs/logspout/router.(*RouteManager).Run.func1(0xc4200554c0, 0xc4200a6580)
/go/src/github.com/gliderlabs/logspout/router/routes.go:170 +0x37
created by github.com/gliderlabs/logspout/router.(*RouteManager).Run
/go/src/github.com/gliderlabs/logspout/router/routes.go:172 +0xf2
Basically, it seems like json.Unmarshal is decoding null to nil (because a bare null is actually valid JSON). Then, instead of data being a map, it is nil, and when you next try to assign a value to a key you get the panic.
I believe the fix is to add || data == nil to the end of the if statement (as you can confirm with the small script I included).
I think you can also add a test for this simply by copying one of the TestStream* tests and changing the string it passes to null (here: https://github.com/looplab/logspout-logstash/blob/master/logstash_test.go#L79).
I'd be happy to submit the fix + test myself, but I'm a bit of a noob with golang and having some trouble getting everything set up to hack on this repo ^_^
How can we know the latest supported version of logspout? I'd also like to know whether you have planned any releases in the future, or whether you will keep only master.
...
Executing busybox-1.24.2-r0.trigger
Executing ca-certificates-20160104-r2.trigger
OK: 401 MiB in 48 packages
# github.com/go-check/check
../../go-check/check/error.go:4: "ERROR: the correct import path is gopkg.in/check.v1 ... " evaluated but not used
The command '/bin/sh -c cd /src && ./build.sh "$(cat VERSION)-custom"' returned a non-zero code: 2
I get this error when trying to build the image with these modules:
package main
import (
_ "github.com/looplab/logspout-logstash"
_ "github.com/gliderlabs/logspout/adapters/raw"
_ "github.com/gliderlabs/logspout/adapters/syslog"
_ "github.com/gliderlabs/logspout/httpstream"
_ "github.com/gliderlabs/logspout/routesapi"
_ "github.com/gliderlabs/logspout/transports/tcp"
_ "github.com/gliderlabs/logspout/transports/udp"
_ "github.com/gliderlabs/logspout/transports/tls"
)
Hi,
I have configured logstash and logspout to send logs from machine A to machine B. Everything works fine, but restarting logstash breaks the log flow. All logs are still visible using the http output, and logspout stdout says logstash: write udp: connection refused. That's expected: logstash is booting up. Restarting the logspout container makes it send the logs again.
How can I make logspout reconnect without a restart?
I tried
DOCKER_LABELS=1
DOCKER_LABELS=a=b
DOCKER_LABELS=a:b
DOCKER_LABELS={"a":"b"}
all without luck, any examples?
Hi! I am running into an issue with this error:
# logspout v3.2-dev-custom by gliderlabs
# adapters: raw tcp logstash udp syslog
# options : persist:/mnt/routes
# jobs : http[]:80 pump routes
# routes :
# ADAPTER ADDRESS CONTAINERS SOURCES OPTIONS
# logstash 0.0.0.0:5000 map[]
2016/11/17 07:15:57 logstash: could not write:write udp 127.0.0.1:34146->127.0.0.1:5000: write: connection refused
after I start the adapter with:
sudo docker run --name="logspout" --volume=/var/run/docker.sock:/var/run/docker.sock -e ROUTE_URIS=logstash://0.0.0.0:5000 c045f1a3472b
The logstash 5.0 docker container is running with the following log output:
Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
07:04:11.987 [[main]-pipeline-manager] INFO logstash.inputs.tcp - Automatically switching from json to json_lines codec {:plugin=>"tcp"}
07:04:11.987 [[main]<udp] INFO logstash.inputs.udp - Starting UDP listener {:address=>"0.0.0.0:5000"}
07:04:12.003 [[main]-pipeline-manager] INFO logstash.inputs.tcp - Starting tcp input listener {:address=>"0.0.0.0:5000"}
07:04:12.409 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["https://~hidden~:[email protected]:9243"]}}
07:04:12.410 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
07:04:13.612 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
07:04:13.705 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["fc4fba7c82d6102f5c1a224f0e9f2e9a.us-east-1.aws.found.io:9243"]}
07:04:13.710 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
07:04:13.717 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
07:04:13.802 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
I am trying to send my logs to the elastic cloud using logspout and logstash. Thanks for the help.
Could you think of adding another field for the node (or hostname of the docker engine) the container is running on? This would be very helpful for docker swarm clusters.
This is not actually an issue. Correct me if I am wrong: will changing udp to tcp in the logstash.go file do the trick?
Please advise.
I've followed the instructions listed here and in the original logspout repository, but I appear to be having problems finding the logspout-logstash dependency (both locally and on CI).
The error I'm getting is:
modules.go:4:2: cannot find package "github.com/looplab/logspout-logstash" in any of:
/go/src/github.com/gliderlabs/logspout/vendor/github.com/looplab/logspout-logstash (vendor tree)
/usr/lib/go/src/github.com/looplab/logspout-logstash (from $GOROOT)
/go/src/github.com/looplab/logspout-logstash (from $GOPATH)
I'm sadly not a go expert (or even beginner) so I'm a bit stumped.
$ go version
go version go1.10.3 linux/amd64
$ echo $GOPATH
/home/ant/go
$ cat modules.go
package main
import (
_ "github.com/looplab/logspout-logstash"
_ "github.com/gliderlabs/logspout/healthcheck"
_ "github.com/gliderlabs/logspout/adapters/raw"
_ "github.com/gliderlabs/logspout/adapters/syslog"
_ "github.com/gliderlabs/logspout/adapters/multiline"
_ "github.com/gliderlabs/logspout/httpstream"
_ "github.com/gliderlabs/logspout/routesapi"
_ "github.com/gliderlabs/logspout/transports/tcp"
_ "github.com/gliderlabs/logspout/transports/udp"
_ "github.com/gliderlabs/logspout/transports/tls"
)
Hi,
LOGSTASH_TAGS doesn't work. getopt returns only the logspout container's env.
To get other containers' env, I think we need to use container.Config.Env. But I'm new to logspout; could we do that?
Using RAW_FORMAT to include docker labels gives an error in logstash, and no document is written. I really want docker labels. Any idea how to fix this?
version: '3.3'
services:
  logspout:
    build: ./
    volumes:
      # Logspout reads this in and attaches it to the log
      - /etc/hostname:/etc/host_hostname:ro
      - '/var/run/docker.sock:/tmp/docker.sock'
    environment:
      # IP and port for logstash host
      ROUTE_URIS: "logstash://some.dns:6000"
      # Include all docker labels
      DOCKER_LABELS: "true"
      # Add environment field to all logs sent to logstash
      LOGSTASH_FIELDS: "environment=${NODE_ENV}"
      RETRY_STARTUP: "true"
      RAW_FORMAT: '{ "container" : "{{ .Container.Name }}", "labels": {{ toJSON .Container.Config.Labels }}, "source" : "{{ .Source }}", "message" : {{ toJSON .Data }} }'
    command: 'raw+udp://some.dns:6000'
    deploy:
      mode: global
      resources:
        limits:
          cpus: '0.20'
          memory: 256M
        reservations:
          cpus: '0.10'
          memory: 128M
    restart: on-failure
[2021-06-28T21:18:49,754][WARN ][logstash.outputs.elasticsearch][main][ec5863da584df97e7e09b88e20aac31b53950dfc42b9a358d52a766e775ca61f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :routing=>nil}, {"@version"=>"1", "type"=>"dockerlog", "container"=>"/logspout_logspout_1", "labels"=>{"com.docker.compose.container-number"=>"1", "com.docker.compose.project.config_files"=>"docker-compose.yaml", "com.docker.compose.version"=>"1.27.4", "com.docker.compose.oneoff"=>"False", "com.docker.compose.service"=>"logspout", "com.docker.compose.project.working_dir"=>"/home/pchost/docker/logspout", "com.docker.compose.config-hash"=>"f4c841a27a87d76eee562fc3329790dd68a1001be5e7c2376ca86313c3d2d21f", "com.docker.compose.project"=>"logspout"}, "@timestamp"=>2021-06-28T21:18:49.591Z, "host"=>"192.168.1.232", "source"=>"stdout", "message"=>"# ADAPTER\tADDRESS\t\t\tCONTAINERS\tSOURCES\tOPTIONS"}], :response=>{"index"=>{"_index"=>"logstash", "_type"=>"_doc", "_id"=>"s-99VHoB4-ZIuBnPNYth", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [labels.com.docker.compose.project] cannot be changed from type [text] to [ObjectMapper]"}}}}
Is it possible to use it with TLS and a client cert?
Any hints on how to set this up?
With self-signed certs.
When logging JSON numbers from docker, e.g. {"value":1000000}, the value is converted to scientific notation once it's gone through marshalling/unmarshalling, i.e. {"value":1e+06}.
While this is not technically incorrect, it prevents us from looking for specific values in the logs (ids) without manually converting them to scientific notation.
I tried to come up with a fix myself (https://stackoverflow.com/a/22346593/8125689 looks promising), but while the issue definitely happens with the Bekt/logspout-logstash image, I wasn't able to reproduce it in a test case.
PS: Loads of thanks for this project. We've been using it on a production server for a while as part of our monitoring system and it does a pretty good job.
My modules.go contains:
package main
import (
_ "github.com/looplab/logspout-logstash"
_ "github.com/gliderlabs/logspout/transports/udp"
)
run with
docker run --volume=/var/run/docker.sock:/var/run/docker.sock -it vysakh/logspoutudp -e ROUTE_URIS='logstash://localhost:5000'
docker log report
# logspout v3.2-dev-custom by gliderlabs
# adapters: udp logstash raw
# options : persist:/mnt/routes
!! bad adapter:
I was using a dockerhub image that is a self-contained ELK stack. I was trying to run the ELK stack and Logspout-Logstash from a single docker-compose.yml.
Seems that if the ELK stack is not FULLY up, Logspout-Logstash never starts logging. Restarting the logspout container does correct it. So, just noting that here, it would be nice if it would do re-tries or something. Unless perhaps it's an issue with my setup.
Here is my docker-compose file
elk:
  image: sebp/elk
  ports:
    - "5601:5601"
    - "9200:9200"
    - "5044:5044"
    - "5000:5000"
    - "5000:5000/udp"
  environment:
    - ES_HEAP_SIZE=12g
    - LS_HEAP_SIZE=12g
  volumes:
    - $HOME/dev/elk/logspout.conf:/etc/logstash/conf.d/logspout.conf
logspout:
  image: local/logspout
  container_name: logspout
  environment:
    - LOGSPOUT=ignore
    - ROUTE_URIS=logstash+tcp://<HOSTIP>:5000
  volumes:
    - /var/run/docker.sock:/tmp/docker.sock
Is it possible to collect multiline logs, such as a Java stack trace, as a single event and send them to logstash?
The Logstash multiline codec doc says:
If you are using a Logstash input plugin that supports multiple hosts, such as the Beats input plugin, you should not use the multiline codec to handle multiline events. Doing so may result in the mixing of streams and corrupted event data. In this situation, you need to handle multiline events before sending the event data to Logstash.
So I think it would be wonderful if logspout could do this.
Hey there,
I don't know if I made a mistake or not, but I passed the env variable LOGSTASH_TAGS and the tags aren't assigned:
my docker-compose:
logspout-logstash:
  ports:
    - 9999:80/tcp
  environment:
    ROUTE_URIS: logstash+tcp://logstash01.int.com:6666
    LOGSTASH_TAGS: test,test2
  labels:
    io.rancher.scheduler.global: 'true'
    io.rancher.container.pull_image: always
  tty: true
  image: dockerregistry.int.com/foobar/logspout-logstash:latest
  volumes:
    - /var/run/docker.sock:/var/run/docker.sock
  stdin_open: true
and my logstash output:
{"message":"x.x.x.x - - [26/Sep/2016:18:45:38 +0000] \"GET / HTTP/1.0\" 200 6286 \"-\" \"-\" \"-\"","stream":"stdout","docker":{"name":"/r-buttons-microservice","id":"ddd2a2dc7c00714461826f43c19ad918261ae653efcd2c679d93a6758db95318","image":"dockerregistry.int.com/foobar/docker_image-buttons-microservice:latest","hostname":"ddd2a2dc7c00"},"tags":[],"@version":"1","@timestamp":"2016-09-26T18:45:38.368Z","host":"x.x.x.x","port":59889,"type":"docker_events"}
Can somebody look into it?
Thanks
I am getting a bad adapter error.
My Dockerfile contains:
FROM gliderlabs/logspout:master
My modules.go contains:
package main
import (
_ "github.com/looplab/logspout-logstash"
)
My command line is:
docker run --name="logspout" --volume=/var/run/docker.sock:/tmp/docker.sock -e ROUTE_URIS='logstash://localhost:5143' mystuff/logspout-logstash
The message is:
# logspout v3-master-custom by gliderlabs
# adapters: logstash
# options : persist:/mnt/routes
!! unable to find adapter: logstash
Any suggestions? Thanks.
We are experiencing issues with High CPU usage on dockerd whenever we enable the logspout-logstash container.
More details on this issue report
I have just upgraded ELK to version 6.0.0, and then parsing docker logs with logspout-logstash stopped working.
The Logstash error is this
[2017-11-30T02:11:39,765][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2017.11.30", :_type=>"docker", :_routing=>nil}, #<LogStash::Event:0x6cf08450>], :response=>{"index"=>{"_index"=>"logstash-2017.11.30", "_type"=>"docker", "_id"=>nil, "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to parse mapping [_default_]: [include_in_all] is not allowed for indices created on or after version 6.0.0 as [_all] is deprecated. As a replacement, you can use an [copy_to] on mapping fields to create your own catch all field.", "caused_by"=>{"type"=>"mapper_parsing_exception", "reason"=>"[include_in_all] is not allowed for indices created on or after version 6.0.0 as [_all] is deprecated. As a replacement, you can use an [copy_to] on mapping fields to create your own catch all field."}}}}}
Is there a workaround?
Hey, thanks so much for such an awesome tool. Would it be possible to add an environment variable like INCLUDE_CONTAINERS or REQUIRED_LABEL to only pull logs from certain containers?
I realize when running any other docker containers on the same server running logspout, the logs are grabbed by logspout too which isn't what I want.
I also realize it's possible to add a separate logstash filter to ignore all other images, but it'd be a more scalable solution for me to have the option on the log sending side.
I'm not seeing the 'severity' (e.g. DEBUG, INFO, ERROR, etc) tag being included in logspout-logstash messages. Is there a configuration option I'm missing or is this not supported yet?
I'm occasionally seeing the following error log in my logspout container, which causes it to exit.
logstash: could not write:write tcp 172.17.0.2:49194->192.168.1.178:5001: write: connection reset by peer
Port 5001 is the TCP port that my Logstash container is listening on. Any idea what is going on or if anything can be done? For now I am just using the docker run --restart flag to restart the container if it dies.
Hello, this adapter is great! I'm maintaining a hosted version of this at Bekt/logspout-logstash on Docker Hub. I was wondering if you could link to it somewhere in this project.
Logspout can already send logs directly to Logstash right? What does this adaptor add to that functionality?
(evaluating logging stacks atm)
run with
docker run -d \
--restart=always \
--volume=/var/run/docker.sock:/tmp/docker.sock \
-e "ROUTE_URIS=logstash://host:5505" \
my-docker-registry/logspout-logstash:latest
it starts, but nothing goes to logstash
docker logs report
# logspout v3-master-custom by gliderlabs
# adapters: logstash raw tcp syslog
# options : persist:/mnt/routes
# jobs : pump http[routes]:80
# routes :
# ADAPTER ADDRESS CONTAINERS SOURCES OPTIONS
# logstash host:5505 map[]
How can I diagnose the cause?
I'm seeing an issue, and I'm not quite sure what's up. I switched one of my environments to use the logstash+tcp functionality that recently got added and noticed that an extra comma is being added to my custom tags.
The other logspout container I'm using to forward logs isn't adding this comma which makes me believe it is the tcp functionality that is doing it. If the log doesn't hit a filter defined in my logstash config it doesn't append the extra comma.
Here's an example of what the tags column looks like:
I've tried restarting the logspout-logstash containers, reverting back to udp, and restart logstash completely but the problem still persists.
Apologies if this is not a logspout-logstash issue, I just wasn't sure where else to start since the only change I made was to use the tcp adapter.
Elasticsearch 2.0 doesn't support dots in field names:
In logstash.go it has:
type LogstashMessage struct {
	Message  string `json:"message"`
	Name     string `json:"docker.name"`
	ID       string `json:"docker.id"`
	Image    string `json:"docker.image"`
	Hostname string `json:"docker.hostname"`
}
This will cause an issue either upgrading to or using Elasticsearch 2.0. I suggest replacing the dots with underscores or using nested fields.
Certain ill-mannered programs (I'm looking at you, docker-registry) emit JSON-formatted log entries, but use dot-delimited nesting in the key names, which breaks with ES 2.x. Since the JSON emitted by these programs isn't necessarily intended to be consumed by logstash, it's not necessarily their fault that they're producing logstash-incompatible JSON; it is instead logspout-logstash's responsibility to transmute the error-producing JSON into something that logstash will understand.
I can think of two options:
Would a PR to implement option 1 be merged, or should I wait for someone with better Go chops to implement 2?
I wanted to add some custom fields to my logspout-generated events.
The Readme says:
You can also add arbitrary logstash fields to the event using the LOGSTASH_FIELDS container environment variable:
# Add any number of arbitrary fields to your event
-e LOGSTASH_FIELDS="myfield=something,anotherfield=something_else"
So my compose file:
...
logspout:
environment:
- ROUTE_URIS=logstash+tcp://logstash:5045
- LOGSTASH_FIELDS="collector=logspout"
...
The problem is: logspout-logstash doesn't remove the quotes for my field collector.
Looking at the available fields for my index in elasticsearch:
[root@2c2dfff9f247 elasticsearch]# curl 'elasticsearch:9200/unbekannt-2018.01.31/_mapping/*?pretty'
{
  "unbekannt-2018.01.31" : {
    "mappings" : {
      "doc" : {
        "properties" : {
          "\"collector" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "@timestamp" : {
            "type" : "date"
          },
          ...
As a result, in my logstash.conf I can't use the field, e.g.
if [\"collector] {} or
if ["collector] {} or
if [collector] {}
Solution for the problem:
In the code of this adapter, quotes aren't removed. Using the following environment variable instead
- LOGSTASH_FIELDS=collector=logspout
solves the problem.
Please change the Readme or fix this in the code.
I followed the instructions and built the docker image.
I ran the docker image and logged in to it.
I checked the /erc dir. I see that modules.go has my content, but I didn't find logstash.go.
Also, under the adapter dir I don't see logstash.
Am I missing anything?
I got everything up and running. logspout is showing me the trace of my nginx access_log when running: curl http://127.0.0.1:8000/logs.
logstash is listening on port 5044. I have run it in debug mode and tried to netcat it over udp, and it shows the calls in the log file:
echo "test" | nc -4u -w1 172.18.0.2 5044
However, it seems like logspout is not sending anything to logstash for some reason, and I can't figure out why that could be happening.
This is how my environment variables look on my logspout container:
ENV ROUTE_URIS=logstash://172.18.0.2:5044