cboscolo / elb2loggly
A Node.js AWS Lambda script that converts the ELB logs written to S3 into JSON and pushes them to Loggly.
License: MIT License
Hello, I have managed to set up the Lambda script. The issue I have is that Loggly shows the following error as an event each time a log file is uploaded:
LogglyNotifications:
syslog:
severity: Warning
appName: avahi-daemon
timestamp: 2015-07-07T14:41:19.706596+03:00
facility: system daemons
pid: 970
priority: 28
host: linux
unparsed:
message: Invalid response packet from host 192.168.2.112.
Raw Message:
Invalid response packet from host 192.168.2.112.
What is the problem here?
I've just tried setting up this Lambda function but am still running into problems. I am using our current ELB logs as a test.
The Lambda function appears to run on every new PUT, but the CloudWatch log shows the following error. (I apologise for the screenshot, but there does not appear to be a sensible way to copy/paste the logs; I've deleted the bucket name from the screenshot.)
Here is my bucket tag config:
I have also confirmed that the iam role can access the log files by using the role to access the same file via the S3 API.
I keep getting this in the output when I try to simply test the function:
START RequestId: 7a111162-fe11-4eee-a5f8-0651a0691b8f Version: $LATEST
2019-02-15T12:01:38.743Z 7a111162-fe11-4eee-a5f8-0651a0691b8f TypeError: Cannot read property '0' of undefined
at exports.handler (/var/task/index.js:9:33)
END RequestId: 7a111162-fe11-4eee-a5f8-0651a0691b8f
REPORT RequestId: 7a111162-fe11-4eee-a5f8-0651a0691b8f Duration: 54.30 ms Billed Duration: 100 ms Memory Size: 128 MB Max Memory Used: 29 MB
RequestId: 7a111162-fe11-4eee-a5f8-0651a0691b8f Process exited before completing request
What am I missing?
Hi,
We're interested in using this tool, but our IT/Ops person @lilmatt ran into an issue when implementing it:
...the csv parsing library they're including doesn't deal with escaped strings, so pip's useragent which has backslash-doublequote in it causes there to be too many fields and causes the stuff to error out.
Is a fix for this planned?
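For context, the ELB log format wraps the request and user-agent fields in double quotes, and a user agent like pip's can contain a backslash-escaped quote inside the field, which defeats a naive space or CSV split. A sketch of a tokenizer that handles the escaping, purely illustrative and not the csv-streamify API:

```javascript
// Sketch of a tokenizer for ELB access-log lines that keeps quoted
// fields (request, user agent) intact, including \" escapes that break a
// plain space-delimited or naive CSV split. Not the library's parser.
function tokenizeElbLine(line) {
  // Either a quoted field (allowing backslash escapes) or a bare field.
  var re = /"((?:[^"\\]|\\.)*)"|(\S+)/g;
  var fields = [];
  var m;
  while ((m = re.exec(line)) !== null) {
    fields.push(m[1] !== undefined ? m[1].replace(/\\"/g, '"') : m[2]);
  }
  return fields;
}
```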
Hey,
There are some additional columns needed to make this work with HTTPS load balancers.
Here is an example log line from AWS ELB using HTTPS:
2015-07-23T10:45:34.971531Z LoadBalancerName xxx.xxx.xxx.xxx:12345 xx.xxx.xxx.xxx:80 0.000039 0.004119 0.000024 200 200 0 2316 "GET https://www.example.com:443/ HTTP/1.1" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" ECDHE-RSA-AES128-GCM-SHA256 TLSv1.2
The user_agent, ssl_cipher, and ssl_protocol fields are missing.
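For reference, a column list covering the classic-ELB access log fields in order, with the three trailing fields that only HTTPS/SSL listeners populate (HTTP listeners log '-' for the SSL fields). The names below are this sketch's, not necessarily the script's:

```javascript
// Hypothetical column list for classic-ELB access logs, including the
// trailing user_agent / ssl_cipher / ssl_protocol fields that AWS added
// and that HTTPS listeners populate.
var COLUMNS = [
  'timestamp', 'elb', 'client_port', 'backend_port',
  'request_processing_time', 'backend_processing_time',
  'response_processing_time', 'elb_status_code', 'backend_status_code',
  'received_bytes', 'sent_bytes', 'request',
  'user_agent', 'ssl_cipher', 'ssl_protocol'
];
```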
I keep getting this in the output when I try to simply test the function:
{
"errorMessage": "Process exited before completing request",
"errorType": "TypeError: Cannot read property '0' of undefined"
}
Followed by:
START RequestId: 4b4459a1-eeb1-11e4-bd95-b14278e4e77b
Failure while running task: TypeError: Cannot read property '0' of undefined
at exports.handler (/var/task/elb2loggly.js:90:30)
Process exited before completing request
TypeError: Cannot read property '0' of undefined
What am I missing?
Great script! I have it working 99% of the time. Unfortunately, I'm getting occasional errors that result in the dropping of a logfile. They seem to occur most often when traffic is heaviest. Here's a sample error from the CloudWatch Stream:
2015-12-31T04:07:34.174Z 92fd45fa-af73-11e5-9e41-c1b36be05f9c TypeError: Uncaught, unspecified "error" event.
at TypeError (<anonymous>)
at Transform.emit (events.js:74:15)
at Transform.onerror (/var/task/node_modules/csv-streamify/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:604:12)
at Transform.emit (events.js:95:17)
at onwriteError (_stream_writable.js:239:10)
at onwrite (_stream_writable.js:257:5)
at WritableState.onwrite (_stream_writable.js:97:5)
at afterTransform (_stream_transform.js:99:5)
at TransformState.afterTransform (_stream_transform.js:74:12)
at Transform.parse_s3_log [as _transform] (/var/task/elb2loggly.js:199:7)
Unfortunately, I'm not familiar enough with stream modules to pin down the problem. I suspect the Loggly receiver is throwing an error. Can you point me in a good direction to try to track this down? Alternatively, is there some error handling that could be added so the file isn't just skipped? Thanks!
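One mitigation, sketched here with illustrative names rather than the script's actual pipeline: any Node stream that emits 'error' with no listener attached throws the uncaught "error" event seen in the trace above and kills the invocation, so attaching a handler to each stage turns the crash into a reportable failure:

```javascript
// Sketch: attach 'error' handlers to both ends of a pipe so one bad
// chunk is reported (and the file skipped deliberately, or retried)
// instead of raising Node's uncaught 'error' event mid-invocation.
// The stage names in the comments are assumptions about the pipeline.
function safePipe(source, transform, onError) {
  source.on('error', onError);     // e.g. the S3 GetObject read stream
  transform.on('error', onError);  // e.g. the CSV parser transform
  return source.pipe(transform);
}
```

With a handler in place, the callback can log the offending key to CloudWatch, so dropped files are at least visible.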
I have configured everything as per https://www.loggly.com/docs/aws-elb/ but all Lambda events result in a failure. If I attempt to view the CloudTrail logs I get:
I don't know what to try next, as debugging on AWS Lambda appears to be rather poor.
Formatting/indentation of this file is very inconsistent. Would a PR to clean it up be welcomed? Curious whether spaces or tabs are preferred. If spaces, how many?
The numeric fields within the ELB data are being provided as JSON strings instead of numbers. This means I can't filter by response_time in Loggly. I will try to take a look at this tomorrow, and possibly create a pull request if I can suitably modify the parser.
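A sketch of the conversion step (the field names are this sketch's assumptions): coerce the timing, byte-count, and status fields to real JSON numbers before sending, and leave '-' (absent value) fields as-is rather than emitting NaN:

```javascript
// Field names are illustrative assumptions, not necessarily the script's.
var NUMERIC_FIELDS = [
  'request_processing_time', 'backend_processing_time',
  'response_processing_time', 'elb_status_code', 'backend_status_code',
  'received_bytes', 'sent_bytes'
];

// Convert numeric string fields to numbers so Loggly can range-filter
// on them; '-' means "no value" in ELB logs and is left untouched.
function coerceNumbers(event) {
  NUMERIC_FIELDS.forEach(function (name) {
    var v = event[name];
    if (v === undefined || v === '-') return;
    var n = Number(v);
    if (!isNaN(n)) event[name] = n;
  });
  return event;
}
```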
Hey,
I've been trying to get everything working for a couple of hours, and I just figured out it's not working because of the log size (around 4MB).
If I test it with a smaller log file it's easily pushing to Loggly.
Do you know of a workaround? Maybe a way to avoid sending the whole log at once?
Thanks!
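One workaround, if the failure is request size: split the parsed events into batches and POST each batch to Loggly separately (the bulk endpoint documents a per-request size cap, 5MB at the time of writing). A sketch of the splitting step only; the batch size would need tuning:

```javascript
// Split an array of parsed log events into fixed-size batches so each
// HTTP POST to the Loggly bulk endpoint stays under the request cap.
function batch(items, size) {
  var out = [];
  for (var i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}
```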
ELB creates logs in a defined year/month/day directory structure in S3, and the event type PUT was not working for the lower-level directories. I changed it to ObjectCreated to make it work.
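For reference, the wildcard event type in an S3 notification configuration looks roughly like this (AWS SDK for JavaScript v2 shape; the bucket name and function ARN are placeholders). `s3:ObjectCreated:*` fires for PUT, POST, COPY, and completed multipart uploads, regardless of key depth:

```javascript
// Sketch of an S3 notification configuration subscribing a Lambda to all
// object-creation events. Bucket name and ARN below are placeholders.
var params = {
  Bucket: 'my-elb-log-bucket', // placeholder
  NotificationConfiguration: {
    LambdaFunctionConfigurations: [{
      LambdaFunctionArn:
        'arn:aws:lambda:us-east-1:123456789012:function:elb2loggly', // placeholder
      Events: ['s3:ObjectCreated:*']
    }]
  }
};
// new AWS.S3().putBucketNotificationConfiguration(params, callback);
```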
When an ELB is unable to contact a backend server due to a timeout, it logs a 504 in the access log with the backend field set to a single '-' rather than a host:port pair. The code currently splits the backend field on ':', which increases data.length by 1. When this doesn't happen because of the single hyphen, data.length does not equal COLUMNS.length, causing an error to be logged to the console and the log record to be skipped.
https://github.com/cboscolo/elb2loggly/blob/master/elb2loggly.js#L74
Sample ELB access log entry showing '-' for backend and 504 HTTP response code:
2015-08-03T19:35:21.383076Z xxxxx-production 1.1.1.1:63484 - -1 -1 -1 504 0 0 0 "GET https://xxxx:443/xxxx"
CloudWatch log:
2015-08-03T19:45:33.221Z 33b9d1ae-3a18-11e5-ba96-ebe34fa0e019 ELB log length 14 did not match COLUMNS length 15
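A sketch of one possible fix (illustrative names, not a patch to the actual file): treat a lone '-' as an absent host and port so the split still yields two fields and the column count stays aligned:

```javascript
// Split an 'ip:port' field from an ELB log line. A 504 from an
// unreachable backend logs a lone '-' instead of host:port, and that
// case must still produce two fields so data.length matches COLUMNS.
function splitHostPort(field) {
  if (field === '-') {
    return ['-', '-']; // absent host and port
  }
  var i = field.lastIndexOf(':');
  return [field.slice(0, i), field.slice(i + 1)];
}
```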