
Comments (7)

benoittgt avatar benoittgt commented on July 30, 2024

Thanks for the pick. Do you think it's relevant to make that change by default in the lambda?

from lambda-streams-to-firehose.

jonchase avatar jonchase commented on July 30, 2024

Changing the value from 4->2MB resolved our issue. I'm not sure lowering the value is the right way to go about fixing it though - it seems like somehow conditions were allowing a >4MB payload to be submitted, which sounds like the root cause to identify and fix.

from lambda-streams-to-firehose.

ysamlan avatar ysamlan commented on July 30, 2024

I stumbled on this in some google spelunking - the AWS docs are a bit vague on this but while the 1MB per-record limit on Firehose is on the raw record data before encoding, I think the 4MB total request size limit might be on the full (encoded) payload. The SDK base64-encodes and JSON-wraps the data, so you probably need to pad this limit out by 33% for the base64, plus whatever the JSON around it works out to - someone at Amazon would have to confirm that though.
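The ~33% base64 inflation mentioned above is easy to verify. This is a hedged illustration of the arithmetic, not the SDK's actual serialization path:

```python
import base64
import json

# A raw record just under Firehose's 1 MB per-record limit.
record = b"x" * 900_000

# Base64 expands data by a factor of 4/3 (plus padding), and the JSON
# envelope that wraps each record adds a little more on top of that.
encoded = base64.b64encode(record)
wrapped = json.dumps({"Data": encoded.decode("ascii")})

print(len(record))   # 900000 raw bytes
print(len(encoded))  # 1200000 bytes after base64 (~33% larger)
print(len(wrapped))  # slightly more once JSON-wrapped
```

So if the 4 MB request limit were enforced on the encoded payload, a batch of raw records would need to stay well under 3 MB to be safe.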

from lambda-streams-to-firehose.

IanMeyers avatar IanMeyers commented on July 30, 2024

Looking - are you using any transformers in your installation?

from lambda-streams-to-firehose.

IanMeyers avatar IanMeyers commented on July 30, 2024

I expect the issue is that the request limit is actually 4MiB, not 4MB as configured in the system. 1.5.2 and 374158a addresses this.

from lambda-streams-to-firehose.

ysamlan avatar ysamlan commented on July 30, 2024

Yup - I asked AWS support and they actually said the base64/JSON overhead shouldn't count. But that being said, they're a bit fuzzy on MiB vs MB depending which part of the docs you look at, so probably 4 * 1000 * 1000 is a safer bet than 4 * 1024 * 1024.
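The two readings of "4 MB" differ by roughly 4%, which is why the decimal constant is the safer choice: it leaves headroom whichever definition the service actually enforces. A quick sketch of the difference:

```python
# Binary vs. decimal interpretations of the "4 MB" request limit.
MAX_BATCH_BYTES_MIB = 4 * 1024 * 1024  # 4 MiB = 4,194,304 bytes
MAX_BATCH_BYTES_MB = 4 * 1000 * 1000   # 4 MB  = 4,000,000 bytes

# Using the smaller decimal limit leaves ~190 KB of slack per request.
print(MAX_BATCH_BYTES_MIB - MAX_BATCH_BYTES_MB)  # 194304
```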

from lambda-streams-to-firehose.

mbaran90 avatar mbaran90 commented on July 30, 2024

Noticed that it calculates FIREHOSE_MAX_BATCH_BYTES based on the current record's size instead of the size of the next record to be added.

  • The current code only works if all records are a constant size; otherwise you can still hit the "Exceeding 4MB limit" issue.
    I have made a merge request with the fix. #64
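The look-ahead check described above can be sketched as follows. This is an illustrative outline only (the actual project is Node.js, and the names and limits here are assumptions, not the repository's code): before appending a record, check whether *that* record would push the batch over a limit, rather than checking the batch size after the fact.

```python
# Hypothetical limits for illustration; the byte limit uses the
# conservative decimal interpretation discussed earlier in the thread.
MAX_BATCH_BYTES = 4 * 1000 * 1000  # per-request byte limit
MAX_BATCH_RECORDS = 500            # per-request record-count limit

def batch_records(records):
    """Yield batches of records, each within the byte and count limits."""
    batch, batch_bytes = [], 0
    for record in records:
        size = len(record)
        # Look ahead: would adding THIS record exceed either limit?
        if batch and (batch_bytes + size > MAX_BATCH_BYTES
                      or len(batch) >= MAX_BATCH_RECORDS):
            yield batch
            batch, batch_bytes = [], 0
        batch.append(record)
        batch_bytes += size
    if batch:
        yield batch
```

With variable-size records, this guarantees no emitted batch ever exceeds the limits, which the original check-after-append approach could not.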

from lambda-streams-to-firehose.
