
compress's Introduction

Koa Compress


Compress middleware for Koa

Example

const compress = require('koa-compress')
const Koa = require('koa')

const app = new Koa()
app.use(compress({
  filter (content_type) {
    return /text/i.test(content_type)
  },
  threshold: 2048,
  gzip: {
    flush: require('zlib').constants.Z_SYNC_FLUSH
  },
  deflate: {
    flush: require('zlib').constants.Z_SYNC_FLUSH,
  },
  br: false // disable brotli
}))

Options

filter<Function>

function (mimeType: string): Boolean {

}

An optional function that checks the response content type to decide whether to compress. By default, it uses compressible.
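For instance, a custom filter can be a plain predicate over the response Content-Type (a hypothetical sketch; the default behavior uses the compressible package instead):

```javascript
// Hypothetical custom filter: compress only text and JSON responses.
// koa-compress calls this with the response Content-Type string.
function filter (contentType) {
  return /text|json/i.test(contentType)
}

console.log(filter('application/json; charset=utf-8')) // true
console.log(filter('image/png')) // false
```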

options.threshold<String|Number>

Minimum response size in bytes to compress. Defaults to 1024 bytes (1 KB).

options[encoding]<Object>

The current encodings are, in order of preference: br, gzip, deflate. Setting options[encoding] = {} will pass those options to the encoding function. Setting options[encoding] = false will disable that encoding.

options.br

Brotli compression is supported in node v11.7.0+, which includes it natively. As of v5.1.0, the default quality level is 4 for performance reasons.

options.defaultEncoding<String>

An optional string that specifies which encoding to use for requests without an Accept-Encoding header. Defaults to identity (no compression).

The HTTP standard dictates that such requests be treated as *, meaning any compression is permissible, but that causes very practical problems when debugging servers with manual tools like curl, wget, and so on. If you want the standard behavior, just set defaultEncoding to *.
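A sketch of opting into the standard behavior (assumes the app and compress setup from the example above):

```javascript
app.use(compress({
  // Treat requests without Accept-Encoding as "*": any encoding is allowed.
  defaultEncoding: '*'
}))
```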

Manually turning compression on and off

You can always enable compression by setting ctx.compress = true. You can always disable compression by setting ctx.compress = false. This bypasses the filter check.

const fs = require('fs')

app.use((ctx, next) => {
  ctx.compress = true
  ctx.body = fs.createReadStream(file)
})

compress's People

Contributors

3imed-jaberi, akmoulai, cortopy, dependabot[bot], dobesv, fishrock123, gengjiawen, greenkeeper[bot], greenkeeperio-bot, jonathanong, leafgard, marcodeltongo, niftylettuce, omsmith, patrickhulce, tinovyatkin, tj, uhop, zacanger, zombieyang


compress's Issues

Save pre-compressed body for logging

Right now, I have to choose between compressing my responses and logging them...

I tried to go down the road of unzipping/inflating them after, but decided that maybe the library could help out a bit.

I propose something along the lines of adding a new prop onto the response object, called pre_compressed_body or something similar, so that anyone who wants to log the response can.

Error in threshold checking?

The option:

threshold

Minimum response size in bytes to compress. Default 1024 bytes or 1kb.

It can't be set to 0 (always compress) because the code reads var threshold = !options.threshold ? 1024 : .... When threshold is 0, the check falls back to the default. It should be var threshold = options.threshold === undefined ? 1024 : ... (or even != null).

Would you accept a PR for this or do you think the minimum threshold should be 1 since there is no point in compressing a 0 byte response? (But not being able to use 0 to indicate "always" is kinda counter-intuitive).
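The proposed fix can be sketched as a standalone helper (hypothetical code, not the library's actual source):

```javascript
// Treat only a missing threshold as "use the default", so that an explicit
// 0 means "always compress" instead of falling back to 1024.
function resolveThreshold (options) {
  return options.threshold == null ? 1024 : options.threshold
}

console.log(resolveThreshold({})) // 1024
console.log(resolveThreshold({ threshold: 0 })) // 0
console.log(resolveThreshold({ threshold: 2048 })) // 2048
```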

4.x seems to be significantly slower (not related to brotli!)

I see that my app runs with a much higher CPU load while producing 20% of the previous throughput.

3.1.0 with no options: fast (119MB/s)
4.0.0 / 4.0.1 with no options: slow (20 MB/s)
4.0.x with { br: false }: slow
4.0.x with { br: false, gzip: false, deflate: false }: fast again

same machine, same nodejs version (12.16.3)

3.1.0 produces compressed output (I checked)

trying to trace it further

What is compressed?

Are both the headers and the body compressed, or just the body?

Also, what happens if I have already compressed the body myself (using brotli, for example)? Will this cause an issue? (Re-compressing is not a good idea.) What happens in this case?

consider adding `response.originalLength` property

It would be nice if this added a response.originalLength property to make it easy to send trace information about the "real" response length. This would mimic the existing request.originalUrl that Koa maintains.

If no Accept-Encoding header is sent, koa-compress may compress the response

I found it very confusing when I used curl to hit my endpoint and got a compressed response dumped to my console. It seems that if curl doesn't send an Accept-Encoding header, it does not decompress a compressed response it receives.

Nevertheless, I think a client that does not send Accept-Encoding should not be assumed to support gzip or any other compression scheme - by default they should not get any compressed responses.

An in-range update of supertest is breaking the build 🚨

Version 3.3.0 of supertest was just published.

Branch Build failing 🚨
Dependency supertest
Current Version 3.2.0
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

supertest is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • ❌ continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Release Notes v3.3.0

#509 - Fix #486, bug in _assertBody, switch to deepStrictEqual (thanks @mikelax)
#510 - Refactor test files to use const/let (thanks @rimiti)

Commits

The new version differs by 10 commits.

  • e910e85 chore: Prepare for v3.3.0 release.
  • bd864de Merge pull request #511 from visionmedia/bugfix-486-equal
  • 101fbf5 Merge branch 'master' into bugfix-486-equal
  • 04230bb Merge pull request #510 from visionmedia/refact-const-let
  • 510a7ae bugfix: 486 Change method to use deepStrictEqual. (#509)
  • 913150d chore(.editorconfig) [*.md] block removed
  • 82e0828 refact(test/supertest.js) vars replaced by const and let
  • 5443136 chore(.editorconfig) configuration file created
  • 7233ba6 chore(.eslintrc) parserOptions option added to use es6
  • 322ebf6 bugfix: 486 Change method to use deepStrictEqual.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

5.1 did not set brotli default quality to 4 correctly

The format for the brotli options is wrong. So brotli quality level still remains at 11.

Since brotli is the default this took me forever to find the reason why requests took so long. Since the actual compression is native, koa-compress doesn't show up on the profiler or when timing the middleware.

There is already an open PR #165

Error [ERR_HTTP2_HEADERS_SENT]: Cannot set headers after they are sent to the client

This error has been showing up on my production web server for ABA Collection. It appears to be due to the way compress interacts with HTTP/2, or at least with the http2 module.

Based on all of my debugging, it seems as though it is returning after the response has already been sent.

I'll work on a PR and at least get a test up and running that shows the error.

An in-range update of koa is breaking the build 🚨

Version 2.4.0 of koa was just published.

Branch Build failing 🚨
Dependency koa
Current Version 2.3.0
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

koa is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • ❌ continuous-integration/travis-ci/push The Travis CI build could not complete due to an error Details

Commits

The new version differs by 27 commits.

  • 418bb06 2.4.0 – added missing 2.3.0 changelog
  • c68a696 2.3.0
  • 687b732 travis: test node@9
  • 53a4446 expose the Application::handleRequest method (#950)
  • 85ff544 deps: update min engines (#1040)
  • 6029064 HTTP/2 has no status message (#1048) (#1049)
  • 18e4faf Update fresh to ^0.5.2 to close vulnerability (#1086)
  • e8a024c docs: ddd Chinese docs link for v2.x (#1092)
  • 1e81ea3 docs: update babel setup (#1077)
  • 43a1df8 test: Remove --forceExit flag for Jest (#1071)
  • 0168fd8 docs: Update middleware.gif (#1052)
  • e1e030c docs: command is wrong for running tests (#1065)
  • 77ca429 test: replace request(app.listen()) with request(app.callback())
  • 7f577af meta: update AUTHORS (#1067)
  • f3ede44 docs: fix dead link to logo image (#1069)

There are 27 commits in total.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

stream.push() after EOF

Hi, occasionally getting following error

Error: stream.push() after EOF 

sometime also seeing

TypeError: Cannot read property 'write' of null
  File "zlib.js", line 469, col 32, in Zlib.callback

The only package we use that references zlib is koa-compress, with the following configuration:

app.use(compress({
  flush: require('zlib').Z_SYNC_FLUSH
}))

using node 8.

Any idea on why this happens and how to fix this?

NodeJs v8 stream.push() after EOF

After upgrading to Node v8, I get this error in the console:

Error: stream.push() after EOF
    at readableAddChunk (_stream_readable.js:227:30)
    at Gzip.Readable.push (_stream_readable.js:195:10)
    at Gzip.Transform.push (_stream_transform.js:151:32)
    at Zlib.callback (zlib.js:430:16)

Brotli used, even if not in Accept-Encoding header (v4.0.0)

When sending Accept-Encoding: gzip or Accept-Encoding: identity, if the response size is more than the threshold, the response gets encoded with Brotli.

I used the default settings: app.use(compress());

I went back to 3.1.0 which works perfectly.

duplicate logic

Maybe you have some ideas on this, but I'd definitely like it if we didn't have to shadow the default stuff in Koa for these kinds of middleware; it'll definitely become error-prone (and just not fun to work with). Thankfully I don't think too many middleware will sit "under" like this one does. It was a decent PITA with connect as well, but maybe we can work on improving that.

Expand documentation

This middleware looks useful, but I'm not sure how to use it. Perhaps more examples with explanations would help?

I'd like to use brotli if the client accepts it, and fallback to gzip if not. In the readme, your example disables brotli and later states,

Brotli compression is supported in node v11.7.0+, which includes it natively.

Is brotli disabled because Node includes it natively or because that's just the example?

The current encodings are, in order of preference: br, gzip, deflate

Does the order of the encodings in the options object matter or is the "preference" always the same order?

Does one encoding fallback to another if not supported?

I'm unable to help with contributing these changes because I don't understand how it works, but I think answering these questions in the readme would help others understand. Or maybe it's just me..

Check that response hasn't been sent already

I know it's a little bit outside the Koa architecture, but say we want to handle the response ourselves: koa-compress will always log an ERR_HTTP_HEADERS_SENT.

Sure, I can delete the body after it's sent or do some other weird hack, but it seems like koa-compress should check ctx.headerSent or ctx.respond.

A use case for this is a next.js custom server; I'm making a next.js middleware to use in Strapi. Here's the middleware code:

const router = new Router();
const handle = nextApp.getRequestHandler();
router.get('*', async (ctx, next) => {
  await next();
  if (ctx.response.status === 404) {
    ctx.respond = false;
    await handle(ctx.req, ctx.res);
  }
});
strapi.app.use(router.routes());
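A guard along the lines the issue suggests might look like this (a hypothetical sketch of the check, not the library's code):

```javascript
app.use(async (ctx, next) => {
  await next()
  // Skip post-processing when the response was already handled manually.
  if (ctx.respond === false || ctx.headerSent) return
  // ... compression would happen here
})
```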

koa-compress not compressing response body.

I am using the koa-compress module to compress responses.

I have tried to use it in the following ways.

import compress from "koa-compress";
import { constants } from "zlib";

Method 1).
app.use(compress());

Method 2).
app.use(compress({
  filter: function (content_type) {
    return /json/i.test(content_type) || /text/i.test(content_type)
  },
  threshold: 1024,
  gzip: {
    flush: constants.Z_NO_FLUSH,
    level: constants.Z_BEST_COMPRESSION
  }
}));

The JSON response object I receive is not compressed.

I have tried sending Accept-Encoding as both application/gzip and gzip, but with no success.

I am currently using the following versions:

"@types/koa-compress": "^4.0.0",
"koa-compress": "^4.0.1",

An in-range update of eslint-plugin-flowtype is breaking the build 🚨

Version 2.47.0 of eslint-plugin-flowtype was just published.

Branch Build failing 🚨
Dependency eslint-plugin-flowtype
Current Version 2.46.3
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

eslint-plugin-flowtype is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • ❌ continuous-integration/travis-ci/push The Travis CI build could not complete due to an error Details

Release Notes v2.47.0

2.47.0 (2018-05-22)

Features

  • Refactor array-style-... rules (6320beec)
  • Improve error messages (b95dd31d)
  • Change default array notation for simple types to "verbose" (687f82be)
  • Implement fixation in array style rules (4a6f03d9)
  • Implement array style rules (afd42108)
Commits

The new version differs by 10 commits.

  • 45e86d8 Merge branch 'pnevyk-feat/array-style'
  • 1d664d7 docs: correct documentation
  • a916617 Merge branch 'master' into feat/array-style
  • 8f86c4b docs: add eslint-config-flowtype-essential (#328)
  • 6320bee feat: Refactor array-style-... rules
  • b95dd31 feat: Improve error messages
  • 687f82b feat: Change default array notation for simple types to "verbose"
  • 4a6f03d feat: Implement fixation in array style rules
  • afd4210 feat: Implement array style rules
  • 1232069 docs: Add documentation for array style rules

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Issue in v2.1.0

We are seeing issues with koa-compress v2.1.0. The issue is that the Node v8 upgrade was mixed into a minor release of the module, so there are references to async/await keywords in koa-compress@2.1.0.

See below code in index.js of the published version.

return async (ctx, next) => {
    ctx.vary('Accept-Encoding')

    await next()

Not using the correct encoding

Expected Behavior

By sending Accept-Encoding: gzip, deflate the response should be encoded with either gzip or deflate.

Current Behavior

If there are any other previous requests that included other encodings, the response may be encoded with an unexpected algorithm.

Possible Solution

  1. Don't share the encodingWeights map between multiple requests.

Steps to Reproduce (for bugs)

  1. Send a request with Accept-Encoding: gzip, deflate, br
  2. The response will have Content-Encoding: br
  3. Send another request with Accept-Encoding: gzip, deflate
  4. The new response will also have Content-Encoding: br
const Koa = require('koa');
const compress = require('koa-compress');
const axios = require('axios');

const app = new Koa();
app.use(compress());

app.use((ctx) => {
  ctx.body = {...}; // long json
});

app.listen(3000);

async function test() {
  const res1 = await axios.post('http://localhost:3000', {}, {
    headers: { 'Accept-Encoding': 'gzip, deflate, br' },
  });
  console.log(res1.headers['content-encoding']); // br

  const res2 = await axios.post('http://localhost:3000', {}, {
    headers: { 'Accept-Encoding': 'gzip, deflate' },
  });
  console.log(res2.headers['content-encoding']); // br
}
setTimeout(test, 1000);

Context

If the encoding isn't one of the expected algorithms, it will break clients.

Your Environment

  • Server: Node 14.2.0, koa-compress 4.0.0, koa 2.11.0
  • Client: Node 14.2.0, axios 0.19.2

npm Koa2 package

Hi, could you reversion v2 of this module and make it available on npm? Many thanks!

Default Brotli compression level is too slow

I've recently upgraded from 3.1.0 to 5.0.1 and without any other change, I noticed a huge slow down in my application.

Requests that previously took about 50ms suddenly took over 1s to complete. Downgrading to 3.1.0 again solved this issue.

I wonder if this has something to do with the new Brotli support? Since Node 10 doesn't actually support that natively, maybe it should be disabled there?

"flush" as it is used in tests

In __tests__/index.js there is a section:

it('should support Z_SYNC_FLUSH', (done) => {
  const app = new Koa()

  app.use(compress({
    flush: zlib.constants.Z_SYNC_FLUSH
  }))
  // and so on
})

Is flush really a top-level option? Or should it be:

app.use(compress({
  gzip: { // or deflate?
    flush: zlib.constants.Z_SYNC_FLUSH
  }
}))

The test succeeds anyway: with the existing code, with added gzip, even when the flush line is commented out.

I don't know what effects should be tested but it looks like they are not observed. I think it should be clarified and the test updated.
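If the nested form is the intended one, the option would presumably need to be spelled per-encoding, matching the README's own example (a sketch, not verified against the test suite):

```javascript
const zlib = require('zlib')

app.use(compress({
  gzip: { flush: zlib.constants.Z_SYNC_FLUSH },
  deflate: { flush: zlib.constants.Z_SYNC_FLUSH }
}))
```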

Response is not getting compressed or being corrupted?

this is my middleware code

app.use(compress({
  filter: function (content_type) { return /text/i.test(content_type) },
  threshold: 1,
  flush: require('zlib').Z_SYNC_FLUSH
}))

And this is my response code

ctx.body = 'Hello world'
ctx.compress = true
ctx.set('Content-Type', 'text/plain')
ctx.set('content-encoding', 'gzip')

now when I hit the url localhost:3000/test, I get the error saying

ERR_CONTENT_DECODING_FAILED

But when I hit the same URL using curl, I get the text

Hello World

I guess it's apparent that the data is not getting compressed; otherwise curl wouldn't have shown plain text. And I think the Chrome error is due to Content-Encoding being gzip while the actual data is plain text. Maybe I'm wrong, but there is clearly some problem in my code; otherwise Chrome should have shown Hello World without an error, shouldn't it?
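Assuming that diagnosis is right, a likely fix is to drop the manual content-encoding header and let the middleware negotiate and set it (a sketch):

```javascript
app.use(compress({
  filter: function (content_type) { return /text/i.test(content_type) },
  threshold: 1
}))

app.use((ctx) => {
  ctx.set('Content-Type', 'text/plain')
  ctx.body = 'Hello world'
  // No manual ctx.set('content-encoding', ...): koa-compress sets the
  // header only when it actually compresses the body.
})
```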

Q: `options.defaultEncoding` default `idenity`, typo?

The default for options.defaultEncoding is documented to be idenity. Is this a typo? The word "idenity" doesn't seem to be the name of anything I can find. If it's meant to be "identity", what does that even mean in this context?

Edit: I've come across other people referring to "no encoding" as identity encoding. I guess this jargon is like the identity function in fp.
