gios / gzipper

CLI for compressing files.

Home Page: https://www.npmjs.com/package/gzipper

License: GNU General Public License v3.0

JavaScript 0.65% CSS 0.12% HTML 0.06% TypeScript 99.17%
cli gzip compression zlib brotli nodejs algorithm deflate hacktoberfest

gzipper's Introduction

Gzipper


A tool for compressing files using the Deflate, Brotli, gzip, Zopfli and zstd algorithms. It works seamlessly with many CLI build tools (Angular CLI, Vue CLI, create-react-app).

Each algorithm can be fine-tuned with many options, including --gzip-level, --gzip-strategy, --gzip-memory-level, --brotli-param-mode, --brotli-quality and --brotli-size-hint. All options can also be set via ENV variables (ENV variables take priority over CLI arguments).

You can enable --verbose mode for a better visual representation, customize your file output using --output-file-format, or compress with the --incremental option if you have many files that rarely change.

By default gzipper compresses all supported files, but you can use the --include or --exclude options for finer control.

Install

  • Globally

    npm i gzipper -g

  • Locally to devDependencies.

    npm i gzipper -D

Usage

gzipper

Usage: gzipper [options] [command]

Options:
  -V, --version                             output the version number
  -h, --help                                display help for command

Commands:
  compress|c [options] <path> [outputPath]  compress selected path and optionally set output directory
  cache                                     manipulations with cache
  help [command]                            display help for command

compress|c

Usage: gzipper compress|c [options] <path> [outputPath]

compress selected path and optionally set output directory

Options:
  -v, --verbose                          detailed level of logs
  --incremental                          incremental compression
  -e, --exclude <extensions>             exclude file extensions from compression, example: jpeg,jpg...
  -i, --include <extensions>             include file extensions for compression, example: js,css,html...
  -t, --threshold <number>               exclude assets smaller than this byte size. 0 (default)
  --deflate                              enable deflate compression
  --brotli                               enable brotli compression
  --gzip                                 enable gzip compression
  --zopfli                               enable zopfli compression
  --zstd                                 enable zstd compression
  --gzip-level <number>                  gzip compression level 6 (default), 0 (no compression) - 9 (best compression)
  --gzip-memory-level <number>           amount of memory which will be allocated for gzip compression 8 (default), 1 (minimum memory) - 9 (maximum memory)
  --gzip-strategy <number>               gzip compression strategy 0 (default), 1 (filtered), 2 (huffman only), 3 (RLE), 4 (fixed)
  --deflate-level <number>               deflate compression level 6 (default), 0 (no compression) - 9 (best compression)
  --deflate-memory-level <number>        amount of memory which will be allocated for deflate compression 8 (default), 1 (minimum memory) - 9 (maximum memory)
  --deflate-strategy <number>            deflate compression strategy 0 (default), 1 (filtered), 2 (huffman only), 3 (RLE), 4 (fixed)
  --brotli-param-mode <value>            default, text (for UTF-8 text), font (for WOFF 2.0 fonts)
  --brotli-quality <number>              brotli compression quality 11 (default), 0 - 11
  --brotli-size-hint <number>            expected input size 0 (default)
  --zopfli-num-iterations <number>       maximum amount of times to rerun forward and backward pass to optimize LZ77 compression cost
  --zopfli-block-splitting               splits the data in multiple deflate blocks with optimal choice for the block boundaries
  --zopfli-block-splitting-max <number>  maximum amount of blocks to split into (0 for unlimited, but this can give extreme results that hurt compression on some files)
  --zstd-level <number>                  zstd compression level 1 (default), 5 (best compression)
  --output-file-format <value>           output file format with default artifacts [filename].[ext].[compressExt]
  --remove-larger                        remove compressed files if they are larger than uncompressed originals
  --skip-compressed                      skip compressed files if they already exist
  --workers <number>                     numbers of workers which will be spawned, system CPU cores count (default)
  --no-color                             disable logger colorful messages
  -h, --help                             display help for command

cache

Usage: gzipper cache [options] [command]

manipulations with cache

Options:
  -h, --help      display help for command

Commands:
  purge           purge cache storage
  size            size of cached resources
  help [command]  display help for command

Examples

CLI

  • Global usage

    gzipper compress [options] <path> [outputPath]

  • Local usage

    1. Add the module to scripts in your package.json and run the compress command with npm run compress

        "scripts": {
          "gzipper": "gzipper",
          "compress": "gzipper compress ./src"
        }
    2. Use npx command

        "scripts": {
          "compress": "npx gzipper compress ./src"
        }
  • UI build tools (e.g. Angular CLI)

      "scripts": {
        "build": "ng build && gzipper compress ./src"
      }
  • Compress files to a certain directory ./dist (the folder structure inside ./src will be preserved)

      "scripts": {
        "compress": "gzipper compress ./src ./dist"
      }
  • Compress files to a deeply nested folder ./very/deep/folder/dist (all folders will be created automatically if they don't exist)

      "scripts": {
        "compress": "gzipper compress ./src ./very/deep/folder/dist"
      }
  • Compress a single file

      "scripts": {
        "compress": "gzipper compress ./src/awesomeness.txt"
      }

Node.js Module

import { Compress } from "gzipper";
const gzip = new Compress('./src', './dist', {
  verbose: true,
  brotli: true,
});

try {
  const files = await gzip.run();
  console.info('Compressed files: ', files);
} catch (err) {
  console.error(err);
}

Options

compress|c

| CLI argument | ENV variable |
| ------------ | ------------ |
| --incremental | GZIPPER_INCREMENTAL (0 or 1) |
| -v, --verbose | GZIPPER_VERBOSE (0 or 1) |
| -e, --exclude <extensions> | GZIPPER_EXCLUDE |
| -i, --include <extensions> | GZIPPER_INCLUDE |
| -t, --threshold <number> | GZIPPER_THRESHOLD |
| --gzip | GZIPPER_GZIP (0 or 1) |
| --deflate | GZIPPER_DEFLATE (0 or 1) |
| --brotli | GZIPPER_BROTLI (0 or 1) |
| --zopfli | GZIPPER_ZOPFLI (0 or 1) |
| --zstd | GZIPPER_ZSTD (0 or 1) |
| --gzip-level <number> | GZIPPER_GZIP_LEVEL |
| --gzip-memory-level <number> | GZIPPER_GZIP_MEMORY_LEVEL |
| --gzip-strategy <number> | GZIPPER_GZIP_STRATEGY |
| --deflate-level <number> | GZIPPER_DEFLATE_LEVEL |
| --deflate-memory-level <number> | GZIPPER_DEFLATE_MEMORY_LEVEL |
| --deflate-strategy <number> | GZIPPER_DEFLATE_STRATEGY |
| --brotli-param-mode <value> | GZIPPER_BROTLI_PARAM_MODE |
| --brotli-quality <number> | GZIPPER_BROTLI_QUALITY |
| --brotli-size-hint <number> | GZIPPER_BROTLI_SIZE_HINT |
| --zopfli-num-iterations <number> | GZIPPER_ZOPFLI_NUM_ITERATIONS |
| --zopfli-block-splitting | GZIPPER_ZOPFLI_BLOCK_SPLITTING (0 or 1) |
| --zopfli-block-splitting-max <number> | GZIPPER_ZOPFLI_BLOCK_SPLITTING_MAX |
| --zstd-level <number> | GZIPPER_ZSTD_LEVEL |
| --output-file-format <value> | GZIPPER_OUTPUT_FILE_FORMAT |
| --remove-larger | GZIPPER_REMOVE_LARGER (0 or 1) |
| --skip-compressed | GZIPPER_SKIP_COMPRESSED (0 or 1) |
| --workers <number> | GZIPPER_WORKERS |
| --no-color | GZIPPER_NO_COLOR or NO_COLOR (0 or 1) |

ENV variables have higher priority over CLI arguments.
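As a sketch, this precedence rule boils down to checking the environment first and falling back to the CLI value (a hypothetical helper for illustration — resolveOption is an invented name, not gzipper's actual code):

```javascript
// Hypothetical helper illustrating ENV-over-CLI precedence.
// `resolveOption` is an invented name, not part of gzipper's API.
function resolveOption(envName, cliValue, parse = Number) {
  const envValue = process.env[envName];
  // When both are set, the ENV variable wins over the CLI argument.
  return envValue !== undefined ? parse(envValue) : cliValue;
}

process.env.GZIPPER_GZIP_LEVEL = '9';
console.log(resolveOption('GZIPPER_GZIP_LEVEL', 6)); // 9 (ENV wins)

delete process.env.GZIPPER_GZIP_LEVEL;
console.log(resolveOption('GZIPPER_GZIP_LEVEL', 6)); // 6 (falls back to CLI)
```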

--incremental

gzipper c ./src --incremental

A special type of compression that significantly decreases compression time (on the second run) if you have a lot of big, rarely updated files. It creates a .gzipper folder with pre-compressed files (the cache) and a config file (.gzipperconfig) that stores all necessary metadata.

-v, --verbose

gzipper c ./src --verbose

Print more information about the executed work. (May increase compression time because additional metrics are gathered.)

-e, --exclude <extensions>

gzipper c ./src --exclude jpeg,jpg,png,ico

Exclude file extensions from compression. Compression extensions br, gz, zz and zst are excluded by default.

-i, --include <extensions>

gzipper c ./src --include js,css,html

Include file extensions for compression (exclude others).

-t, --threshold <number>

gzipper c ./src --threshold 900

Exclude assets smaller than this byte size. The default is 0 bytes.

--gzip

gzipper c ./src --gzip

Enable gzip compression. (default behavior)

--deflate

gzipper c ./src --deflate

Enable Deflate compression.

--brotli

gzipper c ./src --brotli

Enable Brotli compression.

--zopfli

gzipper c ./src --zopfli

Enable Zopfli compression.

--zstd

gzipper c ./src --zstd

Enable Zstandard compression.

--gzip-level <number>

gzipper c ./src --gzip-level 8

gzip compression level: 6 (default), 0 (no compression) - 9 (best compression). Only for --gzip.

--gzip-memory-level <number>

gzipper c ./src --gzip-memory-level 2

Amount of memory that will be allocated for gzip compression: 8 (default), 1 (minimum memory) - 9 (maximum memory). Only for --gzip.

--gzip-strategy <number>

gzipper c ./src --gzip-strategy 3

gzip compression strategy: 0 (default), 1 (filtered), 2 (huffman only), 3 (RLE), 4 (fixed). Only for --gzip.

--deflate-level <number>

gzipper c ./src --deflate-level 8

Deflate compression level: 6 (default), 0 (no compression) - 9 (best compression). Only for --deflate.

--deflate-memory-level <number>

gzipper c ./src --deflate-memory-level 2

Amount of memory that will be allocated for deflate compression: 8 (default), 1 (minimum memory) - 9 (maximum memory). Only for --deflate.

--deflate-strategy <number>

gzipper c ./src --deflate-strategy 3

Deflate compression strategy: 0 (default), 1 (filtered), 2 (huffman only), 3 (RLE), 4 (fixed). Only for --deflate.

--brotli-param-mode <value>

gzipper c ./src --brotli-param-mode text

Available values are: text (for UTF-8 text, default) and font (for WOFF 2.0 fonts). Only for --brotli.

--brotli-quality <number>

gzipper c ./src --brotli-quality 10

Brotli compression quality: 11 (default), 0 - 11. Only for --brotli.

--brotli-size-hint <number>

gzipper c ./src --brotli-size-hint 6

Estimated total input size for all files to compress: 0 (default, which means that the size is unknown). Only for --brotli.

--zopfli-num-iterations <number>

gzipper c ./src --zopfli-num-iterations 15

Maximum number of times to rerun the forward and backward pass to optimize LZ77 compression cost. Good values: 10 to 15 for small files; use 5 for files over several MB in size, or compression will be too slow. Only for --zopfli.

--zopfli-block-splitting

gzipper c ./src --zopfli-block-splitting

If true, splits the data in multiple deflate blocks with optimal choice for the block boundaries. Block splitting gives better compression. Only for --zopfli.

--zopfli-block-splitting-max <number>

gzipper c ./src --zopfli-block-splitting-max 5

Maximum amount of blocks to split into. 0 for unlimited, but this can give extreme results that hurt compression on some files. Only for --zopfli.

--zstd-level <number>

gzipper c ./src --zstd-level 8

Zstd compression level: 1 (default), 5 (best compression). Only for --zstd.

--output-file-format <value>

Customize the output file name. The default format is [filename].[ext].[compressExt], where filename is the name of your file, ext is the file extension, compressExt is the compression extension (gz, br, etc.) and hash is a unique hash.

Example: Expected project structure.

img
  rabbit.jpg
  cat.jpg
js
  main.js
  modules.js
xml
  main.xml
index.js
  • gzipper c ./src --output-file-format [filename].[compressExt].[ext]

    img
      rabbit.jpg
      rabbit.gz.jpg
      cat.jpg
      cat.gz.jpg
    js
      main.js
      main.gz.js
      modules.js
      modules.gz.js
    xml
      main.xml
      main.gz.xml
    index.js
    index.gz.js
    
  • gzipper c ./src --output-file-format test-[filename]-[hash].[compressExt].[ext]

    img
      rabbit.jpg
      cat.jpg
      test-rabbit-b4564011-ba7c-4bd6-834d-bf6c7791b7d4.gz.jpg
      test-cat-739c7d7d-53ca-4f8e-912c-bad3b2b515a9.gz.jpg
    js
      main.js
      modules.js
      test-main-4cc35dbd-36f7-4889-9f41-4d93e7a25bef.gz.js
      test-modules-bce90cbd-5bf2-43c2-8b61-33aa1599b704.gz.js
    xml
      main.xml
      test-main-a90fa10e-f7a4-4af9-af67-f887bb96f98b.gz.xml
    index.js
    test-index-067c1e2d-0e12-4b57-980b-97c880c24d57.gz.js
    
  • gzipper c ./src --output-file-format [filename]-[hash]-[filename]-tmp.[ext].[compressExt]

    img
      rabbit.jpg
      rabbit-b4564011-ba7c-4bd6-834d-bf6c7791b7d4-rabbit-tmp.jpg.gz
      cat.jpg
      cat-739c7d7d-53ca-4f8e-912c-bad3b2b515a9-cat-tmp.jpg.gz
    js
      main.js
      main-4cc35dbd-36f7-4889-9f41-4d93e7a25bef-main-tmp.js.gz
      modules.js
      modules-bce90cbd-5bf2-43c2-8b61-33aa1599b704-modules-tmp.js.gz
    xml
      main.xml
      main-a90fa10e-f7a4-4af9-af67-f887bb96f98b-main-tmp.xml.gz
    index.js
    index-067c1e2d-0e12-4b57-980b-97c880c24d57-index-tmp.js.gz
    

--remove-larger

Removes compressed files that are larger than the uncompressed originals in your directory.

--skip-compressed

Skips files whose compressed versions already exist in your directory. Works only with the default --output-file-format.

--workers <number>

Spawns workers for parallel compression. Be mindful of the worker count, because every worker creates an additional thread. More info at nodesource.com.

--no-color

Disable logger colorful messages.

cache

| Command |
| ------- |
| purge   |
| size    |

purge

gzipper cache purge

Removes all pre-compressed files from the cache that were generated via the --incremental argument.

size

gzipper cache size

Returns the size of all pre-compressed files in the cache.

Changelog

CHANGELOG.md

Contribution

Every contribution is appreciated: just fork the repository and send a pull request with your changes.

Support

  • Node.js >= 20.11.0

Prerequisites

If you want to use --zstd compression, make sure the appropriate library is installed and available in your environment:

where zstd.exe   # Windows
command -v zstd  # MacOS/Linux

If the zstd executable isn't found, install it manually:

  • macOS, using Homebrew

    brew install zstd   # zstd only
    brew install zlib   # whole library
  • Windows Subsystem for Linux (WSL), Ubuntu, Debian... using APT

    sudo apt install zstd

gzipper's People

Contributors

aaboyles, dependabot[bot], gios, jasonraimondi, jonahsnider, kubk, micheartin


gzipper's Issues

Error when using `-e` or `--exclude` with file extensions

I see in the "example" it says example: jpeg,jpg..., but when I try gzipper -e jpg, I get gzipper: can't find a path.

I assume my syntax is wrong? But I'm not sure what it should be. I would like to exclude all image files from any zipping.

gzipper -e jpg png svg jpeg gives the error above, as well as adds new folders called with the names after jpg. I assume it's trying to save these things in that folder?

Setting gzip level doesn't work

If you run the gzipper CLI with npx, using the -gl option seems to have no effect.

npx gzipper -gl 9 build/static/js/

gzipper: GZIP -> gzipLevel: NaN, gzipMemoryLevel: NaN, gzipStrategy: NaN

If you use the --gzip-level option, the option is put into the output.

npx gzipper --gzip-level 9 build/static/js/

gzipper: GZIP -> gzipLevel: 9, gzipMemoryLevel: NaN, gzipStrategy: NaN

However, setting --gzip-level 1 and --gzip-level 9 does not change the size of the file output - when I run the gzip command in bash with -1 vs. -9 on the same files, it makes a fairly large difference.

Looking at the zlib docs for NPM - https://nodejs.org/api/zlib.html#zlib_class_options - it seems that the options needing to be set are level, memLevel, and strategy, but looking at the source for this library, they're being set to gzipLevel, gzipMemoryLevel, and gzipStrategy, so they're not taking effect.

README missing information

Doesn't tell:

  • what's the default output file name format,
  • whether it removes original files such as the gzip utility or places compressed files side-by-side with the originals.

UnhandledPromiseRejectionWarning when permission errors happen

This may be useful to handle/fix:

$ gzipper compress -v --level 9 --brotli --remove-larger /app
(node:2) UnhandledPromiseRejectionWarning: Error: EACCES: permission denied, open '/app/threema-web-2.3.14-gh/emoji/png64/1f1e9-1f1f2.png.br'
(Use `node --trace-warnings ...` to show where the warning was created)
(node:2) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)

I do have /app permission errors (wrong chown i.e. no access), so maybe that is the exception you do not catch anywhere… 🙃

Node v14.16.1

add compression settings

Could you please add compression settings support?
For example reading settings from package.json

  1. Gzip settings could be exposed, at least compression lvevl
  2. optional use of brotli would be nice

Settings could be stored in package.json or in separate .json, or in env...

Change in output name format

Hi,

Just wanted to check if we have any option currently to change the output format.

For example,
The current output format is
[filename].[filetype].gz
can we have an output of the format like
[filename].[gz].[filetype]

If so, shall i make a pull request. Please confirm

Support for compressing with multiple CPUs

Hi

I'm working on a project that requires compressing 50k files to prepare a build, and the number will only grow.
At the moment gzipper seems to use only one CPU, making the compression step quite long.
Is it possible to add support for multiple CPUs to improve performance on large folders?

Request --min-gain 1024

The Brotli plugin for Webpack has an interesting minRatio parameter:
https://www.npmjs.com/package/brotli-webpack-plugin

minRatio: Only assets that compress better than this ratio are processed. Defaults to 0.8.

We might add the option --min-ratio {value}. Note: the option --min-ratio 1 is the same as --remove-larger. This option applies to Gzip and to Brotli compressors.

However, we can also think further:

  • What about large files? Let's say 800 KB compressed file (original size is 1000 KB). The value --min-ratio 0.8 excludes this compressed file and the web server does not save 200 KB in local cache and bandwidth.

  • What about tiny files? Let's say 79 bytes compressed file (original size is 100 bytes). The value --min-ratio 0.8 keeps this compressed file, but for a ridiculous gain of 21 bytes while consuming CPU for decompression on the client side. Thanks to the option --threshold 1000, we can prevent this corner case.

Therefore, I propose the option:

--min-gain {bytes}

This option can also avoid using the options --remove-larger and --threshold {xxx}:

  1. --min-gain 1 (1 byte) is the same as --remove-larger
  2. --min-gain 100, by design, excludes all files smaller than 100 bytes

To clarify this last point, --min-gain 100 also excludes some files larger than 100 bytes: the smaller files that are not well compressed. For example, a PNG file of 200 KB compressed to 101 KB will be excluded by --min-gain 100.

From my personal point of view, --min-gain is more interesting than --threshold. For example, --threshold 1000 may exclude an index.html (990 bytes) and its index.html.br (80 bytes). It looks better to use something like --min-gain 900.

Default value 0 (or -1) --> Disable this option.

It is up to you whether you implement --min-ratio or --min-gain or both. However, in my personal opinion, --min-gain is superior to --remove-larger, --threshold and --min-ratio. I think --min-ratio is not necessary if --min-gain is implemented.
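The proposed rule reduces to a single comparison; a sketch of the (hypothetical, unimplemented) --min-gain check, using the numbers from the examples above:

```javascript
// Hypothetical sketch of the proposed --min-gain rule: keep a
// compressed file only if it saves at least `minGain` bytes.
function keepCompressed(originalSize, compressedSize, minGain) {
  return originalSize - compressedSize >= minGain;
}

console.log(keepCompressed(1000 * 1024, 800 * 1024, 100)); // true: saves 200 KB
console.log(keepCompressed(100, 79, 100));  // false: saves only 21 bytes
console.log(keepCompressed(100, 101, 1));   // false: behaves like --remove-larger
```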

question about how gzipper can help me

Hello

I have a question about how gzipper can help me ..

In angular it generates the dist folder with the .zip file and the normal one. Is the angular smart enough to get the smallest file?

Otherwise my dist folder will get even bigger.

Output file same as original

Compressing a file to overwrite the original for quick S3 uploads of HTML / JS / CSS overwrites the file before reading it, causing gzipper to zip up an empty file.

When serving a website via S3, you upload the file with the original extension and just put the content encoding as gzip, in this instance gzipper does not work because it creates the new file first, rather than compressing in memory.

Delete if larger than uncompressed original

It would be nice to have an option for keeping only compressed files that are smaller than the uncompressed original. In some cases, especially with GZip compression, the compressed file will be larger than the uncompressed one.

Proposal - add a CLI flag to filter file extensions

Hi, thanks for the library. Gzipper already supports the --exclude option for excluding file extensions from compression, which is great. It would be nice to also have an --include option for filtering file extensions. It would allow using gzipper --include=html,js,css instead of gzipper --exclude=txt,ttf,eot,woff,woff2,svg,jpg,png...

Can make a PR.

Question: Will be there some caching option?

Hi devs,

Do you plan some caching mechanism as an option, so incremental builds could have been boosted?

I have installed this tool to use in most of our projects and it works great ( surprisingly fast ) for both gzip and brotli. However we have some media-heavy projects that are having rarely changing audio/video assets and that would be a big up if their compression could have been cached in a specified folder.

We do build our apps in clean folder so keeping previous .gz and .br files is not an option for us.

Thanks,
Danny

Running gzipper on .dist/ to ./zipped archives the files, but individually, not in a single archive!!

As the title says, the library works but archives each file individually.
Am I doing something wrong? My npm script is:
"build": "react-scripts build && gzipper compress ./build ./zipped",

ls -l zipped 
total 96
-rw-r--r--  1 razvan.marian  1985064453   326 Dec  8 19:55 asset-manifest.json.gz
-rw-r--r--  1 razvan.marian  1985064453  3461 Dec  8 19:55 favicon.ico.gz
-rw-r--r--  1 razvan.marian  1985064453  1509 Dec  8 19:55 index.html.gz
-rw-r--r--  1 razvan.marian  1985064453  5332 Dec  8 19:55 logo192.png.gz
-rw-r--r--  1 razvan.marian  1985064453  9660 Dec  8 19:55 logo512.png.gz
-rw-r--r--  1 razvan.marian  1985064453   250 Dec  8 19:55 manifest.json.gz
-rw-r--r--  1 razvan.marian  1985064453   331 Dec  8 19:55 precache-manifest.01566b99c51188dfed91c26c7df6f1ca.js.gz
-rw-r--r--  1 razvan.marian  1985064453    78 Dec  8 19:55 robots.txt.gz
-rw-r--r--  1 razvan.marian  1985064453   684 Dec  8 19:55 service-worker.js.gz
drwxr-xr-x  4 razvan.marian  1985064453   128 Dec  8 19:55 static

Update 6.1.0 is a breaking change due to `node-zopfli`

The recent 6.1.0 update added a dependency on node-zopfli. This is a native dependency that broke my builds because it adds new build-time requirements. Here are the logs:

See logs
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR prebuild-install WARN install No prebuilt binaries found (target=3 runtime=napi arch=x64 libc=musl platform=linux)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR prebuild-install WARN install No prebuilt binaries found (target=14.18.1 runtime=node arch=x64 libc=musl platform=linux)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp info it worked if it ends with ok
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp info using node-gyp@8.4.1
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp info using node@14.18.1 | linux | x64
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python 
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python Python is not set from command line or npm configuration
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python Python is not set from environment variable PYTHON
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python checking if "python3" can be used
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python - "python3" is not in PATH or produced an error
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python checking if "python" can be used
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python - "python" is not in PATH or produced an error
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python 
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python **********************************************************
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python You need to install the latest version of Python.
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python Node-gyp should be able to find and use Python. If not,
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python you can try one of the following options:
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python - Use the switch --python="/path/to/pythonexecutable"
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python   (accepted by both node-gyp and npm)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python - Set the environment variable PYTHON
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python - Set the npm configuration variable python:
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python   npm config set python "/path/to/pythonexecutable"
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python For more information consult the documentation at:
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python https://github.com/nodejs/node-gyp#installation
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python **********************************************************
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! find Python 
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! configure error 
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack Error: Could not find any Python installation to use
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at PythonFinder.fail (/builds/demurgos/etwin/node_modules/node-gyp/lib/find-python.js:330:47)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at PythonFinder.runChecks (/builds/demurgos/etwin/node_modules/node-gyp/lib/find-python.js:159:21)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at PythonFinder.<anonymous> (/builds/demurgos/etwin/node_modules/node-gyp/lib/find-python.js:202:16)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at PythonFinder.execFileCallback (/builds/demurgos/etwin/node_modules/node-gyp/lib/find-python.js:294:16)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at exithandler (child_process.js:390:5)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at ChildProcess.errorhandler (child_process.js:402:5)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at ChildProcess.emit (events.js:400:28)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:280:12)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at onErrorNT (internal/child_process.js:469:16)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! stack     at processTicksAndRejections (internal/process/task_queues.js:82:21)
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! System Linux 5.4.109+
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! command "/usr/bin/node" "/builds/demurgos/etwin/node_modules/node-gyp/bin/node-gyp.js" "rebuild"
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! cwd /builds/demurgos/etwin/node_modules/node-zopfli
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! node -v v14.18.1
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! node-gyp -v v8.4.1
➤ YN0000: │ node-zopfli@npm:2.1.4 STDERR gyp ERR! not ok 
➤ YN0009: │ node-zopfli@npm:2.1.4 couldn't be built successfully (exit code 1, logs can be found here: /tmp/xfs-18adf4dd/build.log)

I will try to fix my build environment, but you should be aware of the issue. I am not quite sure what can be done on the gzipper side. It seems a bit too late to release a 7.x version as the consumers using ^6.x.y are already affected. The best solution would be to use a pure node zopfli implementation.

Skip file if compressed version already exist

It would be nice to be able to only compress files where the corresponding compressed version does not exist.

For example, when file image.png is the current file to compress but image.png.gz exists it would skip that file and continue with the next. This would help in instances where static files are pre-compressed but mixed in with other non pre compressed files.
The flag could be something like --skip-compressed or maybe --skip-pre-compressed. Maybe even --skip-when-output-exists or something more specific. Proposed flag names are a bit long but the function is hopefully clear.

Thank you for this great tool!

Can't turn off logs completely

Basically there is no silent argument or something like that, so this part always fires

// In src/Compress.ts
Logger.log(
  `${filesCount} ${
    filesCount > 1 ? 'files have' : 'file has'
  } been compressed. (${Helpers.readableHrtime(hrtime)})`,
  LogLevel.SUCCESS,
);

Is there any way to make it work, or should there be a switch? Expecting something like silent: true

Can gzipper be used on a single file?

Help wanted

The documentation seems to indicate I can only zip on a directory.

I'd ideally like to only do this on a single file so I don't need to create any sub-directories. Did I miss something in the README or is this an unsupported feature right now?

Thanks.

/usr/bin/env: ‘node\r’: No such file or directory

Hello,

After bumping from 1.4.2 to 1.5.1, compression fails (on linux) with this message. Same error with latest v2.x release.

$ gzipper ./dist
/usr/bin/env: ‘node\r’: No such file or directory  
error Command failed with exit code 127.  

node version : 10.15.1

Sticking to 1.4.2 fixes the problem.

Thanks for investigating !

Add option for threshold / minFileSize

Feature request
Thank you for this very useful project!
It would be great to add an option to exclude files below a certain threshold, e.g. only compress files larger than 200 bytes.
More or less the same feature as offered by brotli-webpack-plugin.

I guess it can be implemented easily here:
https://github.com/gios/gzipper/blob/master/src/Gzipper.js

If you do not have this on your roadmap anyway, I could provide a PR, if you are interested?

Brotli quality default not correct

If I run gzipper using npx to do brotli compression:

npx gzipper --brotli build/static/js/

My files are compressed, but the size is larger than my gzipped files. The documentation states that the default for --brotli-quality is 11, but it doesn't seem to work that way. If I explicitly put --brotli-quality 11 on the command line, it then compresses smaller than my gzipped files and seems to work correctly.

npx gzipper --brotli --brotli-quality 11 build/static/js/

How to include all files regardless of extension?

I would like to use gzipper to compress all the files in my directory, regardless of their extension. I would prefer to avoid listing all the possible extensions manually: just compress everything in the directory.

It seems that only a handful of extensions are enabled by default, and there doesn't seem to be any way to just ask for "any extension".

export const VALID_EXTENSIONS = [
  '.js',
  '.css',
  '.html',
  '.png',
  '.jpg',
  '.jpeg',
  '.webp',
  '.svg',
  '.json',
  '.csv',
  '.txt',
  '.xml',
  '.ico',
  '.md',
];

In particular, I expect the following command to mean "compress everything in ./app/browser, except for .gz files".

gzipper --exclude gz ./dist

This currently ignores .gif files, among others.

AWS Lambda error

I am getting an error when running the code sample below on AWS Lambda with Node version 16. Locally it works fine.

The gzipper version is 7.2.0.

const { Compress } = require('gzipper');
const gzip = new Compress('./src', './dist', {
  verbose: true,
  brotli: true,
  deflate: true,
});

try {
  const files = await gzip.run();
  console.info('Compressed files: ', files);
} catch (err) {
  console.error(err);
}

Error logs:

Error: Initiated Worker with invalid execArgv flags: --expose-gc, --max-semi-space-size=102, --max-old-space-size=1844
    at Compress.<anonymous> (/var/task/node_modules/gzipper/src/Compress.js:184:31)
    at step (/var/task/node_modules/gzipper/src/Compress.js:44:23)
    at Object.throw (/var/task/node_modules/gzipper/src/Compress.js:25:53)
    at rejected (/var/task/node_modules/gzipper/src/Compress.js:17:65)

Request --brotli-param-mode best

I love gzipper and its option --incremental & cache.

In my case (see below), all files (except mainboard.js) are better compressed with --brotli-param-mode font (up to 2% better compression ratio). I would like to get the best of both modes (--brotli-param-mode text and --brotli-param-mode font). I propose to add the best mode:

--brotli-param-mode best

The help should warn about the small gain (often a couple of bytes) while doubling the whole compression time. However, thanks to --incremental we don't care about the compression time, except in CI/CD environments without a gzipper cache.

$ yarnpkg gzipper c --brotli --brotli-param-mode text ./dist text
$ yarnpkg gzipper c --brotli --brotli-param-mode font ./dist font
$ du -bac text font | sort -n | grep br$
360	font/index.html.br
367	text/index.html.br
870	font/js/about.js.br
873	text/js/about.js.br
4618	font/favicon.ico.br
4661	text/favicon.ico.br
13625	text/js/mainboard.js.br
13629	font/js/mainboard.js.br
16834	font/js/app.js.br
16862	text/js/app.js.br
147711	font/js/chunk-vendors.js.br
147764	text/js/chunk-vendors.js.br

gzipper: command not found

Hi,

I am getting "gzipper: command not found" when I run any gzipper command.

I had installed it using the following command:
npm i gzipper -g

The result shows that the package was installed, but I am unable to call any of the gzipper commands (even gzipper --help).

Kindly let me know how to fix this.

VALID_EXTENSIONS doesn't contain all compressible web specific extensions

export const VALID_EXTENSIONS = [
  '.js',
  '.css',
  '.html',
  '.png',
  '.jpg',
  '.jpeg',
  '.webp',
  '.svg',
  '.json',
  '.csv',
];

The above prevents me from compressing files such as .txt, .xml and .ico even if I include them using --include.

Should we add more extensions in this array? Or is there a reason why this array has so few extensions?

Enable gzip and brotli with one command

Hello,

Pre-compressing assets with gzip and with brotli is a common requirement (as long as one wants to serve optimized assets to IE11).

Right now, I have the following setup to achieve this (using npm-run-all instead of && so that it also works on Windows):

  "scripts": {
    "build": "ng build",
    "compress:gzip": "gzipper --verbose dist",
    "compress:brotli": "gzipper --brotli --verbose dist",
    "postbuild": "npm-run-all --parallel compress:*"
  },

However, if gzipper could do this in one CLI call (e.g. with something like a --gzip flag so that gzip and brotli can both be enabled), I could get rid of npm-run-all and make the scripts much simpler:

  "scripts": {
    "build": "ng build",
    "postbuild": "gzipper --gzip --brotli --verbose dist"
  },

Does that make sense?
Not sure how easy it would be to implement, as the code currently only returns one instance of a compressor...

Great lib by the way, thanks a lot!
Pablo

Font files are not compressed by default

Hi,

I am using gzipper to compress my build files, but it looks like it ignores font files with extensions (ttf, woff, woff2, eot) by default. I am using this command to compress:

gzipper --verbose ./dist ./compressed

I have to explicitly define formats in the --include option:

gzipper --verbose ./dist ./compressed --include ttf,woff2,woff,txt,svg,json,png,jpg,jpeg,gif,js,html,css,scss,eot,ico

Wouldn't it be better if it compressed all file formats by default?

Files without an extension get a wrong output name

For a file without an extension, the compressed file's name ends in "..br" (Brotli); using --output-file-format [filename].[compressExt].[ext] it ends in ".br." instead.

Would it be possible to add an [ext?] placeholder, so that --output-file-format [filename].[compressExt][ext?] produces ".br" (without an extension) or ".br.xx" (with one)?

zopfli Support

Hi,

I just found out about gzipper and wonder if you have ever considered supporting zopfli in addition to zlib? node-zopfli provides Node.js bindings.

zip files not working

Hi, I am trying to use gzipper. When I run "ng build && gzipper --verbose ./dist", it creates all files with a .gz extension, but after ng serve my server still uses the uncompressed files. Please suggest where I have to set the gzip content encoding for main, polyfills, etc. files.
Thanks!

--include doesn't work for multiple extensions

I'm trying to compress some files generated from an angular build.

It seems that in version 4.0.1 the --include option does not work properly with multiple extensions.

Add support for compressing html files

Javascript and CSS are a good start, but it would be helpful to be able to compress html files as well, as they can get large in complex apps (especially if they have data baked in).

Allow multiple algorithms in single command from CLI

It'd be really useful to be able to specify multiple algorithms in a single execution of the command-line tool.

I.e. if I specify --gzip --brotli, it runs both brotli and gzip. Omitting both would do gzip as it does now, for backwards compatibility.

Currently this is possible through the Node API, or by just running the tool twice, but it'd be cleaner if it handled this itself.

Additional option --copy-original-files when outputPath is provided

Thanks a lot for this amazing repo!

I'm currently setting up a GitHub Action based on gzipper and noticed that, so far, there is no option to simply copy the original files into the (newly created) outputPath directory first. This would come in really handy for immediate deployment, however.

Hence I propose --copy-original-files flag for this purpose as a possible enhancement.

If anyone needs a workaround, follow these steps:

  1. Copy the uncompressed, original files to the compressed-branch
  2. Use this branch as in- and output (in my case gh-pages)

Step 2. can be done like this:

      # check out other branch (gh-pages) 
      - uses: actions/checkout@v2
        with:
          ref: gh-pages

      # compress all files in all directories
      - run: gzipper compress --include htm,html,css,js,svg,xml,map,json,img,png,jpg,jpeg --zopfli --brotli --remove-larger .

      # commit
      - uses: stefanzweifel/git-auto-commit-action@v4
        with:
          commit_message: Compress Files
          branch: gh-pages
          push_options: '--force'

Find the full action with mkdocs deploy here.
