
node-stream-zip's Introduction

👋 Hallo!

Whois:

  • 💰 full-stack developer @ booking.com
  • 🔨 most of my open-source is in JavaScript and C++
  • 🕸️ always learning WebThis or WebThat
  • 🌱 open technologies make our world better
  • 📫 best way to ask about my projects is via GitHub issues
  • 🅰️ website: antelle.net
  • 📧 e-mail: [email protected]

Open-source projects I made:

node-stream-zip's People

Contributors

aearly, anupamjuniwal, blzaugg, brianxb, disconova, frenzzy, smulesoft, sregger, terite, yavanosta


node-stream-zip's Issues

Error handling in TypeScript

Hello,

I'm trying to use this library in a Node.js TypeScript project. I want to throw a bad request exception (a built-in NestJS exception that replies to the client's request with a 400 Bad Request error). But when I throw the exception inside the error handler, the app stops and the error is logged to the console instead of the HTTP error being sent to the client. How can I solve this? BTW, I used this type definition file.

import { HttpException, HttpStatus, Injectable, BadRequestException } from "@nestjs/common";
import StreamZip from "node-stream-zip";
import { BufferedFile } from "./file.model";

@Injectable()
export class ImportService {

    async uploadBatchArchive(batchLabel: string, triggerFileExtension: string, file: BufferedFile) {

        if (!(file.mimetype.includes('zip'))) {
            throw new BadRequestException('Zip file expected.')
        }
       
        let triggerFile: string;

        const zip = new StreamZip({
            file: `./uploads/${file.originalname}`,
            storeEntries: true
        });

        zip.on('error', function (err) {
            throw new BadRequestException(err.message);
        });

        zip.on('ready', () => {
            console.log('Entries read: ' + zip.entriesCount);


            for (let entry of Object.values(zip.entries())) {
                if (entry.isFile && entry.name.split('.').pop().toLowerCase() === triggerFileExtension.toLowerCase()) {
                    triggerFile = entry.name;
                }
                const desc = entry.isDirectory ? 'directory' : `${entry.size} bytes`;
                console.log(`Entry ${entry.name}: ${desc}`);
            }

            // TODO Extract the trigger file here and transfer it.
            if (triggerFile == undefined) {
                throw new BadRequestException(`No trigger file found.`);
            }

            zip.close();
        });
    }
}

TypeScript error with .extract(null, ...)

Hi, I've been using this for a while now, but while trying to update some packages, this error popped up because the extract function's type definition requires entry to be a string.

So, for now, do we pass in an empty string to extract all entries?

Thanks.

Is there an example of streaming unzipped file to S3?

I am trying to stream data to S3. Here is my code:

zip.on("ready", () => {
  zip.stream(fileName, (error, zstream) => {
    zstream.pipe(this.uploadFromStream(bucket, fileName));
    ...

When I try to upload
const uploadPromise = this.S3.upload({ Bucket, Key, Body: writeStream }).promise();

I am getting Error: Cannot determine length of [object Object]

Accept stream in

We've got zip files stored in GridFS; it would be nice to be able to pass the stream into this.

Want an "entryDataAsync" function

I'm using this lib to read backup files in Node.js with TypeScript.
I have to read many small BSON files (about 1-15 MB) into memory from a zip archive.

I found the entryDataSync function, but no async version.
I don't want the read operation to block the whole process during disk reads.

Can you add that function? Thanks very much.
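Until such a function exists, an async read can be layered on top of the library's existing zip.stream() callback API by buffering the stream's chunks; a minimal sketch (the helper name `entryDataAsync` is ours, not part of the library):

```javascript
// Read one entry fully into a Buffer without blocking the event loop,
// built on node-stream-zip's zip.stream(name, callback) API.
function entryDataAsync(zip, entryName) {
  return new Promise((resolve, reject) => {
    zip.stream(entryName, (err, stream) => {
      if (err) return reject(err);
      const chunks = [];
      stream.on('data', (chunk) => chunks.push(chunk));
      stream.on('end', () => resolve(Buffer.concat(chunks)));
      stream.on('error', reject);
    });
  });
}
```

Unlike entryDataSync, this yields to the event loop between chunks, so disk reads do not stall the whole process.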

Read an Existing Stream

I'd like to be able to read a zip file stream coming over the network without having to save it to disk. This doesn't seem possible with the library, but would be a very helpful addition.

Document usage of zip.close()

Source:
https://github.com/antelle/node-stream-zip/blob/master/node_stream_zip.js#L567-L573

zip.on('ready', function() {
    console.log('Entries read: ' + zip.entriesCount);
    // extract file
    zip.extract('node/benchmark/net/tcp-raw-c2s.js', './temp/', function(err) {
        console.log('Entry extracted');
        zip.close();
    });
    // extract folder
    zip.extract('node/benchmark/', './temp/', function(err, count) {
        console.log('Extracted ' + count + ' entries');
        zip.close();
    });
    // extract all
    zip.extract(null, './temp/', function(err, count) {
        console.log('Extracted ' + count + ' entries');
        zip.close();
    });
    // read file as buffer in sync way
    var data = zip.entryDataSync('README.md');
});

Memory usage?

Hi!
What kind of memory usage is "normal" or "expected"? I'm using this library to extract zip files of about 400 MB, and my node process uses >150 MB of memory ({ rss: 152350720, heapTotal: 30214176, heapUsed: 16568248 }). That sounds like a lot, right?
Thanks!

Index out of range error

When running:
node server.js 21154fcf-1677-4a94-8ae4-587d42164d15.zip

I got the following output:

Opening zip: 21154fcf-1677-4a94-8ae4-587d42164d15.zip
buffer.js:816
    throw new RangeError('Index out of range');
    ^

RangeError: Index out of range
    at checkOffset (buffer.js:816:11)
    at Buffer.readUInt32LE (buffer.js:878:5)
    at FsRead.readUntilFoundCallback [as callback] (D:\output\node_modules\node-stream-zip\node_stream_zip.js:190:28)
    at FsRead.readCallback (D:\output\node_modules\node-stream-zip\node_stream_zip.js:800:21)
    at FSReqWrap.wrapper [as oncomplete] (fs.js:681:17)

Server.js file:

"use strict";

var StreamZip = require('node-stream-zip');

var args = process.argv.slice(2);

if (args.length !== 1) {
	console.log("you need to provide a zip file.");
}
else {
	console.log("Opening zip: " + args[0]);
	var zip = new StreamZip({
		file: args[0],
		storeEntries: true
	});
	zip.on('error', function(err) { console.log(err); });
	zip.on('ready', function() {
		console.log("Ready");
		console.log('Entries read: ' + zip.entriesCount);
		var data = zip.entryDataSync('properties.json');
	});
}

The zip file is big (~1.26 GB) and contains ~35k files (close to half of the 65k limit?).
I've posted the file on WeTransfer (link will be valid for one week).
The archive has been generated with zip-folder.js.

Thanks for your work: node-stream-zip is very useful to me!

Promise-style API needed

Recently I found this library for extracting a zip. However, there is no promise-style API.
Besides, when I use a toolkit such as es6-promisify with TypeScript to transform the API, it doesn't work very well, especially with on(event: 'ready', handler: () => void): void or on(event: 'entry', handler: (entry: ZipEntry) => void): void, because the promisify toolkit can't recognize the overloaded function signatures.
It doesn't matter very much, but I believe that if there were a promise-style API, the world would be a better place.
Thank you very much.
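Until an official promise API lands, a small hand-written wrapper over the 'ready'/'error' events avoids the promisify-overload problem entirely; a sketch (it works for any emitter exposing those two events, not just StreamZip):

```javascript
// Resolve with the zip instance once the central directory has been read,
// or reject if opening the archive fails.
function whenReady(zip) {
  return new Promise((resolve, reject) => {
    zip.on('ready', () => resolve(zip));
    zip.on('error', reject);
  });
}
```

Usage would then look like `const zip = new StreamZip({ file: 'archive.zip', storeEntries: true }); await whenReady(zip);`.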

Stream-based initialization desired

A sweet addition to this project would be allowing ZipStream to be initialized either by filename or via a NodeJS.ReadableStream.

I need to read zips present in our storage solution, but the latter hands out streams instead of filenames, because we cannot assume that the bits are accessible via a filename.

zip.extract() doesn't create directories, has many EBADF errors

This code works, using zip.stream()

'use strict';

var StreamZip = require('node-stream-zip');
var fs = require('fs');
var path = require('path');
var mkdirp = require('mkdirp');

var zip = new StreamZip({
  file: './website.zip'
, storeEntries: true
});

zip.on('error', function (err) { console.error('[ERROR]', err); });

zip.on('ready', function () {
  console.log('[Index Complete] Read', zip.entriesCount, 'entries');
});

zip.on('entry', function (entry) {

  // Security Check
  var pathname = path.resolve('./temp', entry.name);
  if (/\.\./.test(path.relative('./temp', pathname))) {
      console.warn("[zip warn]: ignoring maliciously crafted paths in zip file:", entry.name);
      return;
  }

  // Ignore directories
  if ('/' === entry.name[entry.name.length - 1]) {
    console.log('[DIR]', entry.name);
    return;
  }

  // Stream out files
  console.log('[FILE]', entry.name);

  zip.stream(entry.name, function (err, stream) {
    if (err) { console.error('Error:', err); return; }
    stream.on('error', function (err) { console.log('[ERROR]:'); console.log(err); });

    mkdirp(path.dirname(pathname), function (err) {
      if (err) { console.log('[Error]', err); return; }
      stream.pipe(fs.createWriteStream(pathname));
    });
  });

});

This code is broken, using zip.extract()

'use strict';

var StreamZip = require('node-stream-zip');
var path = require('path');
var mkdirp = require('mkdirp');

var zip = new StreamZip({
  file: './website.zip'
, storeEntries: true
});

zip.on('error', function (err) { console.error('[ERROR]', err); });

zip.on('ready', function () {
  console.log('[Index Complete] Read', zip.entriesCount, 'entries');
});

zip.on('extract', function (entry, file) {
  console.log('Extracted ' + entry.name + ' to ' + file);
});

zip.on('entry', function (entry) {

  // Security Check
  var pathname = path.resolve('./temp', entry.name);
  if (/\.\./.test(path.relative('./temp', pathname))) {
      console.warn("[zip warn]: ignoring maliciously crafted paths in zip file:", entry.name);
      return;
  }

  // Ignore directories
  if ('/' === entry.name[entry.name.length - 1]) {
    console.log('[DIR]', entry.name);
    return;
  }

  // extract file with full path to `./temp`
  mkdirp(path.dirname(pathname), function (err) {
    zip.extract(entry.name, pathname, function (err) {
      if (err) {
        console.error('[Extract Error]', entry.name, err);
      }
      else {
        console.log('[Entry] extracted', entry.name);
      }
      zip.close();
    });
  });

});

What I mean by "broken"

First of all, extract doesn't create the directories (which I would kind of expect), but it's an easy workaround to wrap it in mkdirp().

Next, there seems to be some sort of race condition, because it actually begins writing many files before it quits with a string error that has no stack trace.

Third, it seems that perhaps it can't overwrite existing files?

Final callback or event when all entries have been processed

Does the library provide a final event when all entries have been streamed to disk?

I'd imagine something like this:

var zip = new ZStream({
    file: "archive.zip",
    storeEntries: false
});

zip.on("entry", function (entry) {
    zip.stream(entry, function (err, readStream) {
        readStream.pipe(fs.createWriteStream("./output/" + entry.name));
    });
});

zip.on("complete", function () {
    // all entries have been processed/saved to disk
});
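The library does not document such an event, but one can be approximated by counting in-flight entry handlers and firing a callback once 'ready' has arrived and the count drains to zero. A sketch under that assumption (the helper name `onAllEntries` is ours; the handler must call its `next` continuation when it has finished with an entry):

```javascript
// Invoke `done` after the 'ready' event has fired and every entry handler
// has called its `next` continuation.
function onAllEntries(zip, handleEntry, done) {
  let pending = 0;
  let ready = false;
  const check = () => { if (ready && pending === 0) done(); };
  zip.on('entry', (entry) => {
    pending += 1;
    handleEntry(entry, () => { pending -= 1; check(); });
  });
  zip.on('ready', () => { ready = true; check(); });
}
```

This works because 'entry' events are announced while the index is being read, before 'ready' fires, so the counter is fully populated by the time the drain check can succeed.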

Strange stream error

Hey.

Here's the stack :

  • I've got a node.js script that unzips a file, and post content to an API (using node-stream-zip, and request.js)
  • I've got a node.js server that handles the multipart query (using multer)

This doesn't work when streaming directly from a RAM entry stream :

Promise.promisify(zip.stream)(syntheseEntry.name)
.then(function(fileStream) {
  return requestPromise({
    url: 'http://localhost:1337/file',
    method: 'POST',
    formData: {
      file: {
        value: fileStream,
        options: {
          filename: syntheseEntry.name
        }
      }
    }
  });
})

The error raised on my server is "Unexpected end of multipart data" like this one


But this does work when going through a temp file :

Promise.promisify(zip.stream)(syntheseEntry.name)
.then(function(zipFileStream) {
  return new Promise(function(resolve, reject) {
    let tmpFilePath = path.join(
      os.tmpdir(),
      new ObjectID().toString()
    );
    let tmpFileWriteStream = fs.createWriteStream(tmpFilePath);
    zipFileStream.pipe(tmpFileWriteStream);
    zipFileStream.on('end', function(err) {
      if(err) return reject(err);
      return resolve(tmpFilePath);
    });
  });
})
.then(function(tmpFilePath) {
  return fs.createReadStream(tmpFilePath);
})
.then(function(fileStream) {
  return requestPromise({
    url: 'http://localhost:1337/file',
    method: 'POST',
    formData: {
      file: {
        value: fileStream,
        options: {
          filename: syntheseEntry.name
        }
      }
    }
  });
})

The strange thing is that the yauzl library leads to exactly the same trouble, but I can make the request.js + multer stack work with any other kind of stream.

Closing Zip Stream before 'ready' causes errors

Closing a zip stream before all entries have been read causes an "EBADF: bad file descriptor" error, as the main zip file descriptor is closed.

This is an issue when pulling entries from a large file: you can finish your use of the zip file before the entries have finished reading. Closing it at that point should stop the reading of entries, but it does not.

I am working around it by using an error handler to swallow all errors before I close it.

Error when trying to unzip MIME type application/x-zip-compressed

When trying to unzip MIME type application/x-zip-compressed, I am getting the following Error

{ Error: invalid distance too far back
    at Zlib.zlibOnError [as onerror] (zlib.js:162:17) errno: -3, code: 'Z_DATA_ERROR' }

I have first tried with yauzl, but got Error: unsupported compression method: 9 which, from what I read here thejoshwolfe/yauzl#58, is meant to be deflate64. Looking for a package with deflate64 ability I tried your package, but ran into the Error above.

Does not extract all entries

Done in 6.641. Entries read: 225
Invalid local header

var zip = new StreamZip({
    file: zipFile
});
zip.on('error', function (err) { console.error('ERROR: ' + err); });
zip.on('ready', function () {
    console.log('Done in ' + process.uptime() + '. Entries read: ' + zip.entriesCount);
    zip.extract(null, pathWriter + '/', function (err, count) {
        console.log(err ? err : ('Extracted ' + count + ' entries'));
    });
});
zip.on('extract', function (entry, file) {
    console.log('extract', entry.name, file);
});

My program's task finished but the process hangs

I used why-is-node-running to check the problem, but I don't know how to solve it.
I'm on Mac.

There are 7 handle(s) keeping the process running

# SIGNALWRAP
/usr/local/lib/node_modules/why-is-node-running/include.js:3 - process.on('SIGUSR1', function() { why() })

# TTYWRAP
/Users/bung/js_works/great-voyage/node_modules/supports-color/index.js:129 - stdout: getSupportLevel(process.stdout),

# SIGNALWRAP
/Users/bung/js_works/great-voyage/node_modules/supports-color/index.js:129 - stdout: getSupportLevel(process.stdout),

# TTYWRAP
/Users/bung/js_works/great-voyage/node_modules/supports-color/index.js:130 - stderr: getSupportLevel(process.stderr)

# Timeout
/Users/bung/js_works/great-voyage/server/node_modules/tarn/lib/Pool.js:343 - this.interval = setInterval(() => this.check(), this.reapIntervalMillis);
/Users/bung/js_works/great-voyage/server/node_modules/tarn/lib/Pool.js:232 - this._startReaping();
/Users/bung/js_works/great-voyage/server/node_modules/tarn/lib/Pool.js:208 - this._doAcquire();
/Users/bung/js_works/great-voyage/server/node_modules/tarn/lib/Pool.js:264 - this._tryAcquireOrCreate();

# ZLIB
/Users/bung/js_works/great-voyage/server/node_modules/node-stream-zip/node_stream_zip.js:381 - entryStream = entryStream.pipe(zlib.createInflateRaw());
/Users/bung/js_works/great-voyage/server/node_modules/node-stream-zip/node_stream_zip.js:447 - callback(readEx, entry);
/Users/bung/js_works/great-voyage/server/node_modules/node-stream-zip/node_stream_zip.js:858 - return this.callback(err, this.bytesRead);

# ZLIB
/Users/bung/js_works/great-voyage/server/node_modules/node-stream-zip/node_stream_zip.js:410 - data = zlib.inflateRawSync(data);
/Users/bung/js_works/great-voyage/server/database.js:89                                      - let content = zip2.entryDataSync('ports.json').toString('utf8');
/Users/bung/js_works/great-voyage/server/node_modules/bluebird/js/release/util.js:16         - return target.apply(this, arguments);
/Users/bung/js_works/great-voyage/server/node_modules/bluebird/js/release/promise.js:547     - x = tryCatch(handler).call(receiver, value);
/Users/bung/js_works/great-voyage/server/node_modules/bluebird/js/release/promise.js:604     - this._settlePromiseFromHandler(handler, receiver, value, promise);
/Users/bung/js_works/great-voyage/server/node_modules/bluebird/js/release/promise.js:649     - this._settlePromise(promise, handler, receiver, value);
/Users/bung/js_works/great-voyage/server/node_modules/bluebird/js/release/promise.js:729     - this._settlePromise0(this._fulfillmentHandler0, value, bitField);
/Users/bung/js_works/great-voyage/server/node_modules/bluebird/js/release/async.js:93        - fn._settlePromises();
/Users/bung/js_works/great-voyage/server/node_modules/bluebird/js/release/async.js:86        - _drainQueueStep(queue);
/Users/bung/js_works/great-voyage/server/node_modules/bluebird/js/release/async.js:102       - _drainQueue(this._normalQueue);
/Users/bung/js_works/great-voyage/server/node_modules/bluebird/js/release/async.js:15        - self._drainQueues();

Cannot read more than 65535 entries

I have a zip archive which contains 66430 JPEG files. When I open the archive and fetch the entriesCount value, it tells me that there are only 65535 entries available.

Errors thrown in data handler events are swallowed

Consider the code:

const zip = new StreamZip({ file: 'foo.zip' });

zip.on('ready', () => {
  zip.on('entry', (entry) => {
    zip.stream(entry, (err, stream) => {
      stream.on('data', (data) => {
        if (isBad(data)) throw new Error('descriptive message!');
      });

      stream.pipe(transformStream);
    });
  });
});

I would expect the uncaught error here to be the "descriptive message" error. Instead, what I get is:

     Uncaught Error: no writecb in Transform class
      at afterTransform (_stream_transform.js:71:33)
      at TransformState.afterTransform (_stream_transform.js:54:12)
      at EntryVerifyStream._transform (node_modules/node-stream-zip/node_stream_zip.js:913:9)
      at EntryVerifyStream.Transform._read (_stream_transform.js:167:10)

Investigating the source, it appears you are wrapping a callback in a try/catch block. Since I am throwing an error later in the callback chain, that try/catch causes the callback to be called twice, leading to the "no writecb" error caused by a confused transform stream.

Extract a folder from archive to disk may not work

I use node-stream-zip as shown in the Usage section, but it does not work.

test()
async function test() {
    await zipTest({ originPath: "D:/mainTest/hello.zip", targetPath: "D:/mainTest/hello", ignorePaths: config.backupIgnore })
}
function zipTest({ originPath = "", targetPath = "" } = {}) {
    return new Promise((resolve, reject) => {
        const zip = new StreamZip({
            file: originPath, // need include .zip
            storeEntries: true
        });
        zip.on('ready', () => {
            if (!existsFile('extracted')) fs.mkdirSync('extracted');
            zip.extract("D:/mainTest/hello.zip",  './extracted', err => {
                console.log(err ? 'Extract error' : 'Extracted');
                if (err) reject()
                zip.close();
                resolve()
            });
        })
        zip.on('extract', (entry, file) => {
            console.log(`Extracted ${entry.name} to ${file}`);
        });
    })
}

using gz files

The problem is that I'm using .gz files and your software only works with zip files. Do you know of other libraries like yours, but for gzip files?

Regards

Unzip for some files fails with "Bad Archive" intermittently.

Below is my code to unzip the files. I am getting "Bad archive" errors for good files. If I retry, the unzip is successful. Can someone please help find the issue?

       const files = new Array<any>();
       return new Promise((resolve, reject) => {
           const zip = new streamZip({
               file: zipFilePath
           });
           zip.on("error", (err) => {
               this.logger.error("Unable to unzip the file " + zipFilePath, err);
               return reject(err);
           });
           zip.on("ready", () => {
               const startTime = process.uptime();
               zip.extract(null, tempFolder, (err, count) => {
                   if (err) {
                       this.logger.error("Unable to extract the file " + zipFilePath, err);
                       return reject(err);
                   }
                   this.logger.info(`Extracted ${count} entries`);
                   zip.close();
                   const timeTaken = process.uptime() - startTime;
                   this.logger.info(`Time taken to unzip the file ${timeTaken} Entries read: ${zip.entriesCount}`);
                   return resolve(files);
               });
           });

           zip.on("extract", (entry, file) => {
               this.logger.info(`Extracted File ${file}`, entry);
               files.push(file);
           });
       });
   }

Error is thrown when the zip has a lot of folder hierarchies in them

So here's what I did: I tried extracting a zip with just one folder inside it. The parent folder and the folder inside it both have images, and there wasn't any issue extracting it.

But when I try to extract a zip archive such as this: https://drive.google.com/file/d/1LdI3iy0Vp1fPPGdEgxW28za8-WV7J6Mo/view?usp=sharing

I get an error saying Error: ENOENT: no such file or directory, mkdir.

Now when I go back to the folder that I zipped and zip a sub-folder instead, like just the src/app folder rather than the whole project folder, which looks something like this:

[screenshot of the project folder structure]

I get the error. Also, it only creates empty folders and doesn't create sub-folders beyond a certain point.

None of the events trigger inside AWS Lambda

I'm attempting to download and unzip a file in AWS Lambda. My lambda function downloads a file from S3 and stores it in /tmp/, which is the allowed area.

I can verify this by reading that the file exists and matches the expected filesize.

After that I have the relevant unzip code; for now I'm just trying to list the entries. This code runs without problems on my local machine, but it silently fails in AWS Lambda. No errors are displayed, the program simply continues execution, suggesting the 'error' and 'ready' events are never fired.

Any ideas on how to at least get more info of what's happening?

  try {
    let stats = fs.statSync(FILE_PATH);
    let fileSizeInBytes = stats["size"];
    console.log("my file fileSize: ", fileSizeInBytes);

    const zip = new StreamZip({ file: FILE_PATH });
    zip.on("error", (err) => {
      console.error("unzipping error: ", err);
    });

    zip.on("ready", () => {
      console.log("Entries: ", zip.entries());
      for (const entry of Object.values(zip.entries())) {
        const desc = entry.isDirectory ? "directory" : `${entry.size} bytes`;
        console.log(`Entry ${entry.name}: ${desc}`);
      }
      zip.close();
    });
  } catch (error) {
    console.error("error ", error);
  }

[SECURITY] Is entry.name sanitized or raw?

Maliciously crafted zip files could contain illegal paths such as ../../../../etc/passwd or /etc/passwd or d://system32/win32.dll even though the relative path format such as etc/passwd is the legal, proper way to store a path.

Is entry.name sanitized?

Extraction fails on 6.5gb PKWARE (code 9) compressed file

OS: Windows 10 64bit
Node: v9.11.1 64bit
node-stream-zip: 1.6.0

The US Centers for Medicaid and Medicare publishes a zipped CSV flat file of current and past providers. Every month they publish the entire database, which is a 6.5 GB CSV uncompressed. We've used node-stream-zip to unpack it for several months, but this month's release failed with the following error:

Error: too many length or distance symbols
    at Zlib.zlibOnError [as onerror] (zlib.js:142:17)
Emitted 'error' event at:
    at EntryVerifyStream.onerror (_stream_readable.js:675:12)
    at EntryVerifyStream.emit (events.js:180:13)
    at InflateRaw.<anonymous> (S:\Development\projects\arbola\ccb\npi\node_modules\node-stream-zip\node_stream_zip.js:933:14)
    at InflateRaw.emit (events.js:185:15)
    at Zlib.zlibOnError [as onerror] (zlib.js:145:8)

The zip file is available here (it is public data). The file it is failing on is npidata_pfile_20050523-20180408.csv.

7-zip 18.01 and Unzip 6.0 both unpack it without issue.

I hope you have time to look at this. Thanks for a very useful library regardless.

Looking for typings

Does anyone have typings for this library? I would like to use it in TypeScript based project and if someone wrote the typing already, it will save me plenty of time.

Thanks

Cannot Extract Folders From Archive

Hello there!

I've recently started using this library to download and extract zip archives. However, one problem I came across: I'm encountering an "ENOENT: no such file or directory, mkdir [path]" error when extracting a specific archive. I can provide the specific zip file if needed, but it looks similar to this:

mcheli.zip

  • assets
    -- mcheli
  • mcheli
    -- multiple other directories

My current code looks like so: Gist. Anything I'm doing wrong?

entryDataSync throws string instead of error

Test

var data = zip.entryDataSync('DOES_NOT_EXIST');

Result

zip-test/node_modules/node-stream-zip/node_stream_zip.js:395
            throw err;
            ^
Entry not found

Why this is important

It's not possible to see the stack trace to determine where the error is occurring when a string is thrown instead of an error.

Why did you create this? (I'm trying to find the "best" unzip module)

ZIP parsing code has been partially forked from cthackers/adm-zip (MIT license).

I've been going through a bunch of different unzip libraries trying to find the "best" one and I see tons and tons of forks. Seems like there must be 1,000 zip libraries that just don't work well enough (jszip, yauzl, zip, adm-zip, unzipper, ...) because people just keep creating more.

I'm curious as to why adm-zip didn't work for you and what you see as the major advantages to this library so that I can make the best possible choice.

Obviously I have to run various tests to make sure the library I pick works for my use case, but I'd like to know your thoughts on why the others didn't suit your needs.

(as far as I can tell, this is the best - complete, working, minimal - of all the forks and variations)

How to serialize reading of entries? Purpose of zip.close()?

I suspected that zip.close() would be the cue to read the next file (so that it's not reading multiple files at once), but it seems that entry continues to fire, even while stream or extract is in progress, without waiting for zip.close().
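As far as the event flow goes, 'entry' events are announced while the archive index is being parsed, independent of any in-progress stream/extract calls, and close() just releases the file descriptor rather than pacing reads. To process entries strictly one at a time, a workable pattern is to buffer the entry records as they arrive and walk the list sequentially after 'ready'; a sketch, assuming `handler` returns a promise (the helper name is ours):

```javascript
// Collect entry records during parsing, then run `handler` for one entry
// at a time once the archive index is complete.
function forEachEntrySerial(zip, handler) {
  return new Promise((resolve, reject) => {
    const entries = [];
    zip.on('entry', (entry) => entries.push(entry));
    zip.on('error', reject);
    zip.on('ready', async () => {
      try {
        for (const entry of entries) {
          await handler(entry); // next entry starts only after this resolves
        }
        resolve(entries.length);
      } catch (err) {
        reject(err);
      }
    });
  });
}
```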

Buffer instead of file

Hello, would it be possible for the library to accept a buffer/readable stream instead of only a file name?
It happens that I get the user-sent file as a buffer already, and I don't really want to store it on the drive just to read it again.

Cannot extract large files (>4 GB)

Hello, there is an issue with extracting large files from ZIP archives. It occurs with any file, because the CRC validation function operates on 32-bit values, and that needs to change when extracting more than 4 GB of file data.

Here is my test code

var outputFile = fs.createWriteStream("extracted.bin");
var zip = new StreamZip({
    file: "test.zip",
    storeEntries: true
});
zip.on('ready', function() {
    zip.stream("large_file_10GB.iso", function(error, zstream) {
        zstream.pipe(outputFile);
        zstream.on('end', function() {
            console.log('entry end');
        });
    });
});

It fails after extracting 4 GB with an "Invalid CRC" exception. Moreover, there is an uncaught exception when the issue occurs:

events.js:74
        throw TypeError('Uncaught, unspecified "error" event.');
              ^
TypeError: Uncaught, unspecified "error" event.
    at TypeError (<anonymous>)
    at emit (events.js:74:15)
    at onerror (_stream_readable.js:536:12)
    at emit (events.js:95:17)
    at onwriteError (_stream_writable.js:238:10)
    at onwrite (_stream_writable.js:256:5)
    at WritableState.onwrite (_stream_writable.js:96:5)
    at afterTransform (_stream_transform.js:99:5)
    at TransformState.afterTransform (_stream_transform.js:74:12)
    at EntryVerifyStream._transform (C:\Soft\Study\node_modules\node-stream-zip\node_stream_zip.js:907:9)

Would it be possible to fix that?

Byte-array support

I see the rationale for not supporting streams in #8 - makes sense.

However, for cases where zip file bytes are already read completely into memory, it would be nice to support this case directly, instead of having to write out a tmp file first.

Perhaps a direct Buffer/Uint8Array parameter to the constructor config - or the ability to pass a file system implementation into the config (which would also facilitate unit testing).

Thanks!

Base64 Buffer

Is there any way to take a Base64 buffer and use it with this library? Basically, I'm uploading a file and want to be able to process it, present some information to the user, and then process even more based upon the user's selection.
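If the upload arrives base64-encoded, decoding it back to raw bytes is a one-liner, after which the bytes can be written to a temp file and opened through StreamZip's `file` option. A sketch with a cheap sanity check (the helper name is ours; the "PK" check only tests the local-file-header signature, not archive validity):

```javascript
// Decode a base64 upload into raw bytes and sanity-check that the result
// starts with the ZIP signature bytes "PK" (0x50 0x4B).
function base64ToZipBuffer(b64) {
  const buf = Buffer.from(b64, 'base64');
  if (buf.length < 4 || buf[0] !== 0x50 || buf[1] !== 0x4b) {
    throw new Error('Decoded data does not look like a zip archive');
  }
  return buf;
}
```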

Missing entry error emitter

When you try to stream a zip entry, there is no error event emitted from the entry if it fails.
Please add a ZipEntry.emit('error').

thanks

how to extract AES encrypted zip files?

how to extract AES encrypted zip files?

module.exports.error = {};

module.exports.error['enc_aes.zip'] = function(test) {
    test.expect(1);
    var zip = new StreamZip({ file: 'test/err/enc_aes.zip' });
    zip.on('ready', function() {
        zip.stream('README.md', function(err) {
            test.equal(err, 'Entry encrypted');
            test.done();
        });
    });
};

module.exports.error['enc_zipcrypto.zip'] = function(test) {
    test.expect(1);
    var zip = new StreamZip({ file: 'test/err/enc_zipcrypto.zip' });
    zip.on('ready', function() {
        zip.stream('README.md', function(err) {
            test.equal(err, 'Entry encrypted');
            test.done();
        });
    });
};
