apiaryio / s3-streaming-upload

s3-streaming-upload is a Node.js library that listens to your stream and uploads its data to Amazon S3 using the ManagedUpload API.

License: MIT License


s3-streaming-upload's Introduction

s3-streaming-upload

s3-streaming-upload is a Node.js library that listens to your stream and uploads its data to Amazon S3 and OCI Object Storage.

It is heavily inspired by knox-mpu, but unlike it, it does not buffer data to disk and is built on top of the official AWS SDK instead of knox.

Changes

  • Version 0.3.2: Node.js 12+ supported.

  • Version 0.3.x: changed from CoffeeScript to JavaScript. Node.js 6 and 8 supported.

  • Version 0.2.x: uses the ManagedUpload API. Node.js 0.10 and 0.12 supported.

  • Version 0.1.x: uses the MultiPartUpload API. Node.js 0.8 and 0.10 supported.

Installation

Installation is done via npm by running npm install s3-streaming-upload

Features

  • Super easy to use
  • No need to know data size beforehand
  • Stream is buffered up to a specified size (default 5 MB) and then uploaded to S3
  • Segments are not written to disk and memory is freed as soon as possible after upload
  • Uploading is asynchronous
  • You can react to upload status through events

Quick example

var Uploader = require('s3-streaming-upload').Uploader,
  upload = null,
  stream = require('fs').createReadStream('/etc/resolv.conf');

upload = new Uploader({
  // credentials to access AWS
  accessKey: process.env.AWS_S3_ACCESS_KEY,
  secretKey: process.env.AWS_S3_SECRET_KEY,
  bucket: process.env.AWS_S3_TEST_BUCKET,
  objectName: 'myUploadedFile',
  stream: stream,
  debug: true,
});

upload.send(function(err) {
  if (err) {
    console.error('Upload error: ' + err);
  }
});
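
Instead of (or in addition to) the send callback, you can react to upload status through events. A minimal sketch, assuming the 'completed' and 'failed' event names that appear in the issue reports further down this page:

upload.on('completed', function () {
  // the object has been fully uploaded
  console.log('upload completed');
});

upload.on('failed', function (err) {
  // the upload could not be completed
  console.error('upload failed with error', err);
});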

Setting up ACL

Pass the ACL in objectParams to the Uploader:

upload = new Uploader({
  // credentials to access AWS
  accessKey: process.env.AWS_API_KEY,
  secretKey: process.env.AWS_SECRET,
  bucket: process.env.AWS_S3_TRAFFIC_BACKUP_BUCKET,
  objectName: 'myUploadedFile',
  stream: stream,
  objectParams: {
    ACL: 'public-read',
  },
});

Example usage with Oracle Cloud (OCI) compatible S3 API

var aws = require('aws-sdk'); // the AWS SDK used to build the custom service

var region = process.env.OCI_REGION;
var tenancy = process.env.OCI_TENANCY;
// define custom service pointing at the OCI S3-compatible endpoint
var service = new aws.S3({
  apiVersion: '2006-03-01',
  credentials: {
    accessKeyId: process.env.BUCKET_ACCESS_KEY,
    secretAccessKey: process.env.BUCKET_SECRET_KEY,
  },
  params: { Bucket: process.env.BUCKET_NAME },
  endpoint: `${tenancy}.compat.objectstorage.${region}.oraclecloud.com`,
  region: region,
  signatureVersion: 'v4',
  s3ForcePathStyle: true,
});

// filename and source (a readable stream) are supplied by the calling code
var uploader = new Uploader({
  accessKey: process.env.BUCKET_ACCESS_KEY,
  secretKey: process.env.BUCKET_SECRET_KEY,
  bucket: process.env.BUCKET_NAME,
  objectName: filename,
  stream: source,
  service: service,
  objectParams: {
    ContentType: 'text/csv',
  },
  debug: true,
});

s3-streaming-upload's People

Contributors

abtris, almad, chrisronline, dependabot[bot], engstrom, honzajavorek, kuba-kubula, modax, nathanpeck, opichals


s3-streaming-upload's Issues

Question: Can this be run on an EC2 instance in the context of an IAM role?

Hi,

Looking at this code https://github.com/apiaryio/s3-streaming-upload/blob/master/src/uploader.coffee#L10, it appears that providing credentials or an STS session token is mandatory. This rules out running this code on an EC2 instance in the context of an IAM role.

http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/node-configuring.html#Credentials_from_IAM_Roles_for_EC2_Instances

Am I correct? I'm pondering a pull request, but hoping you could confirm my suspicions.

Thanks
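
One possible workaround, assuming the service option accepts a pre-configured client as in the OCI example in the README above (whether the accessKey/secretKey check is skipped in that case would need to be verified against uploader.coffee): build the aws.S3 client yourself and let the SDK's default credential provider chain, which includes EC2 instance-profile credentials, supply the keys.

var aws = require('aws-sdk');
var Uploader = require('s3-streaming-upload').Uploader;

// No explicit keys: the SDK falls back to its default credential provider chain,
// which on EC2 resolves the instance's IAM role credentials.
var service = new aws.S3({
  params: { Bucket: process.env.AWS_S3_TEST_BUCKET },
});

var upload = new Uploader({
  bucket: process.env.AWS_S3_TEST_BUCKET,
  objectName: 'myUploadedFile',
  stream: require('fs').createReadStream('/etc/resolv.conf'),
  service: service,
});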

Multi-part upload fails on the last part in version 0.1.8

In version 0.1.6 everything works; in version 0.1.8 the following error is raised on the last chunk:

Error: Not all parts uploaded. Uploaded: {"1":"\"60c62f1a7c704ba23118d5912e92204a\"","2":"\"37da78092f5cf712552981c1770d7919\"","3":"\"3717a85b3935f4b001d4e429146a69be\"","4":"\"d8b10331e730ba8e9de0659ef63efcde\"","5":"\"bfee39990a518e1572ce97f6194ea90d\"","6":"\"4e116d7e5d22339340b4f1893ff4affc\"","7":"\"4c9459b1c3cad397860bab44bfbd829d\""}, Reported by listParts as uploaded: []

The multi-part transaction stays open.

Cannot call method 'createMultipartUpload' of undefined

Every time that I try to use s3-streaming-upload I end up with an error, even if I follow the examples.

I'm probably doing something wrong, but could I get some input from you?

Installation step (it actually ends up installing version 0.1.13):

npm install s3-uploading-stream

My code; I tried to reduce it to the simplest possible case:

var Uploader = require('s3-streaming-upload').Uploader,
    fs = require('fs');

var uploader = new Uploader({
    accessKey:  process.env.AWS_API_KEY,
    secretKey:  process.env.AWS_SECRET,
    bucket:     process.env.AWS_S3_TRAFFIC_BACKUP_BUCKET,
    objectName: 'somerandomfilename.jpg',
    stream:     fs.createReadStream('./somerandonimage.jpg')
});

uploader
    .on('completed', function () { console.log('completed', arguments); })
    .on('failed', function (err) { console.error(err); });

And the error returned:

/private/tmp/node_modules/s3-streaming-upload/lib/uploader.js:66
      return this.getNewClient().createMultipartUpload(this.objectParams, func
                                 ^
TypeError: Cannot call method 'createMultipartUpload' of undefined
    at Uploader.initiateTransfer (/private/tmp/node_modules/s3-streaming-upload/lib/uploader.js:66:34)
    at ReadStream.<anonymous> (/private/tmp/node_modules/s3-streaming-upload/lib/uploader.js:99:15)
    at ReadStream.emit (events.js:117:20)
    at _stream_readable.js:929:16
    at process._tickDomainCallback (node.js:463:13)

Trying to upload a PDF file of more than 6 MB

Hello

I am trying to upload a PDF file that is about 6 MB in size. It reports that the upload was successful, but when I go to S3 the size is 5 MB and the file does not open.

Here is my code

var Uploader = require('s3-streaming-upload').Uploader,
  upload = null;

// this function will upload the file to S3
exports.uploadFile = function (fileReadStream, filename, awsHeader, cb) {
  // set options for the streaming module
  upload = new Uploader({
    // credentials to access AWS (config is the reporter's own configuration object)
    accessKey: config.aws.accessKey,
    secretKey: config.aws.secretKey,
    bucket: 'bibliohivepdf',
    objectName: 'pdf1/' + filename,
    stream: fileReadStream,
    objectParams: {
      ACL: 'public-read'
    },
    debug: true
  });

  upload.send(function (err) {
    if (err) {
      console.log('Error: ' + err);
      console.error('Upload error: ' + err);
    } else {
      console.log('Successful');
    }
  });
};

Node Engine Expected version ">12". Got "12.19.0"

In my package.json I have the engine configured as 12.19.0. Whenever I try to install this package, this error appears.
Please ensure that the package works on all versions of Node!

error [email protected]: The engine "node" is incompatible with this module. Expected version ">12". Got "12.19.0"
error Found incompatible module.
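
For context, the semver range >12 matches only versions strictly greater than 12 (i.e. 13.x and up), which is why 12.19.0 is rejected. An engines field like the following, shown purely as an illustration rather than the package's actual declaration, would also accept 12.x releases:

{
  "engines": {
    "node": ">=12"
  }
}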

Confusing names of variables

  accessKey: process.env.AWS_S3_ACCESS_KEY,
  secretKey: process.env.AWS_S3_SECRET_KEY,
  bucket: process.env.AWS_S3_TEST_BUCKET,

These should be OCI keys when setting the tool up to work with OCI Object Storage; the naming of the environment variables is confusing in that situation.

Setting ACL

Is there any way to easily set the ACL to public-read?
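
Yes; as shown in the "Setting up ACL" section of the README above, the ACL can be passed through objectParams. A minimal sketch:

var upload = new Uploader({
  accessKey: process.env.AWS_S3_ACCESS_KEY,
  secretKey: process.env.AWS_S3_SECRET_KEY,
  bucket: process.env.AWS_S3_TEST_BUCKET,
  objectName: 'myUploadedFile',
  stream: stream,
  objectParams: {
    ACL: 'public-read', // canned ACL forwarded to S3 for the uploaded object
  },
});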

EventEmitter memory leak

Version: 0.1.14

I followed the example provided, but I intentionally entered incorrect S3 credentials to test the error handling. The upload 'failed' callback is invoked every 5 seconds, over and over.

upload.on('failed', function (err) {
  console.log('upload failed with error', err);
});

Then eventually Node produces the following error and the 'failed' callback continues firing.

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Uploader.EventEmitter.addListener (events.js:160:15)
at Uploader.EventEmitter.once (events.js:185:8)
at Uploader.uploadChunks (D:\test\node_modules\s3-streaming-upload\lib\uploader.js:139:12)
at null.<anonymous> (D:\test\node_modules\s3-streaming-upload\lib\uploader.js:54:20)
at wrapper [as _onTimeout]
at Timer.listOnTimeout [as ontimeout]

One way I can stop it is to call upload.finishUploads();
Am I missing something?
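
A small sketch of the workaround mentioned above, assuming finishUploads() stops the retry loop as described in this report:

upload.on('failed', function (err) {
  console.log('upload failed with error', err);
  // stop the uploader from retrying (and re-registering listeners) every 5 seconds
  upload.finishUploads();
});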

Cannot initiate transfer

So I went to S3 ~> Bucket ~> Permissions.

Added "Authenticated Users" and checked "List", "Upload/Delete", "View Permissions", "Edit Permissions".

After that I tried to upload a file using the following code:

s = require './src/settings'

Uploader = require('s3-streaming-upload').Uploader

file   = __dirname + '/audio/recording.mp3'
stream = require('fs').createReadStream file

config = 
  accessKey : s.s3.key
  secretKey : s.s3.secret
  bucket    : s.s3.bucket
  objectName: "test"
  stream    : stream


upload = new Uploader config

upload.on 'completed', ( error, res ) ->
  console.log('upload completed');

upload.on 'failed', ( error ) ->
  console.log('upload failed with error' )

  if error then console.error error

where

  • s.s3.key = my user name from my credentials
  • s.s3.secret = my user credentials access key
  • s.s3.bucket = my bucket name

I'm sure the mp3 file exists.

Still it fails with [Error: Cannot initiate transfer]

Any ideas? Thank you
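
One thing worth double-checking (an assumption, not a confirmed fix): accessKey is expected to be an AWS access key ID and secretKey its matching secret access key, not an account or IAM user name. For example, with hypothetical environment variable names:

var upload = new Uploader({
  accessKey: process.env.AWS_ACCESS_KEY_ID,     // an access key ID such as 'AKIA...', not a user name
  secretKey: process.env.AWS_SECRET_ACCESS_KEY, // the matching secret access key
  bucket: process.env.AWS_S3_TEST_BUCKET,
  objectName: 'test',
  stream: stream,
});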

Getting 'This callback was already called' message

Doing a 495 MB stream upload; it was going great and got to "Uploading 94", then the uploading events paused for a few minutes. However, the network showed it was still uploading at 2 Mb/s, so all appeared well. Then I started getting "This callback was already called, WTF; chunk <Buffer ...>" messages. Even with these messages, the data continues to upload. Thoughts?

v4 auth mechanism

The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256

This happens when using the Frankfurt DPC; the Frankfurt DPC does not support v2 auth.
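
One possible approach, not verified against the library: construct the aws.S3 client yourself with signatureVersion: 'v4' and hand it to the Uploader through the service option, as in the OCI example in the README above.

var aws = require('aws-sdk');

// force AWS4-HMAC-SHA256 (SigV4) signing, which eu-central-1 (Frankfurt) requires
var service = new aws.S3({
  region: 'eu-central-1',
  signatureVersion: 'v4',
  params: { Bucket: process.env.AWS_S3_TEST_BUCKET },
});

var upload = new Uploader({
  accessKey: process.env.AWS_S3_ACCESS_KEY,
  secretKey: process.env.AWS_S3_SECRET_KEY,
  bucket: process.env.AWS_S3_TEST_BUCKET,
  objectName: 'myUploadedFile',
  stream: stream,
  service: service,
});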

Creating an Uploader w/o a stream fails

I was expecting to be able to create an Uploader and subsequently .pipe() my source stream to it, i.e.:

fs.createReadStream("/tmp/foo").pipe(new Uploader({
  bucket: myBucket,
  objectName: "foo"
}));

This is the relevant stack trace:

/Users/seth/src/stamen/vapor-clock/node_modules/s3-streaming-upload/node_modules/readable-stream/lib/_stream_readable.js:822
  stream.on('end', function() {
         ^
TypeError: Cannot call method 'on' of undefined
    at Readable.wrap (/Users/seth/src/stamen/vapor-clock/node_modules/s3-streaming-upload/node_modules/readable-stream/lib/_stream_readable.js:822:10)
    at Uploader.handleStream (/Users/seth/src/stamen/vapor-clock/node_modules/s3-streaming-upload/lib/uploader.js:86:14)
    at new Uploader (/Users/seth/src/stamen/vapor-clock/node_modules/s3-streaming-upload/lib/uploader.js:54:10)

Keys have 0 size

I uploaded a bunch of data, but the size is always 0 bytes after the "completed" event.

folder support w/objectName

I'm trying to organize files under folders, but there doesn't seem to be a way to make this work. When I specify a folder path as objectName, the slashes get escaped.

Any suggestions?

Not all parts uploaded

Hey, I've been getting this error for several different upload attempts. Any ideas what could cause it?

Not all parts uploaded. Uploaded: {"1":"\"6a470e80da16391768669a553898bae3\"","2":"\"d65a44fc9bec6c5ef5c73e9d179221d9\""}, Reported by listParts as uploaded: [] after 5 attempts
upload failed with error { [InvalidPart: One or more of the specified parts could not be found. The part may not have been uploaded, or the specified entity tag may not match the part's entity tag.]
message: 'One or more of the specified parts could not be found. The part may not have been uploaded, or the specified entity tag may not match the part's entity tag.',
code: 'InvalidPart',
time: Mon Apr 07 2014 09:32:24 GMT-0300 (ADT),
statusCode: 400,
retryable: false,
_willRetry: false }
Complete!

Changes to objectParams can leak out of Uploader

Supplying the same objectParams to two separate instances of Uploader can cause other options to leak between uploaders.

params =
  ContentType: 'text/csv'

uploader1 = new Uploader objectParams: params, bucket: 'bucket-1', # ...
uploader2 = new Uploader objectParams: params, bucket: 'bucket-2', # ...

console.log uploader1.objectParams.Bucket # bucket-1
console.log uploader2.objectParams.Bucket # bucket-1

This can certainly be solved in userland, but copying objectParams (rather than holding a reference) may save some head-scratching when multiple uploaders coexist in the same process.
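
A userland sketch of that copy in plain JavaScript, giving each Uploader its own params object:

var params = { ContentType: 'text/csv' };

// shallow-copy the shared params so one uploader's mutations (e.g. setting Bucket)
// do not show up on the other
var uploader1 = new Uploader({ objectParams: Object.assign({}, params), bucket: 'bucket-1' });
var uploader2 = new Uploader({ objectParams: Object.assign({}, params), bucket: 'bucket-2' });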
