anacronw / multer-s3
multer storage engine for amazon s3
License: MIT License
Hi,
Thanks for your code. Very inspiring for my project.
I would like to send quite big files to S3, so I am using multipart upload, but as you know the AWS S3 JS SDK does not support client-side encryption.
The use case is: a browser uploads a file to my server unencrypted, and as I receive the stream of this upload, I send it on to S3 with s3.upload (sample here: https://devcenter.heroku.com/articles/s3-upload-node). That way, I don't have to wait for the whole upload to finish before sending to S3, and I can apply client-side encryption on my own server.
Do you think your approach/code could work for this use case (multipart upload)? For instance, if I override the s3.upload method to add client-side encryption params.
I am worried that S3 will not be able to reassemble the file from all the different parts if each has been encrypted on its own.
I know it is not completely clear, but if someone understands my need, your helpful advice is welcome.
How to set path to upload to folder inside a given bucket?
Thank you.
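For what it's worth, S3 has no real directories; a slash inside the object key acts as a folder. A minimal sketch of a key function, with the folder name `avatars` as an assumption:

```javascript
// Hypothetical key builder: prefixing the key with "avatars/" places the
// object in that "folder" inside the bucket (S3 keys are flat; the slash
// is just part of the name).
function makeKey(file) {
  return 'avatars/' + Date.now() + '-' + file.originalname
}

// Plug it into multerS3's options:
//   key: function (req, file, cb) { cb(null, makeKey(file)) }

console.log(makeKey({ originalname: 'me.png' }))
```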
Just being curious: is it at all possible to hook into multer-s3 before the upload and access the locally cached file?
What I'd like to do:
I really appreciate your help getting a quick dive into multer and multer-s3!
Hi there,
The dependency file-type is set to ^3.3.0, but it is currently at 7.2.0.
I tested in my fork and it all works as expected.
Thanks
Is there support for deleting files?
If yes, can you please give an example?
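multer-s3 itself only handles uploads; deletion goes through the aws-sdk S3 client directly. A hedged sketch, with the bucket name and route wiring as assumptions:

```javascript
// Build the parameters for S3's deleteObject call; names are placeholders.
function deleteParams(bucket, key) {
  return { Bucket: bucket, Key: key }
}

// With a configured client (aws-sdk installed, credentials loaded):
//   var aws = require('aws-sdk')
//   var s3 = new aws.S3()
//   s3.deleteObject(deleteParams('some-bucket', req.file.key), function (err) {
//     if (err) return next(err)
//     res.send('Deleted')
//   })

console.log(deleteParams('some-bucket', 'photo.png'))
```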
Is there a way to prevent a broken file from being uploaded?
I see in my API that some files are only half uploaded; the other part of the image is just a gray block. Maybe some bytes are lost, I guess.
How can I avoid this behavior? In other cases we use an MD5 checksum, but when working with streams this may be a complicated thing.
Any ideas?
I need to be able to change the S3 credentials depending on the request. If you make the s3 option a function, then a new instance can be created and returned if required. Would that work?
I have the following code to save files to S3:
var express = require('express'),
    aws = require('aws-sdk'),
    bodyParser = require('body-parser'),
    multer = require('multer'),
    multerS3 = require('multer-s3');

aws.config.update({
  secretAccessKey: 'XXXXXXXXXX',
  accessKeyId: 'XXXXXXXXXX'
});

var app = express(),
    s3 = new aws.S3();

app.use(bodyParser.json());

var upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'XXXXXXX',
    key: function (req, file, cb) {
      console.log(file);
      cb(null, Date.now() + file.originalname);
    }
  })
});

app.post('/upload', upload.any(), function (req, res, next) {
  // Here I want to get the File Name
  res.send("Uploaded!");
});
I want to get the file name/key at the route level once the upload is successful. I will be saving the files with unique IDs, and once saved, I will store the key in a relational DB to keep track of the files. How can I get the file name [Date.now() + file.originalname]?
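As far as I know, multer-s3 attaches the upload result to each entry in req.files: key holds the generated name and location the full S3 URL. A small sketch of pulling the keys out, with the route wiring shown in comments:

```javascript
// Collect the generated S3 keys from multer-s3's file objects.
function extractKeys(files) {
  return files.map(function (f) { return f.key })
}

// In the route handler:
//   app.post('/upload', upload.any(), function (req, res) {
//     var keys = extractKeys(req.files) // save these to the relational DB
//     res.json({ keys: keys })
//   })

console.log(extractKeys([{ key: '1510-a.png' }, { key: '1511-b.png' }]))
```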
Where do I put my S3 access key and secret?
"accessKeyId": "xxxxxxxxxxxxxxxx",
"secretAccessKey": "xxxxxxxxxxxxxx",
I get this error when I try to send something to my S3 bucket:
AWS Missing credentials
Hey, I wanted to see if there is currently a way to specify in the options to process the image(s) through an image processor like Sharp, to reduce the file size or dimensions?
Why is the file not opening in the browser? It only downloads from Amazon.
It's just an idea, but what do you think about passing the buffer along to later middlewares?
Maybe something like req.file.buffer, like we have in MemoryStorage.
In my case this is useful, but I don't know whether it's worth keeping in the public API of multer-s3.
As far as I can tell there is no way to set a Cache-Control header from the multer-s3 parameters. I really don't want to make a separate request to S3 just to update this setting.
It should be pretty easy to bake this into multer-s3. Would you accept a PR for this?
I'm thinking an optional parameter, something like this:
var upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'some-bucket',
    cacheControl: 'max-age=31536000',
    acl: 'public-read',
    key: function (req, file, cb) {
      cb(null, Date.now().toString())
    }
  })
});
I have the following for uploading files to S3:
const upload = extension => multer({
  storage: multerS3({
    s3,
    bucket,
    acl,
    metadata(req, file, cb) {
      cb(null, { fieldName: file.fieldname });
    },
    ...
  }),
});

router.post(
  '/upload',
  upload('jpg').array('files', 5),
  (req, res) => {
    console.log(req.files);
    ...
Is it possible to run some kind of function for each single uploaded file (to write some information about the file into the DB before multer begins uploading the next file from array('files', 5))?
(For example, if writing the data into the DB fails, multer needs to abort uploading the rest of the files.)
Or do I need to upload each file separately?
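One possible approach, not a built-in feature: the key callback runs once per file before that file is sent, so the DB write can happen there, and passing an error to the callback aborts the remaining uploads. saveFileRecord below is a hypothetical DB call, stubbed for illustration:

```javascript
// Stub standing in for a real async DB insert.
function saveFileRecord(record, cb) {
  cb(null, 'id123') // pretend the DB returned this id
}

// Key callback that persists a row first; cb(err) makes multer report the
// error and stop processing the rest of the batch.
function keyWithDbWrite(req, file, cb) {
  saveFileRecord({ name: file.originalname }, function (err, id) {
    if (err) return cb(err)
    cb(null, id + '-' + file.originalname)
  })
}

keyWithDbWrite({}, { originalname: 'a.png' }, function (err, key) {
  console.log(err, key)
})
```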
How can I do multipart uploads?
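As far as I can tell, multer-s3 streams through the SDK's s3.upload(), which switches to multipart automatically for large bodies. When calling s3.upload() yourself, part size and concurrency are tunable; the option names below are from aws-sdk v2's ManagedUpload:

```javascript
// Multipart tuning options for s3.upload(params, options, callback):
var uploadOptions = {
  partSize: 10 * 1024 * 1024, // 10 MB per part (S3's minimum is 5 MB)
  queueSize: 4                // up to 4 parts uploaded concurrently
}

console.log(uploadOptions.partSize)
```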
I don't see an option to pass an ACL param when storing a file. Is this missing, or can it be done somehow?
There should be a way to disable file overwriting. Maybe something like:
var upload = multer({
  storage: s3({
    bucket: 'some-bucket',
    secretAccessKey: 'some secret',
    accessKeyId: 'some key',
    region: 'us-east-1',
    overwrite: false,
    key: function (req, file, cb) {
      cb(null, Date.now().toString())
    }
  })
})
Is there any way to cancel the file upload? Say if the request is unauthorized and we don't want the file from the request to be uploaded to S3.
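A common workaround is to run the authorization check before the multer middleware, so an unauthorized request is rejected before any bytes reach S3. isAuthorized below is a hypothetical check:

```javascript
// Hypothetical auth predicate; replace with a real token/session check.
function isAuthorized(req) {
  return Boolean(req.headers && req.headers.authorization)
}

// Express-style guard that short-circuits before multer ever runs.
function requireAuth(req, res, next) {
  if (!isAuthorized(req)) return res.status(401).send('Unauthorized')
  next()
}

// Usage: app.post('/upload', requireAuth, upload.single('photo'), handler)
```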
Hi, I am using your library with some success, but I was curious what your recommended way to add a file extension is. I tried the key attribute that is passed into the options, but it seems the docs are outdated. Let me know how you normally handle this.
Thanks,
Jordy
I need to resize the image, keeping the aspect ratio, before or after uploading it.
Better still if I can also generate a retina version of the image.
It would be ideal if users were able to pass in ACL modes during uploads :). Something like this:
var upload = that.s3.upload({
  Bucket: that.options.bucket,
  Key: filePath,
  ACL: 'public-read', // (passed via main func)
  ContentType: contentType,
  Body: (_stream || file.stream)
})
ref: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#listObjects-property
Something we could pass in via a function. I can submit a pull request with the change, if that's OK?
Cheers,
Jeremy
I'm trying to render an image via an S3 URL of the following form:
"'https://s3.amazonaws.com/bucket-name/' + req.file.key"
where req.file.key is my storage's dirname attribute; however, I keep getting the following error when I navigate to said URL: "This XML file does not appear to have any style information associated with it. The document tree is shown below." I'm not sure whether this is an issue with multer itself or with this module for storing to my S3 bucket. I do indeed have my storage ACL set to 'public-read'.
If it helps, logging 'req.file' to the console returns an object with all the proper metadata, where the size is "undefined." If you could let me know what you think, Duncan, that'd be great!
Duncan,
FYI - I noticed that npm is not installing the latest version of your package. The version numbers match but the index.js from npm does not match the index.js from this git repo. Using the version from this repo works great.
Steve
The version of this package published on npm is at version 2.7.0
https://www.npmjs.com/package/multer-s3
The repo has no 2.7.0 tag, and its head package.json is still at 2.5.0.
Perhaps the repo is just missing the 2.7.0 tag, and package.json needs to be bumped.
I think it would be a good improvement to drop the dependency on s3fs and instead use aws-sdk directly. This would save the extra dependencies and give us access to upstream updates quicker.
I would be happy to provide the code if no one beats me to it.
On a related note, it seems like uploading to S3 from multer is very popular. How would you feel about getting some help maintaining this module? I would personally be happy to hop on as a maintainer. Since it's such a small module, I think most of the work would be cutting new releases when dependencies (probably only aws-sdk) update, and answering questions from users.
I'm linusu on npm.
Hi! Thanks for this useful middleware.
I was looking for the possibility to specify a prefix in the endpoint, like:
app.post('/', upload.array('images', 5, 'user_images_prefix'), (req, res) => {})
When sending a payload with a file and additional text fields, I would expect req.body to contain these text fields, but it just returns an empty object. Is this intentional?
My use case is that I want to construct the file path based on these additional fields.
Hello,
I've implemented setup for multer-s3 exactly as you have demonstrated. It looks like file uploading to s3 is working! However, the file object passed to the metadata and key functions does not have a key or location, even though it is successfully uploading.
Ultimately, my goal is to upload the file, but also return the filepath as a json response back down to my client. How would you recommend doing this? Why does the file object seem to be empty, and what are the metadata and key functions even meant to do?
Thanks,
Adam
All files show up as application/octet-stream in S3.
This should be set to the actual content type of the file.
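multer-s3 exports an AUTO_CONTENT_TYPE constant for exactly this: passing it as the contentType option makes the engine detect the type instead of defaulting to application/octet-stream. A config sketch (assumes multerS3 and a configured s3 client are in scope):

```javascript
multerS3({
  s3: s3,
  bucket: 'some-bucket',
  contentType: multerS3.AUTO_CONTENT_TYPE, // sniff the real MIME type
  key: function (req, file, cb) { cb(null, Date.now().toString()) }
})
```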
Hey there, I was looking for something like this that would stream a file uploaded through an API request, and I wanted to validate the functionality of this helper package. Is the file being uploaded through the request stored completely in local memory before uploading to S3, or is it streamed to S3 as the API receives the file in parts? The reason I'm asking for an explicit answer is that I'm wondering whether I'll be able to use this method if the uploaded files are a bit larger than simple photos, such as small video clips. If the full video clip is stored before streaming to S3, then I fear that multiple concurrent upload requests would exhaust memory faster than if each request only held a small part of the file at a time.
var upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'bucket',
    metadata: function (req, file, cb) {
      cb(null, {
        fieldName: file.fieldname
      });
    },
    key: function (req, file, cb) {
      console.log('req.params.id', req.params.id); // not getting it
      console.log('req.body', req.body);
      // Not getting the param here that was passed to the API.
      // Need to save the file on S3 at a specific location, i.e. /foldername/filename,
      // but the folder name is not coming through from the API.
      cb(null, file.originalname)
    }
  })
}).array('userFile', 1);
Above is the multer-s3 code.
app.post('/saveData', function (req, res, next) {
  upload(req, res, function (err) {
    console.log('err: ' + err);
    var link = '';
    var status = err ? false : true;
    var result = {
      "status": status,
      "link": link
    };
    res.send(result); // send inside the callback, once the upload has finished
  });
});
The above code is where the multer upload function is called. I am sending data to the API (from Angular 2, with Content-Type set to "multipart/form-data") as FormData:
let formData: FormData = new FormData();
formData.append('userFile', file);
formData.append('fileName', fileName);
I need the req.body data from the API, such as the folder name, so I can put the file in a specific place on S3. I need that req.body data inside multerS3's key: function (req, file, cb) { ... }.
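One likely cause: multer parses the multipart stream in order, so text fields appended after the file are not in req.body yet when the key callback fires. Appending them before the file usually fixes this. A minimal sketch (Node >= 18 also provides a global FormData, which is what makes this runnable outside the browser):

```javascript
// Append text fields first so multer has parsed them before the file part.
const formData = new FormData()
formData.append('folder', 'invoices')                    // text field first
formData.append('userFile', new Blob(['...']), 'a.png')  // file last
```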
I used it this way.
var multer = require('multer');
var s3 = require('multer-s3');

var upload = multer({
  storage: s3({
    dirname: 'uploads/photos',
    bucket: 'testing-bucket-yousa',
    secretAccessKey: 'abcd',
    accessKeyId: 'zz',
    region: 'us-east-1'
  })
})
app.post('/admin/uploadImagetoS3', upload.array('file'), function (req, res, next) {
  console.log(res)
})
What does upload.array('file') do?
Please clarify why dirname is required; I don't want to store the data on my local machine. Also, how do I capture the response from S3, i.e. whether it was successful or not?
As per the given example, there is no way to provide the bucket name dynamically.
The scenario is that a user would send the bucket name in the request, and that name would be passed as the bucket option.
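multer-s3 accepts a function for the bucket option with the same (req, file, cb) shape as key, so the name can come from the request. It's safer to validate it against a whitelist rather than trusting raw input; allowedBuckets and the req.body.bucket field below are assumptions:

```javascript
var allowedBuckets = ['web-test-2016', 'web-prod-2016'] // assumed whitelist

// Same callback signature multer-s3 uses for the `key` option.
function pickBucket(req, file, cb) {
  var wanted = req.body && req.body.bucket
  if (allowedBuckets.indexOf(wanted) === -1) {
    return cb(new Error('bucket not allowed'))
  }
  cb(null, wanted)
}

// Usage: storage: multerS3({ s3: s3, bucket: pickBucket, key: keyFn })
```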
var upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'web-test-2016',
    acl: 'public-read',
    key: function (req, file, cb) {
      cb(null, makeid() + '_' + file.originalname)
      // Date.now().toString()
    }
  })
})
In this case, how can I set the destination folder name? Thanks.
Is there a way to reference the link on the req object after the upload to S3 completes?
I am not able to upload large files using multer-s3. It is not giving me any error either; it just doesn't upload the file, doesn't even enter the callback, and times out. Is there any way to handle uploading large files to an S3 bucket?
I am using it like this:
var uploadSingle = upload.single('uploadFile');

router.post('/uploadVideo', function (req, res, next) {
  uploadSingle(req, res, function (err) {
    // doesn't come here if the file is large
    if (err) {
      // error response: error while uploading module PDF
    } else {
      // handle file upload
      // success response
    }
  });
});
It doesn't enter the callback of uploadSingle.
As aws-sdk auto-loads configuration and credentials from the ~/.aws/ directory, and the recommended way of loading credentials is to let aws-sdk do so, I suggest we drop the requirements for region, secretAccessKey and accessKeyId.
The dirname option also should not be required, as it is not mandatory to upload the file to a directory (though it is useful in most cases).
What do you think?
if (!opts.secretAccessKey) throw new Error('secretAccessKey is required')
if (!opts.accessKeyId) throw new Error('accessKeyId is required')
if (!opts.region) throw new Error('region is required')
if (!opts.dirname) throw new Error('dirname is required')
Hello!
Thank you for this great module!
However, what would be the minimal AWS policy needed for file uploads using this module?
Thanks!
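A minimal policy sketch: uploads need s3:PutObject on the bucket's objects, plus s3:PutObjectAcl if you pass an acl option. The bucket ARN is a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::YOUR-BUCKET/*"
    }
  ]
}
```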
Hi, I was wondering whether it was possible to perform some image preprocessing with lwip (https://github.com/EyalAr/lwip#usage) prior to upload?
Hi,
The sample code is not working at all. I've swapped in my credentials and am using an ACL of public-read. Please check the issue.
var upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'mybucket',
    ACL: 'public-read',
    metadata: function (req, file, cb) {
      cb(null, {
        fieldName: file.fieldname
      });
    },
    limits: {
      fileSize: 10 * 1024 * 1024
    },
    key: function (req, file, cb) {
      cb(null, Date.now().toString())
    }
  })
}).single('photo');

app.post('/upload', function (req, res, next) {
  upload(req, res, function (err) {
    res.send('Successfully uploaded files!')
  });
})
Files are being uploaded successfully to S3 but I am unable to show any progress bar events in browser.
What must I do to get progress events ?
Is there a delete object method implemented?
Without acl: 'public-read', files get uploaded, but when I click the file link it says access denied.
So I added acl: 'public-read'. Now it gives the error AccessDenied: Access Denied and the files don't get uploaded at all.
aws.config.update({
  secretAccessKey: 'key',
  accessKeyId: 'secret',
});

var s3 = new aws.S3()

var upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'pdsfiles',
    metadata: function (req, file, cb) {
      cb(null, { fieldName: file.fieldname });
    },
    key: function (req, file, cb) {
      cb(null, Date.now().toString())
    }
  })
})
This is what the error looks like:
TypeError: this.s3.upload is not a function
    at S3Storage.&lt;anonymous&gt; (/home/simran/Downloads/NODE-BACKEND/node_modules/multer-s3/index.js:172:26)
    at /home/simran/Downloads/NODE-BACKEND/node_modules/multer-s3/index.js:58:10
    at S3Storage.getContentType (/home/simran/Downloads/NODE-BACKEND/node_modules/multer-s3/index.js:8:5)
    at /home/simran/Downloads/NODE-BACKEND/node_modules/multer-s3/index.js:55:13
    at end (/home/simran/Downloads/NODE-BACKEND/node_modules/run-parallel/index.js:16:15)
    at _combinedTickCallback (internal/process/next_tick.js:73:7)
    at process._tickDomainCallback (internal/process/next_tick.js:128:9)
Call:
router.put('/updatePatternGrading/:pattern_number', upload.any(), function (req, res) {
  console.log("request obj after manipulation", req.files);
  callAPI(req, res, fn.bind(apiObj, 'updatePatternGrading'));
})
Why do I get this?
How do I save the uploaded file with an extension like *.jpg or *.png?
Thank you.
Minio.io is a replacement server for S3. I tried to use a Minio server as a replacement for S3 and I get an error. More info is in the issue. Please have a look.
Thank you.
Hi, is it possible to create a new directory in the bucket before uploading, via dirname?
Thanks.
Missing credentials in config
Error: connect ENETUNREACH 169.254.169.254:80
    at Object.exports._errnoException (util.js:870:11)
    at exports._exceptionWithHostPort (util.js:893:20)
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1061:14)
If I npm install multer-s3, it downloads v1.4.1. However, the documentation is for versions of multer-s3 ahead of v1.4.1, where dirname is no longer a required option.
I suggest that while multer-s3@1.4.1 is the default on npm, we bring back all the documentation needed for using that version of the module.
It took me a while to figure out what was going on.
Hi,
Is it possible to set a size limit on uploads?
Thanks
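Yes, via multer's own limits option, which sits next to storage rather than inside multerS3(); an oversize file surfaces as a LIMIT_FILE_SIZE error in the route callback. A sketch of the option object (the 5 MB cap is an arbitrary example value):

```javascript
// multer option object; fileSize is in bytes.
var multerOptions = {
  limits: { fileSize: 5 * 1024 * 1024 } // larger uploads are rejected
}

// Usage: var upload = multer({ storage: s3Storage, limits: multerOptions.limits })

console.log(multerOptions.limits.fileSize)
```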