rafe / papercut
node module to resize and crop images
I'm trying to set the extension to 'pdf'. It works fine when I set it globally with papercut.set, but not on a specific schema version:
schema.version({name: 'original', extension: 'pdf'})
I also tried using 'format' in place of 'extension'.
For every file uploaded to S3, the value of the x-amz-acl header is set to 'public-read'. Is there a way to upload files with custom permissions, such as 'authenticated-read'?
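For reference, this is not papercut's API: with the plain AWS SDK for Node, the ACL is a per-object upload parameter, so a configurable option could map straight onto it. The bucket name, key, and buffer below are illustrative only.

```javascript
// Sketch only: how a per-upload ACL looks in the AWS SDK's putObject params.
// A papercut option for this would need to feed the ACL field here.
const imageBuffer = Buffer.from([]); // placeholder for the processed image

const params = {
  Bucket: 'mw-development',          // example bucket from the issue above
  Key: 'impact-avatar-icon.jpg',     // example key
  Body: imageBuffer,
  ContentType: 'image/jpeg',
  ACL: 'authenticated-read'          // instead of the hard-coded 'public-read'
};
// s3.putObject(params, callback) needs real AWS credentials, so it is not run here.
console.log(params.ACL);
```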
I am iterating over a set of images with async.parallel, and the hash of image URLs passed to the process callback contains URLs from many different calls to process:
Schema.process name, path, (err, images)->
console.log "processPhoto images: ", _.prettyPrint(images)
->
2013-11-14T18:03:27.503Z - debug: processPhoto images: {
"avatar-icon": "https://s3.amazonaws.com/mw-development/impact-avatar-icon.jpg",
"avatar-column": "https://s3.amazonaws.com/mw-development/impact-avatar-column.jpg",
"avatar-card": "https://s3.amazonaws.com/mw-development/team-avatar-card.jpg",
"large": "https://s3.amazonaws.com/mw-development/city-large.jpg"
}
In this example I would expect the URLs to all contain 'impact', not 'team' or 'city'.
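One hypothesis consistent with that output is a single result object shared across the parallel calls. A minimal standalone sketch of that failure mode (plain Node, not papercut's actual code):

```javascript
// Suspected failure mode: all calls write into one shared object, so each
// callback receives URLs accumulated from every call, not just its own.
const shared = {}; // one object reused across calls — this is the bug

function processVersion(name, size, callback) {
  shared[size] = 'https://s3.amazonaws.com/mw-development/' + name + '-' + size + '.jpg';
  callback(null, shared); // every caller gets the same object back
}

let first, second;
processVersion('impact', 'avatar-icon', (err, images) => { first = images; });
processVersion('team', 'avatar-card', (err, images) => { second = images; });

console.log(first === second);    // true — same object in both callbacks
console.log(Object.keys(second)); // entries from BOTH calls, mixed together
```

If this is what is happening, the fix is to allocate a fresh result object per process call so parallel invocations cannot see each other's versions.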
It would be useful to be able to set custom headers on uploaded S3 files. For example, I'd like to set Cache-Control: max-age=<...>. (For the S3 store.)
This is problematic because the correct content-type for JPG files is "image/jpeg".
Maybe another way of detecting/guessing/setting the content-type should be found?
How about adding the ability to configure a particular bucket for a given schema as opposed to bucket per papercut instance?
E.g.:
uploader.remove('image1', function(err){
  // error handling
})
Awesome project!
Please can you change the name, as "PaperCut" is a registered trademark of @PaperCutSoftware.
Many thanks
It would be useful to be able to compress and optimize the image by setting a quality percentage, like this:
https://github.com/rsms/node-imagemagick#resizeoptions-callbackerr-stdout-stderr
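The linked node-imagemagick resize call takes an options object, so a quality knob could be passed through roughly like this (a sketch based on that README; the file names are made up, and it is shown as a plain object so it runs without ImageMagick installed):

```javascript
// Options for node-imagemagick's im.resize(options, callback); per the
// linked README, quality is a 0..1 value alongside width/height.
const resizeOptions = {
  srcPath: 'photo.jpg',        // hypothetical input file
  dstPath: 'photo-small.jpg',  // hypothetical output file
  width: 256,
  quality: 0.8                 // roughly 80% JPEG quality
};
// With the module installed: im.resize(resizeOptions, function (err) { ... });
console.log(Math.round(resizeOptions.quality * 100) + '% quality requested');
```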
I am getting a timeout error for large images (more than 5 MB). Can we set a timeout for this, or is there another solution?
If you hit such an error, make sure you have the ImageMagick CLI tools installed, as node-imagemagick requires them. I only realized this after looking through the dependencies.
Getting the following error. Not sure why. ImageMagick doesn't complain that it can't find the image. Using local storage, not S3.
events.js:72
throw er; // Unhandled 'error' event
^
Error: spawn ENOENT
at errnoException (child_process.js:1001:11)
at Process.ChildProcess._handle.onexit (child_process.js:792:34)