How to compress images in an S3 bucket?

I have a backend in Node.js and I'm confused about how to compress images and then store them in my S3 bucket. Moreover, I also want to decompress those images whenever the client fetches them. Any help will be appreciated, thank you!
9 Replies
Joao · 8mo ago
Most image formats are already compressed, so you may end up with even larger files after trying to compress them again. You could reduce file size by lowering the quality setting when the compression is applied, i.e. when the image file is generated, or by reducing the dimensions of the image. There's a nifty CLI tool called ImageMagick that you can use for that purpose. I'm sure there are packages to integrate it with Node, but I know there's one that runs with WebAssembly in the browser directly: https://github.com/KnicKnic/WASM-ImageMagick Actually, you can use ImageMagick to lower the quality of existing images as well, so that may be worth exploring.
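For example, something along these lines with the ImageMagick CLI (file names and the quality value are just placeholders; on ImageMagick 6 the command is `convert` instead of `magick`):

```sh
# Re-encode an existing JPEG at a lower quality
magick input.jpg -quality 75 output.jpg

# Shrinking the dimensions usually saves even more space
magick input.jpg -resize 1280x -quality 75 output.jpg
```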
nick11 · 8mo ago
Thanks for replying @Joao. Let me tell you the flow: when the front end gets any images, it sends them to the server, and on the server side (i.e. in Node.js) we upload them to our AWS S3 bucket. Here I don't want to upload the images to S3 directly, but compress them before uploading. I also need the uncompressed (good quality) images whenever the client wants them.
Joao · 8mo ago
In that case I suggest using sharp: https://github.com/lovell/sharp
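A rough sketch of what that could look like on your upload path, assuming the AWS SDK v3; the bucket name, region, key and quality value are placeholders, not a prescription:

```js
// Sketch: compress an incoming image with sharp, then upload the result to S3.
import sharp from "sharp";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

export async function compressAndUpload(originalBuffer, key) {
  // Re-encode as WebP at ~80% quality; this is where the size savings come from.
  const compressed = await sharp(originalBuffer)
    .webp({ quality: 80 })
    .toBuffer();

  await s3.send(
    new PutObjectCommand({
      Bucket: "my-product-images", // placeholder bucket name
      Key: key,
      Body: compressed,
      ContentType: "image/webp",
    })
  );

  return compressed.length; // compressed size in bytes, handy for logging
}
```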
nick11 · 8mo ago
Can I decompress the images back to (almost) their original quality so that I can serve them to the client? I have seen people using sharp, but the tutorials only show how to compress, not how to decompress back to the original.
Joao · 8mo ago
You could create several copies of the image and deliver the most appropriate one to the client. Actually, you probably want to store the largest image possible and then create smaller images on the fly as the client requests them. But as far as compression goes, images (JPG, PNG, WebP, etc.) are already in compressed formats. PNG uses lossless compression, so you can actually apply further compression to it; WebP can use both. Either way, once you apply lossy compression you lose information, so you won't be able to revert that process. So it depends on what image format you're expecting to deal with.
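If you go the several-copies route, a sketch like this could derive the variants from a single stored master image (the widths and quality here are arbitrary examples):

```js
// Sketch: derive smaller variants from one "master" image buffer using sharp.
import sharp from "sharp";

export async function makeVariants(masterBuffer) {
  const widths = [1600, 800, 400]; // example breakpoints
  const variants = {};

  for (const width of widths) {
    // withoutEnlargement keeps sharp from upscaling images smaller than `width`.
    variants[width] = await sharp(masterBuffer)
      .resize({ width, withoutEnlargement: true })
      .webp({ quality: 80 })
      .toBuffer();
  }

  return variants; // e.g. upload each under keys like `${id}-${width}.webp`
}
```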
nick11 · 8mo ago
Actually, I am building an e-commerce app and want to store sellers' product images, sellers' documents, and also other images. My concern is: should I store them as-is in my S3 bucket, or should I apply further compression? But I also want to fetch a good-quality image (is there such a thing as decompression, is what I want to ask). If I only store images as they are, wouldn't that increase my storage costs?
Joao · 8mo ago
The biggest contributors to image file size are the dimensions and the format. Using a format like WebP already goes a long way, and it is widely supported. As for the size, you want responsive design in your site anyway, which means you need to store the image at the largest size it will ever be viewed at. If that's something like 960x540, for example, then that's what you store, and from there you derive different sizes as needed, based on media queries and whatnot. So you could first try to understand how the user will interact with the website and figure out the optimal dimensions for your images. But yeah, as far as storage is concerned, images are expensive.
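One way to derive those sizes as needed is an endpoint that resizes the stored master on request. Purely a sketch: Express, the route shape, the `w` query parameter and the bucket name are assumptions, and in practice you'd cache the results or put a CDN in front:

```js
// Sketch: fetch the master image from S3 and resize it per request.
import express from "express";
import sharp from "sharp";
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

const app = express();
const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

app.get("/images/:key", async (req, res) => {
  // Requested width, capped so clients can't ask for huge renders.
  const width = Math.min(parseInt(req.query.w, 10) || 960, 1920);

  const obj = await s3.send(
    new GetObjectCommand({ Bucket: "my-product-images", Key: req.params.key })
  );
  // Recent AWS SDK v3 versions expose transformToByteArray on the body stream.
  const master = Buffer.from(await obj.Body.transformToByteArray());

  const resized = await sharp(master)
    .resize({ width, withoutEnlargement: true })
    .webp({ quality: 80 })
    .toBuffer();

  res.type("image/webp").send(resized);
});

app.listen(3000);
```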
nick11 · 8mo ago
Okay thanks @Joao
randomwebbits · 8mo ago
Before exploring CLI methods, I would use https://squoosh.app/ to find the settings I like, then run a bulk compression method tailored to those settings. ++ Props to Squoosh for a great front-end!