Ketan Bhatt


Stream Files to Amazon S3

April 20, 2015 programming

For any SaaS platform, it is common to use a third-party hosting service to store uploaded files and serve them through a CDN. Amazon S3 is a common choice.

Usually, the file upload from the client side (say, AngularJS) is sent to the server (a Node server running on a GNU/Linux box), which then forwards it to Amazon S3. This approach is inefficient because the file needs to be opened and "read" by the server before being forwarded to S3. The solution provided here:

  1. Extracts all the metadata (file name, size, MIME type)
  2. Opens a read stream on the uploaded file
  3. And writes the file directly to the Amazon S3 bucket.
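The metadata from step 1 can also be used to build the destination key on S3. A minimal sketch of such a helper (the `s3KeyFor` name and the `prefix` argument are hypothetical, not part of the route code below):

```javascript
// Hypothetical helper: derive an S3 object key from the metadata multer
// attaches to each upload (originalname, size, mimetype, ...).
// A timestamp prefix keeps two uploads of the same file from colliding.
function s3KeyFor(file, prefix) {
  // Replace characters that are awkward in S3 keys with underscores
  var safeName = file.originalname.replace(/[^A-Za-z0-9._-]/g, '_');
  return prefix + '/' + Date.now() + '-' + safeName;
}

var key = s3KeyFor({ originalname: 'ice boxes.jpg' }, 'uploads');
console.log(key); // e.g. uploads/1429500000000-ice_boxes.jpg
```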


var express = require('express');
var router = express.Router();
var multer = require('multer'), // for handling multipart/form-data
    fs = require('fs'),
    S3FS = require('s3fs'), // abstraction over Amazon S3's SDK
    s3fsImpl = new S3FS('your-bucket-here', {
        accessKeyId: 'Your-IAM-Access',
        secretAccessKey: 'Your-IAM-Secret'
    });

// POST a new file
router.post('/', [ multer(), function (req, res) {
    var file = req.files.file;
    /* Output:
    { fieldname: 'file',
      originalname: 'ice-boxes.jpg',
      name: '2658a8f666e33ab1ec39dc8b7b20970b.jpg',
      encoding: '7bit',
      mimetype: 'image/jpeg',
      path: 'public/uploads/2658a8f666e33ab1ec39dc8b7b20970b.jpg',
      extension: 'jpg',
      size: 88076,
      truncated: false,
      buffer: null }
    */

    // Create a file stream
    var stream = fs.createReadStream(file.path);
    // writeFile calls putObject behind the scenes
    s3fsImpl.writeFile(file.originalname, stream).then(function () {
        // Remove the temporary file multer saved to disk
        fs.unlink(file.path, function (err) {
            if (err) {
                console.error(err);
            }
        });
        res.sendStatus(200);
    });
}]);

module.exports = router;

I struggled with this for quite some time when I needed to implement the feature in one of my projects. I hope it helps others in the future.


  1. Multer
  2. S3FS
