Integrating Node.js with AWS S3: A Step-by-Step Guide

Ankit Gupta
3 min read · Jan 19, 2025


AWS S3 (Simple Storage Service) is a robust and scalable object storage solution offered by Amazon Web Services. It allows you to store and retrieve files efficiently, making it a preferred choice for developers managing files in their applications. In this tutorial, we’ll walk through the process of integrating Node.js with AWS S3 to perform operations like file upload, download, and listing.

Why Integrate AWS S3 with Node.js?

  1. Scalable Storage: AWS S3 is built for handling massive amounts of data.
  2. Secure File Management: You can define granular access controls for buckets and files.
  3. Global Accessibility: Files can be accessed publicly or via secure endpoints globally.
  4. Programmatic Control: Node.js offers flexibility to perform various S3 operations programmatically, such as uploading, downloading, and listing files.

Step 1: Setting Up AWS S3

1. Create an S3 Bucket

  1. Log in to the AWS Management Console.
  2. Navigate to S3 and click on Create Bucket.
  3. Provide a unique bucket name and select a region.
  4. Configure permissions and enable features like versioning and logging (optional).
  5. Click Create Bucket.

2. Configure IAM Permissions

  1. Go to the IAM Console and create a new user.
  2. Create an access key for the user (for programmatic access).
  3. Attach the policy AmazonS3FullAccess, or create a custom policy scoped to specific buckets and actions.
  4. Note the Access Key ID and Secret Access Key; you will need them in the next step.
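If you go the custom-policy route, a minimal policy scoped to a single bucket might look like the following sketch (replace your-bucket-name; the exact actions you need depend on which endpoints you build):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN itself, while s3:GetObject and s3:PutObject apply to the /* object ARN, which is why both resources are listed.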

Step 2: Setting Up the Node.js Project

1. Initialize the Project

Create a new Node.js project:

mkdir s3-integration
cd s3-integration
npm init -y

2. Install Required Packages

Install the necessary dependencies:

npm install aws-sdk multer multer-s3@2 dotenv
  • aws-sdk: AWS SDK for JavaScript v2, used to interact with AWS services. (v2 is now in maintenance mode; AWS recommends the modular v3 SDK, @aws-sdk/client-s3, for new projects, but v2 keeps this tutorial simple.)
  • multer: Middleware for handling multipart/form-data file uploads.
  • multer-s3: Multer storage engine for S3. Version 2.x works with the aws-sdk v2 used here; multer-s3 v3 requires the v3 SDK.
  • dotenv: Loads environment variables from a .env file.

3. Configure Environment Variables

Create a .env file for secure credential management (and add it to .gitignore so credentials are never committed):

AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_REGION=your-region
S3_BUCKET_NAME=your-bucket-name

Step 3: Writing the Node.js Code

1. Initialize AWS SDK

Set up the AWS SDK in your index.js file:

require('dotenv').config();
const AWS = require('aws-sdk');
const express = require('express');
const multer = require('multer');
const multerS3 = require('multer-s3');

const app = express();

AWS.config.update({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: process.env.AWS_REGION,
});

const s3 = new AWS.S3();
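If a credential variable is missing, S3 calls fail later with unhelpful errors, so it can pay to fail fast at startup. A minimal sketch (requireEnv is our own helper, not part of the SDK):

```javascript
// Hypothetical helper: throw immediately if any required env var is unset,
// and return the resolved values in order.
function requireEnv(names) {
  const missing = names.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
  return names.map((name) => process.env[name]);
}

// Example: call before AWS.config.update()
// requireEnv(['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'AWS_REGION', 'S3_BUCKET_NAME']);
```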

2. Configure Multer for File Uploads

Define a Multer instance with S3 as the storage engine:

const upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: process.env.S3_BUCKET_NAME,
    // Makes uploaded files publicly readable. Note: buckets created since
    // April 2023 block ACLs by default; omit this line (or enable ACLs on the
    // bucket) if uploads fail with an AccessControlListNotSupported error.
    acl: 'public-read',
    key: function (req, file, cb) {
      cb(null, Date.now().toString() + '-' + file.originalname); // Generate unique file names
    },
  }),
});

3. Define API Endpoints

Upload File

app.post('/upload', upload.single('file'), (req, res) => {
  res.status(200).json({
    message: 'File uploaded successfully',
    fileUrl: req.file.location,
  });
});

List Files

app.get('/list', async (req, res) => {
  try {
    const data = await s3.listObjectsV2({ Bucket: process.env.S3_BUCKET_NAME }).promise();
    res.status(200).json(data.Contents);
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});
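The raw Contents array includes fields such as ETag and StorageClass that clients rarely need. A small formatter, assuming you only want the key, size, and modification time (formatListing is our own name):

```javascript
// Map S3 listObjectsV2 Contents entries to a slimmer client-facing shape.
function formatListing(contents) {
  return (contents || []).map((obj) => ({
    key: obj.Key,
    size: obj.Size,
    lastModified: obj.LastModified,
  }));
}

// In the /list handler: res.status(200).json(formatListing(data.Contents));
```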

Download File

app.get('/download/:key', (req, res) => {
  const params = {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: req.params.key,
  };

  s3.getObject(params, (err, data) => {
    if (err) {
      res.status(500).json({ error: err.message });
    } else {
      res.attachment(req.params.key);
      res.send(data.Body);
    }
  });
});

4. Start the Server

Run the server:

const PORT = process.env.PORT || 3000;

app.listen(PORT, () => {
  console.log(`Server is running on port ${PORT}`);
});

Step 4: Testing the Integration

1. Run the Server

node index.js

2. Test Endpoints

Use Postman or similar tools to test the endpoints:

  • Upload Files: Send a POST request to /upload with a file.
  • List Files: Send a GET request to /list.
  • Download Files: Send a GET request to /download/:key with the file key.

Step 5: Best Practices

  1. Secure Credentials: Use .env files or AWS Secrets Manager for managing credentials.
  2. Restrict Access: Limit IAM permissions to specific actions and buckets.
  3. Enable Bucket Policies: Define rules to control access and ensure compliance.
  4. Optimize Costs: Monitor S3 storage usage to avoid unnecessary costs.
  5. Logging and Monitoring: Enable logging for troubleshooting and tracking activities.
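As a concrete example of cost optimization, an S3 lifecycle rule can transition old objects to a cheaper storage class automatically. A sketch of a lifecycle configuration (the rule ID and 90-day threshold are illustrative):

```json
{
  "Rules": [
    {
      "ID": "archive-old-uploads",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 90, "StorageClass": "GLACIER" }
      ]
    }
  ]
}
```

This can be applied from the S3 console under Management → Lifecycle rules, or via the AWS CLI.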

Advanced Use Cases

  • Multipart Uploads: For large files, S3 multipart uploads split an object into parts that upload in parallel and can be retried individually if they fail.
  • Server-Side Rendering (SSR): S3 can host the static assets (bundles, images) that an SSR application references, offloading them from the application server.
  • Data Archiving and Backups: Combined with lifecycle rules and storage classes like S3 Glacier, S3 works well as a durable archive and backup target.
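To make the multipart idea concrete: S3 requires each part except the last to be at least 5 MB. The chunking step can be sketched as below (splitIntoParts is an illustrative helper; a real upload would then call s3.createMultipartUpload, s3.uploadPart per chunk, and s3.completeMultipartUpload):

```javascript
// Split a buffer into fixed-size parts for a multipart upload.
// S3 requires partSize >= 5 MB for every part except the last.
function splitIntoParts(buffer, partSize) {
  const parts = [];
  for (let offset = 0; offset < buffer.length; offset += partSize) {
    parts.push(buffer.slice(offset, offset + partSize));
  }
  return parts;
}
```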

Integrating AWS S3 with Node.js enables efficient file management for modern applications. By following the steps outlined above, you can seamlessly upload, retrieve, and manage files on S3 from your Node.js application.



Written by Ankit Gupta

A Software Developer with extensive experience in building and scaling web and AI-based applications. Passionate about LLMs and AI agents.
