Implementing Rate Limiting in Your Node.js API
This tutorial builds on our authenticated API by adding rate limiting to prevent abuse and ensure fair usage. We'll implement multiple rate limiting strategies and cover both memory-based and Redis-based solutions.
Prerequisites
- Completed Parts 1-3 of the tutorial
- Redis installed (optional, for Redis-based rate limiting)
- Basic understanding of API security concepts
Project Setup
Install the required dependencies:
npm install express-rate-limit rate-limit-redis ioredis
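If you don't have Redis installed locally but have Docker available, one quick way to get an instance running on the default port is:
docker run --name redis -p 6379:6379 -d redis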
Updated Project Structure
Add the new files below to your project (rateLimiter.js under config/ and middleware/, plus utils/redis.js; the other files are carried over from earlier parts):
books-api/
├── src/
│   ├── config/
│   │   ├── database.js
│   │   ├── auth.js
│   │   └── rateLimiter.js
│   ├── middleware/
│   │   ├── errorHandler.js
│   │   ├── auth.js
│   │   ├── roleCheck.js
│   │   └── rateLimiter.js
│   ├── utils/
│   │   └── redis.js
│   └── server.js
Environment Setup
Update your .env file:
MONGODB_URI=mongodb://localhost:27017/books_api
REDIS_URI=redis://localhost:6379
NODE_ENV=development
PORT=3000
JWT_SECRET=your_jwt_secret_key_here
JWT_EXPIRE=24h
RATE_LIMIT_WINDOW_MS=900000
RATE_LIMIT_MAX=100
Redis Configuration
Create src/utils/redis.js:
const Redis = require('ioredis');

const redisClient = new Redis(process.env.REDIS_URI, {
  maxRetriesPerRequest: 3,
  enableOfflineQueue: false
});

redisClient.on('error', (err) => {
  console.error('Redis Error:', err);
});

redisClient.on('connect', () => {
  console.log('Redis connected successfully');
});

module.exports = redisClient;
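Before wiring this client into the limiter, you can sanity-check the connection with a quick one-off script. This is a minimal sketch (the file name and location are up to you); PING/PONG is standard Redis behaviour exposed by ioredis:
require('dotenv').config();
const redisClient = require('./src/utils/redis');

// Wait for the connection to be ready (the offline queue is disabled in redis.js,
// so commands sent before that point would fail immediately)
redisClient.once('ready', async () => {
  const reply = await redisClient.ping();
  console.log('Redis replied:', reply); // expect "PONG"
  process.exit(0);
});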
Rate Limiter Configuration
Create src/config/rateLimiter.js:
const rateLimit = require('express-rate-limit');
// rate-limit-redis v4+ exposes RedisStore as a named export;
// on older versions of the package it is the default export instead
const { RedisStore } = require('rate-limit-redis');
const redisClient = require('../utils/redis');

// Basic rate limiter options
const basicLimiterOptions = {
  windowMs: parseInt(process.env.RATE_LIMIT_WINDOW_MS) || 15 * 60 * 1000, // 15 minutes
  max: parseInt(process.env.RATE_LIMIT_MAX) || 100, // Limit each IP to 100 requests per windowMs
  message: {
    error: 'Too many requests, please try again later.'
  },
  standardHeaders: true, // Return rate limit info in the `RateLimit-*` headers
  legacyHeaders: false // Disable the `X-RateLimit-*` headers
};

// Redis store configuration
const redisStoreLimiterOptions = {
  ...basicLimiterOptions,
  store: new RedisStore({
    sendCommand: (...args) => redisClient.call(...args)
  })
};

// Authentication endpoints limiter (more strict)
const authLimiterOptions = {
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 5, // Limit each IP to 5 requests per windowMs
  message: {
    error: 'Too many login attempts, please try again later.'
  }
};

// Dynamic rate limiter based on user role
const createDynamicLimiter = (options = {}) => {
  const windowMs = options.windowMs || basicLimiterOptions.windowMs;
  return rateLimit({
    ...basicLimiterOptions,
    ...options,
    keyGenerator: (req) => {
      // Use user ID if authenticated, otherwise use IP
      return req.user ? req.user.id : req.ip;
    },
    handler: (req, res) => {
      res.status(429).json({
        error: 'Too many requests',
        retryAfter: Math.ceil(windowMs / 1000)
      });
    }
  });
};

module.exports = {
  basicLimiter: rateLimit(basicLimiterOptions),
  redisLimiter: rateLimit(redisStoreLimiterOptions),
  authLimiter: rateLimit(authLimiterOptions),
  createDynamicLimiter
};
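In the server file below, basicLimiter is applied globally. If you prefer to use the Redis-backed limiter only when a REDIS_URI is actually configured (so local development still works without Redis), a small sketch of that selection in server.js could look like this; the fallback logic is an illustration, not part of the tutorial code:
const { basicLimiter, redisLimiter } = require('./config/rateLimiter');

// Prefer the shared Redis store when it's configured; fall back to the
// in-memory store, which is fine for a single local process
const globalLimiter = process.env.REDIS_URI ? redisLimiter : basicLimiter;
app.use(globalLimiter);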
Rate Limiter Middleware
Create src/middleware/rateLimiter.js:
const {
  basicLimiter,
  redisLimiter,
  authLimiter,
  createDynamicLimiter
} = require('../config/rateLimiter');

// Role-based rate limits
const roleLimits = {
  admin: {
    windowMs: 15 * 60 * 1000, // 15 minutes
    max: 1000
  },
  editor: {
    windowMs: 15 * 60 * 1000,
    max: 500
  },
  user: {
    windowMs: 15 * 60 * 1000,
    max: 100
  }
};

// Pre-create one limiter per role so each keeps its own store across requests
// (building a new limiter inside the middleware would reset the count on every call)
const roleLimiters = {};
for (const [role, limits] of Object.entries(roleLimits)) {
  roleLimiters[role] = createDynamicLimiter({
    windowMs: limits.windowMs,
    max: limits.max
  });
}

// Dynamic rate limiter based on user role
const dynamicRateLimiter = (req, res, next) => {
  const userRole = req.user ? req.user.role : 'user';
  const limiter = roleLimiters[userRole] || roleLimiters.user;
  return limiter(req, res, next);
};

// Sliding window rate limiter
const slidingWindowLimiter = (windowMs, max) => {
  const requests = new Map();

  return (req, res, next) => {
    const key = req.ip;
    const now = Date.now();
    const windowStart = now - windowMs;

    // Get existing requests for this IP
    let userRequests = requests.get(key) || [];

    // Filter out requests outside the current window
    userRequests = userRequests.filter(time => time > windowStart);

    if (userRequests.length >= max) {
      return res.status(429).json({
        error: 'Too many requests',
        retryAfter: Math.ceil((userRequests[0] - windowStart) / 1000)
      });
    }

    // Add the current request
    userRequests.push(now);
    requests.set(key, userRequests);

    next();
  };
};

module.exports = {
  basicRateLimiter: basicLimiter,
  redisRateLimiter: redisLimiter,
  authRateLimiter: authLimiter,
  dynamicRateLimiter,
  slidingWindowLimiter
};
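slidingWindowLimiter is a factory, so you pass the window and the maximum when mounting it. For example, to cap a hypothetical search route at 30 requests per IP per minute (searchBooks stands in for whatever handler you already have):
const { slidingWindowLimiter } = require('../middleware/rateLimiter');

// At most 30 requests per IP per rolling 60-second window
router.get('/search', slidingWindowLimiter(60 * 1000, 30), searchBooks);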
Updated Server File
Update src/server.js:
require('dotenv').config();
const express = require('express');
const bodyParser = require('body-parser');
const connectDB = require('./config/database');
const errorHandler = require('./middleware/errorHandler');
const {
  basicRateLimiter,
  authRateLimiter
} = require('./middleware/rateLimiter');
const booksRouter = require('./routes/books');
const authRouter = require('./routes/auth');
const app = express();
const PORT = process.env.PORT || 3000;
// Connect to MongoDB
connectDB();
// Middleware
app.use(bodyParser.json());
// Apply basic rate limiter to all requests
app.use(basicRateLimiter);
// Routes with specific rate limits
app.use('/api/auth', authRateLimiter, authRouter);
app.use('/api/books', booksRouter);
// Error handling middleware
app.use(errorHandler);
// Start server
app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}`);
});
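One note if you deploy behind a reverse proxy or load balancer (nginx, Heroku, and similar): req.ip will then be the proxy's address, so every client would share one rate limit bucket. Telling Express to trust the proxy restores the real client IP for the IP-keyed limiters:
// Place this right after creating the app, before any rate limiters
app.set('trust proxy', 1); // trust the first proxy hop so req.ip is the client address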
Updated Routes with Rate Limiting
Update your route files to include role-based rate limiting. Here's an example for books:
const express = require('express');
const router = express.Router();
const { dynamicRateLimiter } = require('../middleware/rateLimiter');
const protect = require('../middleware/auth');
// Apply dynamic rate limiting to protected routes
router.use(protect, dynamicRateLimiter);
// Your existing routes...
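You can also attach limiters per route instead of with router.use, for example a stricter budget for writes than for reads. A sketch, where getBooks and createBook stand in for the controller functions from the earlier parts:
const { createDynamicLimiter } = require('../config/rateLimiter');

// Define the write limiter once at module load so its counts persist
const writeLimiter = createDynamicLimiter({ windowMs: 15 * 60 * 1000, max: 50 });

router.get('/', getBooks);                  // covered by the router-level limiter
router.post('/', writeLimiter, createBook); // extra, stricter limit for writes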
Implementing Different Rate Limiting Strategies
1. Token Bucket Algorithm
Each client gets a bucket that holds up to capacity tokens and refills at a steady rate; every request consumes a token and is rejected once the bucket is empty. This allows short bursts while still capping the average request rate.
class TokenBucket {
  constructor(capacity, fillPerSecond) {
    this.capacity = capacity;
    this.fillPerSecond = fillPerSecond;
    this.tokens = capacity;
    this.lastFill = Date.now();
  }

  consume(tokens = 1) {
    // Add tokens based on time passed
    const now = Date.now();
    const timePassed = (now - this.lastFill) / 1000;
    this.tokens = Math.min(
      this.capacity,
      this.tokens + timePassed * this.fillPerSecond
    );
    this.lastFill = now;

    if (this.tokens < tokens) {
      return false;
    }

    this.tokens -= tokens;
    return true;
  }
}

// Middleware implementation
const tokenBucketMiddleware = (capacity, fillPerSecond) => {
  const buckets = new Map();

  return (req, res, next) => {
    const key = req.ip;

    if (!buckets.has(key)) {
      buckets.set(key, new TokenBucket(capacity, fillPerSecond));
    }

    const bucket = buckets.get(key);

    if (!bucket.consume()) {
      return res.status(429).json({
        error: 'Rate limit exceeded'
      });
    }

    next();
  };
};
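Mounting it looks like the other limiters. For example, a bucket of 20 tokens refilling at 0.5 tokens per second allows a burst of 20 requests and then roughly one request every two seconds per IP (the numbers are only illustrative):
// Burst capacity of 20 requests, refilled at 0.5 tokens/second per IP
app.use('/api/books', tokenBucketMiddleware(20, 0.5));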
2. Fixed Window Counter
Requests are counted against fixed, aligned time windows (for example, each 15-minute block); once the counter reaches the maximum, further requests are rejected until the next window begins.
const fixedWindowLimiter = (windowMs, max) => {
  const requests = new Map();

  return (req, res, next) => {
    const key = req.ip;
    const now = Date.now();
    const windowStart = Math.floor(now / windowMs) * windowMs;

    const currentRequests = requests.get(key) || { count: 0, start: windowStart };

    if (currentRequests.start < windowStart) {
      currentRequests.count = 0;
      currentRequests.start = windowStart;
    }

    if (currentRequests.count >= max) {
      return res.status(429).json({
        error: 'Too many requests',
        retryAfter: Math.ceil((windowStart + windowMs - now) / 1000)
      });
    }

    currentRequests.count++;
    requests.set(key, currentRequests);

    next();
  };
};
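Usage mirrors the sliding window version. Keep in mind that a fixed window can admit up to twice the limit around a boundary (the tail of one window plus the start of the next), which is the trade-off for its very cheap bookkeeping:
// Up to 100 requests per IP in each aligned 15-minute window
app.use(fixedWindowLimiter(15 * 60 * 1000, 100));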
Rate Limiting Headers
Add custom headers to track rate limit status:
const addRateLimitHeaders = (req, res, next) => {
  // express-rate-limit populates req.rateLimit before this middleware runs,
  // so set the headers immediately -- waiting for the 'finish' event would be
  // too late, because by then the response headers have already been sent.
  const limit = req.rateLimit;
  if (limit) {
    res.setHeader('X-RateLimit-Limit', limit.limit);
    res.setHeader('X-RateLimit-Remaining', limit.remaining);
    if (limit.resetTime) {
      res.setHeader('X-RateLimit-Reset', Math.ceil(limit.resetTime.getTime() / 1000));
    }
  }
  next();
};
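Since req.rateLimit is only set once a limiter has run, mount this middleware after the limiter it reports on:
app.use(basicRateLimiter);    // populates req.rateLimit
app.use(addRateLimitHeaders); // then mirrors it into X-RateLimit-* headers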
Monitoring and Logging
Add rate limit monitoring (this uses winston, so install it with npm install winston if you haven't already):
const winston = require('winston');

const rateLimitLogger = winston.createLogger({
  level: 'info',
  format: winston.format.json(),
  transports: [
    new winston.transports.File({ filename: 'rate-limits.log' })
  ]
});

const monitorRateLimits = (req, res, next) => {
  res.on('finish', () => {
    if (res.statusCode === 429) {
      rateLimitLogger.warn({
        ip: req.ip,
        endpoint: req.originalUrl,
        userAgent: req.get('user-agent'),
        timestamp: new Date().toISOString()
      });
    }
  });
  next();
};
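Mount the monitor before the limiters so it observes every response, including the 429s they produce:
app.use(monitorRateLimits);
app.use(basicRateLimiter);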
Testing Rate Limiting
Test your rate limits using these curl commands:
# Test basic rate limiting
for i in {1..150}; do
  curl http://localhost:3000/api/books
done

# Test auth rate limiting
for i in {1..10}; do
  curl -X POST -H "Content-Type: application/json" \
    -d '{"email":"test@example.com","password":"password123"}' \
    http://localhost:3000/api/auth/login
done

# Test with authentication token
TOKEN="your_jwt_token"
for i in {1..200}; do
  curl -H "Authorization: Bearer $TOKEN" \
    http://localhost:3000/api/books
done
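Add -i to any of these requests to print the response headers and watch the RateLimit-Remaining value count down (or the X-RateLimit-* headers if you added the custom header middleware above):
curl -i http://localhost:3000/api/books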
Best Practices Implemented
Multiple Strategies
- Memory-based rate limiting
- Redis-based rate limiting
- Role-based dynamic limits
- Sliding window implementation
Security Features
- IP-based limiting
- User-based limiting
- Route-specific limits
- Proper error responses
Performance Considerations
- Redis for distributed systems
- Efficient algorithms
- Memory management
- Cleanup of expired data (see the sketch after this list)
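The in-memory limiters above (sliding window, token bucket, fixed window) key a Map by IP and never remove entries on their own, so a long-running process should prune stale keys periodically. A minimal sketch for the sliding window limiter, assuming access to its requests Map and windowMs:
// Inside slidingWindowLimiter, after creating the requests Map:
// periodically drop IPs whose recorded timestamps have all aged out of the window
setInterval(() => {
  const cutoff = Date.now() - windowMs;
  for (const [key, timestamps] of requests) {
    if (timestamps.every((time) => time <= cutoff)) {
      requests.delete(key);
    }
  }
}, windowMs).unref(); // unref() so this timer doesn't keep the process alive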
Monitoring and Logging
- Rate limit tracking
- Header information
- Error logging
- Analytics support
Next Steps
To further enhance your API:
- Add API documentation using Swagger
- Set up automated testing
- Add pagination for GET requests
- Implement proper logging
Conclusion
You now have a robust rate limiting system that can:
- Prevent API abuse
- Ensure fair usage
- Scale across multiple servers
- Adapt to different user roles
- Monitor and log rate limit events
Remember to:
- Monitor rate limit effectiveness
- Adjust limits based on usage patterns
- Keep Redis backed up
- Monitor system resources
- Perform regular maintenance and cleanup
The next tutorial covers adding API documentation using Swagger.