When your Node.js application repeatedly fetches the same data from APIs, it slows down response times and may hit API rate limits. Redis caching solves this by storing frequently accessed data in memory, making subsequent requests much faster. This tutorial shows you how to implement Redis caching in Node.js with practical examples.
Introduction
Most web apps need data from APIs to work properly. Every time your app asks for data from an API, it sends a request over the internet and waits for a response. This takes time and can make your app slow for users.
The good news is that you can solve this problem with caching. Instead of asking the API for the same data over and over, you can store the data after the first request and use that stored copy for future requests. Redis is perfect for this - it's a super fast database that keeps data in memory.
In this tutorial, I'll show you how to build an Express app that gets data from an API and then add Redis caching to make it much faster.
Prerequisites
Before starting, make sure you have the following ready:
Node.js installed on your computer
Redis installed and running
Some basic knowledge of JavaScript promises and async/await
Familiarity with Express.js basics
Step 1: Setting up the project
Let's start by creating a new project. I'm going to build a simple fish wiki that gets information about different fish species from an API.
First, create a new folder for your project:
mkdir fish_wiki
cd fish_wiki
Initialize a new Node.js project:
npm init -y
Now install the packages we'll need:
npm install express axios redis
Here's what each package does:
express: Web framework for creating our server
axios: HTTP client for making API requests
redis: Client for connecting to Redis
Let's create a basic Express server to start with. Create a file called server.js:
const express = require("express");
const app = express();
const port = process.env.PORT || 3000;
app.listen(port, () => {
console.log(`App listening on port ${port}`);
});
Run this to make sure everything works:
node server.js
You should see "App listening on port 3000" in your terminal. Press Ctrl+C to stop the server for now.
Step 2: Retrieving data from a RESTful API without caching
Now let's add the ability to fetch data from an API. We'll use the FishWatch API, which provides information about different fish species.
Update your server.js file:
const express = require("express");
const axios = require("axios");
const app = express();
const port = process.env.PORT || 3000;
async function fetchApiData(species) {
const apiResponse = await axios.get(
`https://www.fishwatch.gov/api/species/${species}`
);
console.log("Request sent to the API");
return apiResponse.data;
}
async function getSpeciesData(req, res) {
const species = req.params.species;
let results;
try {
results = await fetchApiData(species);
if (results.length === 0) {
throw new Error("API returned an empty array");
}
res.send({
fromCache: false,
data: results,
});
} catch (error) {
console.error(error);
res.status(404).send("Data unavailable");
}
}
app.get("/fish/:species", getSpeciesData);
app.listen(port, () => {
console.log(`App listening on port ${port}`);
});
Here's what this code does:
fetchApiData() - This function makes the actual API request and logs when it happens
getSpeciesData() - This handles the Express route, gets the species name from the URL, and returns the data
The route /fish/:species captures whatever species name you put in the URL
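To make the parameter capture concrete, here is a rough, self-contained sketch of the idea in plain JavaScript. This is only an illustration of the concept, not Express's actual routing implementation, and matchRoute is a made-up helper name:

```javascript
// Simplified sketch of how a ":species" segment becomes req.params.species.
// Express's real router is far more capable; this only shows the idea.
function matchRoute(pattern, path) {
  const keys = [];
  // Turn each ":name" segment into a capture group and remember its name
  const regex = new RegExp(
    "^" +
      pattern.replace(/:([^/]+)/g, (_, key) => {
        keys.push(key);
        return "([^/]+)";
      }) +
      "$"
  );
  const match = path.match(regex);
  if (!match) return null;
  // Pair each captured value with its parameter name
  return Object.fromEntries(keys.map((key, i) => [key, match[i + 1]]));
}

console.log(matchRoute("/fish/:species", "/fish/red-snapper"));
// → { species: "red-snapper" }
```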
Start your server again:
node server.js
Now open your browser and go to http://localhost:3000/fish/red-snapper. You should see JSON data about red snapper fish, with fromCache: false.
Try refreshing the page a few times and watch your terminal. You'll see "Request sent to the API" logged every single time you refresh. This means we're hitting the API repeatedly for the same data, which is wasteful and slow.
Step 3: Caching RESTful API requests using Redis
Now let's add Redis to cache our API responses. This way, we'll only hit the API once and serve cached data for subsequent requests.
First, update your imports and add Redis connection code:
const express = require("express");
const axios = require("axios");
const redis = require("redis");
const app = express();
const port = process.env.PORT || 3000;
let redisClient;
(async () => {
redisClient = redis.createClient();
redisClient.on("error", (error) => console.error(`Error: ${error}`));
await redisClient.connect();
})();
The Redis connection code creates a client, sets up error handling, and connects to Redis on the default port (6379).
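If your Redis server is not running locally on the default port, node-redis also accepts a url option in createClient. The address below is just a placeholder; substitute your own host and port:

```javascript
// Connect to a non-default Redis instance (placeholder address):
redisClient = redis.createClient({
  url: "redis://localhost:6379",
});
```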
Now let's modify the getSpeciesData function to use caching:
async function getSpeciesData(req, res) {
const species = req.params.species;
let results;
let isCached = false;
try {
// First, check if we have this data in cache
const cacheResults = await redisClient.get(species);
if (cacheResults) {
isCached = true;
results = JSON.parse(cacheResults);
} else {
// If not in cache, fetch from API
results = await fetchApiData(species);
if (results.length === 0) {
throw new Error("API returned an empty array");
}
// Store in cache for next time
await redisClient.set(species, JSON.stringify(results));
}
res.send({
fromCache: isCached,
data: results,
});
} catch (error) {
console.error(error);
res.status(404).send("Data unavailable");
}
}
Here's how the caching works:
First, we try to get data from Redis using the species name as the key
If we find cached data, we parse it from JSON and return it
If there's no cached data, we fetch from the API and store it in Redis
We use JSON.stringify() to store data and JSON.parse() to retrieve it, because Redis stores strings
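That serialization round trip works the same outside of Redis, so you can see it in isolation with a snippet that runs under plain Node:

```javascript
// Objects must be serialized before caching and parsed after retrieval,
// because Redis stores values as strings.
const fish = { name: "Red Snapper", region: "Gulf of Mexico" };

const cached = JSON.stringify(fish); // what gets stored in Redis
const restored = JSON.parse(cached); // what you get back out

console.log(typeof cached); // "string"
console.log(restored.name); // "Red Snapper"
```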
Your complete server.js should now look like this:
const express = require("express");
const axios = require("axios");
const redis = require("redis");
const app = express();
const port = process.env.PORT || 3000;
let redisClient;
(async () => {
redisClient = redis.createClient();
redisClient.on("error", (error) => console.error(`Error: ${error}`));
await redisClient.connect();
})();
async function fetchApiData(species) {
const apiResponse = await axios.get(
`https://www.fishwatch.gov/api/species/${species}`
);
console.log("Request sent to the API");
return apiResponse.data;
}
async function getSpeciesData(req, res) {
const species = req.params.species;
let results;
let isCached = false;
try {
const cacheResults = await redisClient.get(species);
if (cacheResults) {
isCached = true;
results = JSON.parse(cacheResults);
} else {
results = await fetchApiData(species);
if (results.length === 0) {
throw new Error("API returned an empty array");
}
await redisClient.set(species, JSON.stringify(results));
}
res.send({
fromCache: isCached,
data: results,
});
} catch (error) {
console.error(error);
res.status(404).send("Data unavailable");
}
}
app.get("/fish/:species", getSpeciesData);
app.listen(port, () => {
console.log(`App listening on port ${port}`);
});
Start your server and test it:
node server.js
Go to http://localhost:3000/fish/red-snapper again. The first time you visit, you'll see fromCache: false and "Request sent to the API" in your terminal.
Refresh the page and now you should see fromCache: true, and no new API request in the terminal! The data is coming from the Redis cache now.
Step 4: Implementing cache validity
Cached data can become outdated, so it's important to set expiration times. Different types of data need different expiration periods - some data changes hourly, others might be good for days or weeks.
For our example, let's set the cache to expire after 3 minutes (180 seconds). Update the line where we store data in Redis:
await redisClient.set(species, JSON.stringify(results), {
EX: 180,
NX: true,
});
The options object has two properties:
EX: 180 - Cache expires after 180 seconds
NX: true - Only set the key if it doesn't already exist
Now when you test your app, the cache will automatically expire after 3 minutes, and the next request will fetch fresh data from the API.
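To make the expiry mechanics concrete, here is a small in-memory sketch of what EX-style expiry amounts to. Redis handles all of this server-side, so the Map-based version below (with made-up helper names setWithExpiry and getIfFresh) is only an illustration of the idea, not how Redis is implemented:

```javascript
// An in-memory sketch of EX-style expiry, for illustration only.
const store = new Map();

function setWithExpiry(key, value, ttlSeconds) {
  // Record the value along with the timestamp at which it stops being valid
  store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
}

function getIfFresh(key) {
  const entry = store.get(key);
  if (!entry) return null;
  if (Date.now() > entry.expiresAt) {
    store.delete(key); // evict lazily once the TTL has passed
    return null;
  }
  return entry.value;
}

setWithExpiry("red-snapper", '{"name":"Red Snapper"}', 180);
console.log(getIfFresh("red-snapper")); // still fresh within the 180s window
```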
Step 5: Caching data in middleware
As your app grows, you might want to organize your caching logic better. Express middleware is perfect for this - it lets you handle caching separately from your main route logic.
Let's refactor our code to use middleware. First, create a middleware function that only handles checking the cache:
async function cacheData(req, res, next) {
const species = req.params.species;
let results;
try {
const cacheResults = await redisClient.get(species);
if (cacheResults) {
results = JSON.parse(cacheResults);
res.send({
fromCache: true,
data: results,
});
} else {
next(); // Continue to the next function
}
} catch (error) {
console.error(error);
res.status(404).send("Data unavailable");
}
}
Now simplify the getSpeciesData function to only handle API requests and caching new data:
async function getSpeciesData(req, res) {
const species = req.params.species;
let results;
try {
results = await fetchApiData(species);
if (results.length === 0) {
throw new Error("API returned an empty array");
}
await redisClient.set(species, JSON.stringify(results), {
EX: 180,
NX: true,
});
res.send({
fromCache: false,
data: results,
});
} catch (error) {
console.error(error);
res.status(404).send("Data unavailable");
}
}
Finally, update your route to use both functions:
app.get("/fish/:species", cacheData, getSpeciesData);
Now when someone visits /fish/red-snapper:
The cacheData middleware runs first and checks Redis
If data is found, it returns the cached response and stops
If no cached data exists, it calls next(), which runs getSpeciesData
getSpeciesData fetches from the API, caches the result, and returns the response
This approach keeps your code organized and makes it easy to add caching to other routes too.
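For example, the middleware could be generalized into a small factory. The sketch below is one possible structure, not the tutorial's code: makeCacheMiddleware and keyFromReq are made-up names, and a synchronous Map stands in for the real node-redis client (whose get/set return promises you would await) so the sketch runs standalone:

```javascript
// Sketch of a reusable cache-middleware factory. "client" is anything with
// get/set; here a synchronous Map, unlike the real promise-based Redis client.
function makeCacheMiddleware(client, keyFromReq) {
  return function cache(req, res, next) {
    const cached = client.get(keyFromReq(req));
    if (cached) {
      res.send({ fromCache: true, data: JSON.parse(cached) });
    } else {
      next(); // cache miss: fall through to the route handler
    }
  };
}

// Exercise it without Express or Redis, using minimal fakes:
const stubClient = new Map();
stubClient.set("red-snapper", JSON.stringify({ name: "Red Snapper" }));

const cacheFish = makeCacheMiddleware(stubClient, (req) => req.params.species);

let sent = null;
cacheFish(
  { params: { species: "red-snapper" } }, // fake req
  { send: (body) => { sent = body; } },   // fake res
  () => {}                                // fake next
);
console.log(sent.fromCache); // true
```

Wiring it up would then look much like the route in this tutorial, e.g. app.get("/fish/:species", cacheFish, getSpeciesData).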
Conclusion
In this tutorial, you built an application that fetches data from an API and returns the data as a response to users. You then modified the app to cache the API response in Redis on the initial visit and serve the data from the cache for all subsequent requests. You also learned how to set cache expiration times and use middleware to organize your caching logic. This approach significantly improves your application's performance by reducing API calls and provides a better user experience with faster response times.