GraphQL · How-To · Beginner · 4 min read

How to Use DataLoader in GraphQL for Efficient Data Fetching

Use DataLoader in GraphQL to batch multiple requests for the same data into a single query and cache results to avoid redundant fetching. Create a DataLoader instance with a batch loading function, then use it inside your resolvers to load data efficiently.
📐

Syntax

The basic syntax to use DataLoader involves creating a new instance with a batch loading function that accepts an array of keys and returns a Promise resolving to an array of results. Then, inside your GraphQL resolvers, you call loader.load(key) to queue the key for batch loading.

  • new DataLoader(batchLoadFn): Creates a loader with a batch function.
  • batchLoadFn(keys): Function that receives keys and returns results in the same order.
  • loader.load(key): Loads a single key, batching it with others.
javascript
const DataLoader = require('dataloader');

const loader = new DataLoader(async (keys) => {
  // keys is an array of IDs
  // Return an array of results matching keys order
  return await batchFetchFromDatabase(keys);
});

// In resolver
const result = await loader.load(key);
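To make the batching and caching behavior concrete, here is a minimal sketch of the mechanics DataLoader implements (a hypothetical MiniLoader class, not the real library's code): load() returns one cached promise per key, and all keys queued in the same tick are handed to the batch function in a single call.

```javascript
// Minimal sketch of DataLoader-style batching and caching (hypothetical
// MiniLoader, not the real library): load() caches a promise per key and
// schedules one batch dispatch per event-loop tick via queueMicrotask.
class MiniLoader {
  constructor(batchLoadFn) {
    this.batchLoadFn = batchLoadFn;
    this.cache = new Map(); // key -> Promise (per-instance cache)
    this.queue = [];        // entries waiting for the next batch dispatch
  }

  load(key) {
    if (this.cache.has(key)) return this.cache.get(key); // cache hit: no refetch
    const promise = new Promise((resolve, reject) => {
      this.queue.push({ key, resolve, reject });
      // The first key queued in this tick schedules one dispatch for the batch
      if (this.queue.length === 1) queueMicrotask(() => this.dispatch());
    });
    this.cache.set(key, promise);
    return promise;
  }

  async dispatch() {
    const batch = this.queue;
    this.queue = [];
    try {
      const results = await this.batchLoadFn(batch.map(item => item.key));
      batch.forEach((item, i) => item.resolve(results[i]));
    } catch (err) {
      batch.forEach(item => item.reject(err));
    }
  }
}

// Three load() calls in the same tick produce a single batch call
const calls = [];
const loader = new MiniLoader(async (keys) => {
  calls.push(keys);
  return keys.map(k => `user-${k}`);
});

Promise.all([loader.load('1'), loader.load('2'), loader.load('1')])
  .then(results => {
    console.log(calls);   // one batch: [ [ '1', '2' ] ]
    console.log(results); // [ 'user-1', 'user-2', 'user-1' ]
  });
```

Note that the duplicate key '1' never reaches the batch function: the cached promise is reused, which is the same deduplication the real DataLoader performs per instance.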
💻

Example

This example shows how to use DataLoader in a GraphQL resolver to batch user data fetching by IDs, reducing multiple database calls into one.

javascript
const { ApolloServer, gql } = require('apollo-server');
const DataLoader = require('dataloader');

// Mock database
const usersDB = [
  { id: '1', name: 'Alice' },
  { id: '2', name: 'Bob' },
  { id: '3', name: 'Charlie' }
];

// Batch function to fetch users by IDs
async function batchGetUsers(ids) {
  console.log('Batch fetching users:', ids);
  // Results must map one-to-one to ids, in the same order; use null for misses
  return ids.map(id => usersDB.find(user => user.id === id) || null);
}

// GraphQL schema
const typeDefs = gql`
  type User {
    id: ID!
    name: String!
  }
  type Query {
    user(id: ID!): User
  }
`;

// Resolvers
const resolvers = {
  Query: {
    user: async (_, { id }, { loaders }) => {
      return loaders.userLoader.load(id);
    }
  }
};

// Create Apollo Server
const server = new ApolloServer({
  typeDefs,
  resolvers,
  context: () => ({
    loaders: {
      userLoader: new DataLoader(batchGetUsers)
    }
  })
});

// Start server
server.listen().then(({ url }) => {
  console.log(`Server ready at ${url}`);
});
Output (from a single operation that requests users 1, 2, and 3)
Batch fetching users: [ '1', '2', '3' ]
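The schema only exposes user(id), so the single batch above comes from one operation that resolves three user fields in the same tick, for example via field aliases (illustrative query string, not part of the server code above):

```javascript
// Hypothetical client query using aliases: all three user fields resolve in
// the same tick, so userLoader coalesces their load() calls into one batch
const QUERY = `
  query {
    a: user(id: "1") { id name }
    b: user(id: "2") { id name }
    c: user(id: "3") { id name }
  }
`;
```

Sending this query to the server yields the single "Batch fetching users" log line, because DataLoader collects all load() calls made during one tick before invoking the batch function.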
⚠️

Common Pitfalls

Common mistakes when using DataLoader include:

  • Creating a single DataLoader instance globally instead of per request, causing incorrect caching across users.
  • Not returning results in the same order as the keys array in the batch function.
  • Expecting loadMany to reject on a failed key; per DataLoader's documented behavior it resolves with an Error object in that key's slot instead.
  • Bypassing the loader with direct database calls in some resolvers, which splits fetches and defeats batching and caching.

Always create DataLoader instances inside the GraphQL context to isolate caches per request.

javascript
/* Wrong: Global DataLoader instance (bad caching) */
const userLoader = new DataLoader(batchGetUsers);

const resolvers = {
  Query: {
    user: (_, { id }) => userLoader.load(id) // cache shared across requests
  }
};

/* Right: Create DataLoader per request in context */
const server = new ApolloServer({
  context: () => ({
    loaders: {
      userLoader: new DataLoader(batchGetUsers)
    }
  })
});
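On the loadMany pitfall: DataLoader documents loadMany(keys) as roughly Promise.all over individual load calls, except a failing key produces an Error object in the result array rather than rejecting the whole promise. A sketch of that contract (hypothetical helper and stand-in loader, not the library itself):

```javascript
// Sketch of loadMany's documented contract: resolve every key, placing Error
// instances in the result array instead of rejecting the whole batch
function loadManySketch(loader, keys) {
  return Promise.all(keys.map(key => loader.load(key).catch(err => err)));
}

// Minimal stand-in loader whose load() rejects for unknown keys
const fakeLoader = {
  load: (key) =>
    key === 'missing'
      ? Promise.reject(new Error(`No user for ${key}`))
      : Promise.resolve({ id: key }),
};

loadManySketch(fakeLoader, ['1', 'missing']).then(results => {
  console.log(results[0]);                  // { id: '1' }
  console.log(results[1] instanceof Error); // true
});
```

Because of this, callers should check each element of a loadMany result for Error instances rather than relying on a try/catch around the await.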
📊

Quick Reference

  • DataLoader instance: created per request to isolate cache and batching.
  • Batch function: receives an array of keys, returns a Promise of results in the same order.
  • load(key): queues a key for batch loading, returns a Promise for the result.
  • Caching: DataLoader caches results per request to avoid duplicate fetches.
  • Batching: multiple load calls in one tick are combined into one batch function call.

Key Takeaways

  • Create a new DataLoader instance inside each GraphQL request context to avoid cross-request caching.
  • The batch loading function must return results in the same order as the keys array.
  • Use loader.load(key) inside resolvers to batch and cache data fetching automatically.
  • DataLoader improves performance by combining requests and eliminating redundant database calls.
  • Avoid mixing direct database calls with DataLoader, which defeats batching.