MongoDB · How-To · Beginner · 4 min read

How to Use GridFS in MongoDB: Store and Retrieve Large Files

Use GridFS in MongoDB to store files larger than 16MB by splitting them into chunks. You can upload, download, and manage large files using the GridFSBucket API in MongoDB drivers.
📐

Syntax

GridFS uses two collections: fs.files to store file metadata and fs.chunks to store file data in chunks.

The main API is GridFSBucket, which provides methods like openUploadStream() to upload and openDownloadStream() to download files.

Basic usage pattern:

  • Create a GridFSBucket instance from your MongoDB database connection.
  • Use openUploadStream(filename) to write a file stream.
  • Use openDownloadStream(fileId) to read a file stream.
javascript
const { MongoClient, GridFSBucket } = require('mongodb');

async function run() {
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const db = client.db('mydatabase');

  // Create GridFS bucket
  const bucket = new GridFSBucket(db);

  // Upload a file
  const uploadStream = bucket.openUploadStream('myfile.txt');
  // Write data to uploadStream

  // Download a file (after the upload has finished)
  const downloadStream = bucket.openDownloadStream(uploadStream.id);
  // Read data from downloadStream

  await client.close();
}

run();
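Each upload creates one metadata document in fs.files, while the file data itself lands in fs.chunks, one document per chunk. The sketch below shows the typical shape of an fs.files document; the field values are illustrative, not real driver output:

```javascript
// Illustrative shape of a document in the fs.files collection
// (values are made up; the driver generates them on upload).
const sampleFileDoc = {
  _id: 'ObjectId(...)',   // file id, also available as uploadStream.id
  length: 1048576,        // total file size in bytes (1 MB here)
  chunkSize: 261120,      // default chunk size: 255 KB
  uploadDate: new Date(),
  filename: 'myfile.txt'
};

// fs.chunks holds the data as { files_id, n, data },
// where n is the zero-based chunk index.
const expectedChunks = Math.ceil(sampleFileDoc.length / sampleFileDoc.chunkSize);
console.log(`A ${sampleFileDoc.length}-byte file is split into ${expectedChunks} chunks`);
// → A 1048576-byte file is split into 5 chunks
```

Because chunks are indexed by files_id and n, the driver can reassemble the file in order when you download it.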
💻

Example

This example shows how to upload a local file to MongoDB using GridFS and then download it back to a new file.

javascript
const { MongoClient, GridFSBucket } = require('mongodb');
const fs = require('fs');

async function run() {
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const db = client.db('mydatabase');
  const bucket = new GridFSBucket(db);

  // Upload local file to GridFS
  const uploadStream = bucket.openUploadStream('example.txt');
  fs.createReadStream('example.txt')
    .on('error', (error) => console.error('Read error:', error))
    .pipe(uploadStream)
    .on('error', (error) => console.error('Upload error:', error))
    .on('finish', () => {
      console.log('File uploaded with id:', uploadStream.id);

      // Download file from GridFS
      const downloadStream = bucket.openDownloadStream(uploadStream.id);
      downloadStream
        .on('error', (error) => console.error('Download error:', error))
        .pipe(fs.createWriteStream('downloaded_example.txt'))
        .on('finish', () => {
          console.log('File downloaded successfully');
          client.close();
        });
    });
}

run();
Output
File uploaded with id: <ObjectId>
File downloaded successfully
⚠️

Common Pitfalls

  • Not handling stream errors can cause silent failures during upload or download.
  • Attempting to store files larger than 16MB without GridFS fails because MongoDB's document size limit is 16MB.
  • Forgetting to close the MongoDB client connection can cause resource leaks.
  • Using incorrect file IDs or filenames when downloading will cause errors.

Always check for errors on streams and ensure you use the correct ObjectId when downloading files.

javascript
/* Wrong: no error handling, client never closed */
const uploadStream = bucket.openUploadStream('file.txt');
fs.createReadStream('file.txt').pipe(uploadStream);

/* Right: handle errors on both streams and close the client */
fs.createReadStream('file.txt')
  .on('error', (err) => console.error('Read error:', err))
  .pipe(bucket.openUploadStream('file.txt'))
  .on('error', (err) => console.error('Upload error:', err))
  .on('finish', () => {
    console.log('Upload finished');
    client.close();
  });
📊

Quick Reference

  • openUploadStream(filename): Upload a file stream.
  • openDownloadStream(fileId): Download a file stream by ID.
  • fs.files: Collection storing file metadata.
  • fs.chunks: Collection storing file chunks.
  • Use ObjectId to identify files.

Key Takeaways

  • GridFS splits large files into chunks stored in two collections: fs.files and fs.chunks.
  • Use GridFSBucket's openUploadStream and openDownloadStream methods to handle file storage and retrieval.
  • Always handle stream errors and close the MongoDB client to avoid resource leaks.
  • GridFS is necessary for files larger than MongoDB's 16MB document size limit.
  • Use the file's ObjectId to download or manage stored files accurately.