
How to Read Large Files Using Stream in Node.js Efficiently

In Node.js, you can read large files efficiently using the fs.createReadStream() method, which reads the file in small chunks instead of loading it all at once. This approach uses streams to handle data piece by piece, preventing memory overload and improving performance.

Syntax

The fs.createReadStream(path, options) method creates a readable stream for the file at the given path. It reads the file in chunks, emitting a 'data' event for each chunk. You can listen for the 'data', 'end', and 'error' events to process the file content.

  • path: String path to the file.
  • options: Optional settings like encoding and highWaterMark (chunk size).
```javascript
const fs = require('fs');

const stream = fs.createReadStream('file.txt', { encoding: 'utf8', highWaterMark: 1024 });

stream.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});

stream.on('end', () => {
  console.log('Finished reading file.');
});

stream.on('error', (err) => {
  console.error('Error reading file:', err);
});
```

Example

This example reads a large text file named largefile.txt using a stream. It prints each chunk of data as it is read, then signals when the reading is complete. This method avoids loading the entire file into memory at once.

```javascript
const fs = require('fs');

const readStream = fs.createReadStream('largefile.txt', { encoding: 'utf8', highWaterMark: 64 * 1024 });

readStream.on('data', (chunk) => {
  console.log('Chunk received:', chunk.length, 'characters');
});

readStream.on('end', () => {
  console.log('File reading completed.');
});

readStream.on('error', (error) => {
  console.error('Error:', error.message);
});
```
Output

```
Chunk received: 65536 characters
Chunk received: 65536 characters
...
File reading completed.
```

Common Pitfalls

Common mistakes when reading large files with streams include:

  • Not handling the error event, which can crash your program if the file is missing or unreadable.
  • Trying to read the entire file into memory instead of using streams, causing memory overload.
  • Not setting an encoding, so chunks arrive as Buffer objects instead of strings.
  • Ignoring backpressure, which can cause slow processing or memory issues if the consumer is slower than the stream.
```javascript
const fs = require('fs');

// Wrong: reading the entire file at once (bad for large files)
fs.readFile('largefile.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log(data);
});

// Right: using a stream with error handling
const stream = fs.createReadStream('largefile.txt', { encoding: 'utf8' });
stream.on('data', chunk => console.log(chunk));
stream.on('error', err => console.error('Error:', err));
```

Quick Reference

Tips for reading large files with streams in Node.js:

  • Use fs.createReadStream() to read files chunk by chunk.
  • Set encoding to get strings instead of buffers.
  • Listen to data, end, and error events.
  • Adjust highWaterMark to control chunk size (default is 64 KB).
  • Always handle errors to avoid crashes.

Key Takeaways

  • Use fs.createReadStream() to read large files efficiently in chunks.
  • Always handle 'error' events to prevent your program from crashing.
  • Set encoding to 'utf8' to work with string data instead of buffers.
  • Avoid reading entire large files into memory to prevent overload.
  • Adjust highWaterMark to optimize chunk size based on your needs.